
This section describes the performance graphs and the curves they contain.

Note

Each graph has a Users curve, which plots the current number of running VUs.

Key Performance Indicators (KPI) 

Users: The number of instantiated and active VUs iterating through their respective test cases. Some VUs can complete the test earlier than others and become inactive; VUs that completed all their iterations before the test ends are excluded from the active user count. If the test is configured to complete only after all VUs finish their iterations, the Users curve shows a gradual decline in the number of VUs at the end of the test. This curve is present in every graph.

Requests/Sec: The number of requests sent per second. Calculated by taking the number of requests sent since the previous sample timestamp and dividing it by the sample rate (see the calculation sketch after this list).

Avg. Response (s): The average response time of the sent requests, averaged since the previous sample timestamp. If no responses were received since the previous sample timestamp, the data point is skipped so that the response time curve stays accurate.

KB Received/Sec: The number of kilobytes currently received per second. Calculated by taking the total payload downloaded since the previous sample timestamp and dividing it by the sample rate.

Errors/Sec: The number of errors currently received per second. Calculated by taking the number of request errors since the previous sample timestamp and dividing it by the sample rate.

Pending Requests: The number of currently pending requests, i.e. requests that were issued but for which StresStimulus has not yet received a response. Generally, a higher number of pending requests indicates a slower server response. This parameter can be used to gauge performance changes across test runs.

Total Transactions/Sec: The total number of transactions completed per second. A completed transaction has executed all of its child objects. If transactions are nested, each transaction is counted separately toward the total of completed transactions.
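
The rate-based KPIs above all follow the same arithmetic: the change in a raw counter since the previous sample timestamp divided by the sample rate, while Avg. Response (s) averages the response times collected in that interval and skips the data point when there are none. The sketch below illustrates that calculation only; the Sample structure, the field names, and the assumption that the sample rate is the sampling interval in seconds are illustrative, not StresStimulus internals.

```python
# Illustrative sketch of the per-sample KPI arithmetic described above.
# The Sample structure and field names are assumptions, not StresStimulus internals.
from dataclasses import dataclass


@dataclass
class Sample:
    requests: int              # requests sent since the previous sample timestamp
    errors: int                # request errors since the previous sample timestamp
    bytes_received: int        # payload bytes downloaded since the previous sample timestamp
    response_times: list       # response times (s) of responses received in the interval


def kpi_data_point(sample: Sample, sample_rate_s: float) -> dict:
    """Convert raw per-sample counters into the plotted KPI values."""
    point = {
        "requests_per_sec": sample.requests / sample_rate_s,
        "errors_per_sec": sample.errors / sample_rate_s,
        "kb_received_per_sec": sample.bytes_received / 1024 / sample_rate_s,
    }
    # Avg. Response (s) is skipped when no responses arrived in the interval,
    # so empty samples do not distort the curve.
    if sample.response_times:
        point["avg_response_s"] = sum(sample.response_times) / len(sample.response_times)
    else:
        point["avg_response_s"] = None  # data point skipped
    return point


# Example: a 5-second sample rate with 120 requests and no responses received yet.
print(kpi_data_point(Sample(requests=120, errors=3, bytes_received=512_000,
                            response_times=[]), 5.0))
```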

Transactions

This graph monitors the average response time of every transaction in the test; a separate curve exists for each transaction. Each data point is calculated from the average times of the transactions completed since the previous sample timestamp, and the data point is skipped if no transaction was completed in that interval. If the test has no transactions, the graph is hidden from the runtime dashboard and the report.

Collated transactions

If collated transactions are enabled, then there will be one curve for all transactions with the same name.


Additional grid columns

Iterations: The number of times the transaction was completed successfully.

90%: The maximum data point value after excluding the slowest 10% (see the percentile sketch after this list).

95%: The maximum data point value after excluding the slowest 5%.

99%: The maximum data point value after excluding the slowest 1%.

Missed Requirements (transactions only): The number of transaction times greater than the Required performance time.

Missed Goals: The number of transaction times greater than the Goal time.
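
The percentile columns can be read as: sort the recorded times, drop the slowest 10%, 5%, or 1%, and report the largest remaining value. The sketch below shows that reading only; the exact rounding StresStimulus applies is not specified here, so the truncation used is an assumption.

```python
# Illustrative reading of the 90% / 95% / 99% columns; the exact rounding rule is assumed.
def percentile_column(times_s: list, keep_fraction: float) -> float:
    """Return the maximum time after excluding the slowest (1 - keep_fraction) share.

    keep_fraction = 0.90 gives the "90%" column, 0.95 the "95%", 0.99 the "99%".
    """
    if not times_s:
        raise ValueError("no completed transactions")
    ordered = sorted(times_s)
    keep = max(1, int(len(ordered) * keep_fraction))  # assumed truncation; keep at least one value
    return ordered[keep - 1]


# Example: with ten transaction times, the 90% column excludes the single slowest one.
times = [0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 4.0]
print(percentile_column(times, 0.90))  # 1.6
```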

Pages

The page graph and grid have the same functionality as the transactions graph described above.

Missed goal and Missed required performance

If the transaction time is greater than both the Goal time and the Required performance time, then only the Missed Goals counter is incremented; the Missed Requirements counter is not affected.
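
Taken literally, the rule above makes the two counters mutually exclusive for any single transaction time. The sketch below is only that literal reading, with illustrative names; it is an interpretation of the note, not confirmed StresStimulus logic.

```python
# Literal reading of the note above: when a time exceeds both thresholds,
# only Missed Goals is incremented. Names and structure are illustrative.
def update_counters(duration_s: float, goal_s: float, required_s: float, counters: dict) -> None:
    if duration_s > goal_s and duration_s > required_s:
        counters["missed_goals"] += 1          # only this counter changes
    elif duration_s > required_s:
        counters["missed_requirements"] += 1
    elif duration_s > goal_s:
        counters["missed_goals"] += 1
```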

Test Cases

This graph monitors the average iteration time of every test case in the test; a separate curve exists for each test case. Each data point is calculated from the average iteration times of the iterations completed since the previous sample timestamp, and the data point is skipped if no iteration was completed in that interval.

Test Object Progress

A progress grid for test cases and test case groups (a small data sketch follows this list).

Name: The test case or test case group name (the test object).

Users: The number of active VUs executing the test object.

Iterations Started: The number of iterations of the test object that were started.

Iterations Passed: The number of iterations of the test object that completed successfully.

Iterations Failed: The number of iterations of the test object that failed.

Requests Sent: The number of requests issued by the VUs executing the test object.

Responses Received: The number of responses received by the VUs executing the test object.

Errors: The number of errors that occurred for the VUs executing the test object.

Timeouts: The number of timeouts that occurred for the VUs executing the test object.
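
For readers who consume these numbers outside the dashboard, each grid row can be thought of as a simple record of counters. The record below is hypothetical and merely mirrors the columns above; it is not a StresStimulus API.

```python
# Hypothetical record mirroring the Test Object Progress columns; not a StresStimulus API.
from dataclasses import dataclass


@dataclass
class TestObjectProgress:
    name: str                # test case or test case group (test object)
    users: int               # active VUs executing the test object
    iterations_started: int
    iterations_passed: int
    iterations_failed: int
    requests_sent: int
    responses_received: int
    errors: int
    timeouts: int

    @property
    def pass_rate(self) -> float:
        """Share of started iterations that passed (0.0 before any iteration starts)."""
        return self.iterations_passed / self.iterations_started if self.iterations_started else 0.0


row = TestObjectProgress("Checkout", users=25, iterations_started=400,
                         iterations_passed=392, iterations_failed=8,
                         requests_sent=12_000, responses_received=11_950,
                         errors=8, timeouts=2)
print(f"{row.name}: {row.pass_rate:.1%} of iterations passed")  # Checkout: 98.0% of iterations passed
```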


Agent Progress (for distributed tests only)


Name: The load agent name.

Users: The number of active VUs executing on the load agent.

Iterations Started: The number of test iterations started on the load agent.

Iterations Passed: The number of test iterations completed successfully on the load agent.

Iterations Failed: The number of test iterations that failed on the load agent.

Requests Sent: The number of requests issued on the load agent.

Responses Received: The number of responses received on the load agent.

Errors: The number of errors that occurred on the load agent.

Timeouts: The number of timeouts that occurred on the load agent.


Windows machines

These graphs show the Windows machine counters that are described here.


Warnings: The number of warning threshold violations (see the sketch after this table).

Errors: The number of critical threshold violations.
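
Each monitored counter is checked against its configured warning and critical thresholds. The sketch below is only an illustrative reading of that check; the threshold names, the "higher is worse" direction, and whether a critical violation also counts as a warning are all assumptions, not documented StresStimulus rules.

```python
# Illustrative threshold check; names, direction, and exclusivity are assumptions.
def classify_sample(value: float, warning_threshold: float, critical_threshold: float) -> str:
    if value >= critical_threshold:
        return "error"    # counted under Errors (critical threshold violation)
    if value >= warning_threshold:
        return "warning"  # counted under Warnings (warning threshold violation)
    return "ok"


# Example: CPU utilization sampled at 97% with warning/critical thresholds of 80%/95%.
print(classify_sample(97, 80, 95))  # error
```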

Linux/Unix Servers counters

This graph contains curves that monitor a set of performance parameters configured in the Other Options -> Monitoring -> Linux Unix Servers Monitoring section. The counters are collected over SNMP from the remote Linux or Unix servers involved in the test.
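
StresStimulus performs this SNMP collection itself; the snippet below is only an independent illustration of what polling a single counter from a Linux server looks like, using the third-party pysnmp package (4.x hlapi). The host name, community string, and OID (ssCpuIdle from the UCD-SNMP MIB) are example values, not StresStimulus settings.

```python
# Independent illustration of an SNMP GET against a Linux server using pysnmp;
# this is not how StresStimulus is implemented. Host, community, and OID are examples.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity,
)

error_indication, error_status, error_index, var_binds = next(
    getCmd(
        SnmpEngine(),
        CommunityData("public", mpModel=1),                      # SNMP v2c, community "public"
        UdpTransportTarget(("linux-server.example.com", 161)),   # standard SNMP port
        ContextData(),
        ObjectType(ObjectIdentity("1.3.6.1.4.1.2021.11.11.0")),  # ssCpuIdle.0 (UCD-SNMP-MIB)
    )
)

if error_indication:
    print(f"SNMP error: {error_indication}")
else:
    for oid, value in var_binds:
        print(f"{oid} = {value}")
```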

