This section describes the performance graphs and the curves they contain.
Each graph includes a Users curve, which plots the current number of running VUs.
Key Performance Indicators (KPI)
|Users||The number of instantiated and active VUs, iterating through their respective test cases. Some VUs can complete the test earlier than others and become inactive. VUs that completed all their iterations before the test ends are excluded from the active user count. If the test is configured to complete only after all VUs finish their iterations, then the Users graph shows a gradual decline in the number of VUs at the end of the test. This curve is present in every graph.|
|Requests/Sec||The number of requests sent per second. Calculated by dividing the number of requests sent since the previous sample timestamp by the sample rate.|
|Avg. Response (s)||The average response time of the sent requests since the previous sample timestamp. If no responses were received since the previous sample timestamp, the data point is skipped so the response time curve remains accurate.|
|KB Received/Sec||The number of kilobytes currently received per second. Calculated by dividing the total payload downloaded since the previous sample timestamp by the sample rate.|
|Errors/Sec||The number of errors currently received per second. Calculated by dividing the number of request errors since the previous sample timestamp by the sample rate.|
|Pending Requests||The number of currently pending requests: requests that have been issued but for which StresStimulus has not yet received a response. Generally, a higher number of pending requests indicates a slower server response. This parameter can be used to gauge performance changes across several test runs.|
|Total Transactions/Sec||The total number of transactions completed per second. A completed transaction executes all child objects. If transactions are nested, then each transaction is counted separately toward the total completed transactions.|
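The rate curves (Requests/Sec, KB Received/Sec, Errors/Sec) all use the same delta-over-interval formula. A minimal sketch in Python (function and parameter names are illustrative, not part of StresStimulus):

```python
def rate_points(cumulative_counts, sample_interval_s):
    """Turn cumulative counters into per-second rate data points.

    Each data point is the amount accumulated since the previous sample
    timestamp divided by the length of the sampling interval, matching
    the calculation described for Requests/Sec, KB Received/Sec and
    Errors/Sec.
    """
    return [
        (current - previous) / sample_interval_s
        for previous, current in zip(cumulative_counts, cumulative_counts[1:])
    ]
```

For example, with a 5-second sample interval, cumulative request counts of [0, 40, 40, 95] produce the data points [8.0, 0.0, 11.0] requests per second.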
This graph monitors the average response time of every transaction in the test. A single curve exists for every transaction. Each data point is calculated using the average times of completed transactions since the previous sample timestamp. The data point is skipped if no transaction was completed since the previous sample timestamp. If the test has no transactions, then the respective graph will be hidden from the runtime dashboard and the report.
If collated transactions are enabled, then there will be one curve for all transactions with the same name.
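The averaging-and-skipping rule used by the Avg. Response and transaction curves can be sketched as follows (names are illustrative; `None` marks a skipped data point):

```python
def average_points(interval_samples):
    """Produce one data point per sampling interval.

    interval_samples[i] holds the response (or transaction) times that
    completed during interval i. Each point is the average of those
    times; if nothing completed since the previous sample timestamp,
    the point is skipped (None) so the curve is not dragged toward zero.
    """
    return [
        sum(times) / len(times) if times else None
        for times in interval_samples
    ]
```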
Additional grid columns
|Iterations||The number of times this transaction was completed successfully|
|90%||The maximum point value after excluding the slowest 10%|
|95%||The maximum point value after excluding the slowest 5%|
|99%||The maximum point value after excluding the slowest 1%|
|Missed Requirements||The number of transaction times greater than the Required performance time|
|Missed Goals||The number of transaction times greater than the Goal time|
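The 90%, 95% and 99% columns are all "maximum after discarding the slowest share" statistics. A sketch of one plausible computation (the product's exact rounding and interpolation behavior is not specified here, so the truncation below is an assumption):

```python
def trimmed_max(times, keep_fraction):
    """Maximum value after excluding the slowest (1 - keep_fraction) share.

    keep_fraction=0.90 corresponds to the 90% column, 0.95 to the 95%
    column and 0.99 to the 99% column. Truncating the kept count to an
    integer is an assumption, not the documented behavior.
    """
    ordered = sorted(times)
    keep = max(1, int(len(ordered) * keep_fraction))  # fastest samples kept
    return ordered[keep - 1]
```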
The page graph and grid have the same functionality as the transactions graph described above.
Missed goal and Missed required performance
If the transaction time is greater than both the Goal time and the Required performance time, only the Missed Goals counter is incremented. The Missed Requirements counter is not affected.
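Per the rule above, the two counters are mutually exclusive for any single transaction time. A literal sketch of that bookkeeping (the threshold and counter names are illustrative):

```python
def update_miss_counters(time_s, goal_s, required_s, counters):
    """Increment at most one miss counter per completed transaction.

    A time exceeding both thresholds increments only Missed Goals, as
    stated above; otherwise each threshold is checked on its own.
    """
    if time_s > goal_s and time_s > required_s:
        counters["missed_goals"] += 1      # Missed Requirements unaffected
    elif time_s > required_s:
        counters["missed_requirements"] += 1
    elif time_s > goal_s:
        counters["missed_goals"] += 1
```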
This graph monitors the average iteration time of every test case in the test. A single curve exists for every test case. Each data point is calculated by taking the average iteration times of completed iterations since the previous sample timestamp. The data point is skipped if no iteration was completed since the previous sample timestamp.
Test Object Progress
A progress grid for test cases and test case groups
|Name||The test case or test case group name (test object)|
|Active VUs||The number of active VUs executing the test object|
|Iterations Started||The number of iterations started by the test object|
|Iterations Passed||The number of iterations completed successfully by the test object|
|Iterations Failed||The number of iterations of the test object that failed|
|Requests Sent||The number of requests issued by the VUs executing the test object|
|Responses Received||The number of responses received by the VUs executing the test object|
|Errors||The number of errors encountered by the VUs executing the test object|
|Timeouts||The number of timeouts encountered by the VUs executing the test object|
Agent Progress (for distributed tests only)
|Name||The load agent name|
|Active VUs||The number of active VUs executing on the load agent|
|Iterations Started||The number of test iterations started on the load agent|
|Iterations Passed||The number of test iterations completed successfully on the load agent|
|Iterations Failed||The number of test iterations that failed on the load agent|
|Requests Sent||The number of requests issued on the load agent|
|Responses Received||The number of responses received on the load agent|
|Errors||The number of errors that occurred on the load agent|
|Timeouts||The number of timeouts that occurred on the load agent|
These graphs show the Windows machine counters that are described here.
|Warnings||The number of warning threshold violations|
|Critical||The number of critical threshold violations|
Linux/Unix Servers counters
This graph contains curves that monitor a set of performance parameters configured in the Other Options -> Monitoring -> Linux Unix Servers Monitoring section. The counters are collected from the remote Linux or Unix servers involved in the test using the SNMP protocol.