To select Error View, click Error on the toolbar (a).

Every request issued during the load test receives one of three flags: Passed, Failed with Error, or Failed with Timeout. The flag is determined as follows:

    1. Failed with Error: The response came back before the timeout, but it violated a custom validator's criteria or returned a failing HTTP status code of 400 or above.
    2. Failed with Timeout: The corresponding response did not come back before the timeout.
    3. Passed: None of the above. This request is considered successful.
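The classification above can be sketched as a simple decision sequence. This is an illustrative Python sketch, not the tool's actual implementation; the function and parameter names are hypothetical.

```python
# Hypothetical sketch of the three-flag classification described above.
FAILING_STATUS = 400  # status codes of 400 and above count as failing

def classify_request(timed_out: bool, status_code: int, validators_passed: bool) -> str:
    """Assign one of the three flags to a completed request."""
    if timed_out:
        # The response did not come back before the timeout.
        return "Failed with Timeout"
    if status_code >= FAILING_STATUS or not validators_passed:
        # The response arrived in time, but it returned a failing status
        # code or violated a custom validator's criteria.
        return "Failed with Error"
    # None of the above: the request is considered successful.
    return "Passed"
```

Note that the timeout check takes precedence: a request that never received a response cannot be evaluated against status codes or validators.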


Error View displays responses with errors and timeouts. A panel shows information about failed request instances, grouped by Test Case (b) and request number (c). Below is an example displaying 19 errors of request #38 in TC Group 1. This request failed 8 times during iteration 1 and 11 times during iteration 2. The failing users are displayed in the VU column (d). To view a selected error, double-click it, or right-click the grid and click Show Sessions (e). The sessions will be retrieved from the test log and displayed in the session grid.


For quick error analysis, select a row, right-click, and select Compare with Recorded (f). The system determines which user encountered the error and in which iteration, and then displays a tree that compares all replayed sessions for that user's iteration with the recorded sessions. This helps to discover preceding and subsequent errors and to find the root cause of an error faster. Another troubleshooting option is available in Error View: right-click the error and select See Waterfall (g) for this VU's iteration.


To export the error view details to a .csv file, select Export Details to CSV... (h).

Reporting data column for error description

Sporadic test errors typically occur because of parameterized dataset inputs; for example, logging in with credentials taken from a dataset may fail only for certain rows.

When a test error occurs, it can be helpful to know which row was used during the error's iteration. Each row can be identified by its reporting value.

To add the row's reporting value to the error description, in the dataset property grid, set the Reporting data column property to the column that contains the reporting value.

If an error occurs during the test, the reporting value of every dataset is added to the error description.

Reporting value evaluation

The reporting value of every dataset is evaluated on the dataset's first use in every iteration. If an error occurs before a dataset has been used, that dataset's reporting value is not added to the error's description.
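The behavior described in this section can be illustrated with a small sketch. All class, method, and field names here are hypothetical and exist only to mirror the described logic (lazy evaluation of the reporting value on first use, and appending it to the error description only when the dataset was actually used); this is not the tool's API.

```python
# Hypothetical model of a dataset with a Reporting data column.
class Dataset:
    def __init__(self, rows, reporting_column):
        self.rows = rows                        # list of dicts: column -> value
        self.reporting_column = reporting_column
        self.reporting_value = None             # evaluated lazily, on first use

    def get(self, row_index, column):
        """Read a value; on the dataset's first use, capture the row's
        reporting value so it can later identify the row in errors."""
        if self.reporting_value is None:
            self.reporting_value = self.rows[row_index][self.reporting_column]
        return self.rows[row_index][column]

def describe_error(message, datasets):
    """Append each used dataset's reporting value to the error description.
    Datasets that were never used contribute nothing."""
    parts = [message]
    for ds in datasets:
        if ds.reporting_value is not None:
            parts.append(f"reporting value: {ds.reporting_value}")
    return "; ".join(parts)
```

For example, if a login dataset uses its `login` column as the reporting column, an error raised before the dataset is read yields only the bare message, while an error raised after reading the row also carries that row's login for identification.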
