Performance report charts are empty and averages reported as N/A

Article ID: 222677

Products

BlazeMeter

Issue/Introduction

Unable to see data in the Summary report after running a BlazeMeter Performance test.  On the report, the charts are empty and N/A is displayed for most parts of the Summary Panel.

Cause

The test was set up to iterate through the test plan 1 time for each of its 2 configured Virtual Users, and to ramp up over 40 minutes in 2 ramp-up steps.

When the test started, the first virtual user (VU) executed every element of the test plan once and those metrics were collected.  The test then sat idle for the next 20 minutes, waiting for the next step of the ramp-up interval.  At the 20-minute mark the second VU executed every element of the test plan, and BlazeMeter then stopped the test because all threads had finished executing.  At that point there were finally enough metrics to populate the Summary report, but the test was also in the process of finishing.

BlazeMeter will not show a Summary report until 2 or more sets of metrics have been collected for the elements of a test.  20 minutes into the test, a second set of metrics was collected, but the test also ended.  The Summary report was therefore displayed while the data was still being uploaded, which is why the report appeared empty.
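As a rough illustration only (not an exact model of BlazeMeter's scheduler), the following sketch works through the timing of this configuration, assuming the ramp-up duration is divided evenly across the ramp-up steps: the second VU does not start until the 20-minute mark, so no new metrics arrive for roughly 20 minutes after the first VU finishes its single iteration.

```python
# Rough timing illustration only; assumes the ramp-up duration is divided
# evenly across the configured ramp-up steps.
RAMP_UP_MINUTES = 40
RAMP_UP_STEPS = 2
ITERATIONS_PER_VU = 1

step_interval = RAMP_UP_MINUTES / RAMP_UP_STEPS  # 20 minutes between VU starts

for step in range(RAMP_UP_STEPS):
    start_minute = step * step_interval
    print(f"VU {step + 1} starts at minute {start_minute:.0f} "
          f"and runs {ITERATIONS_PER_VU} iteration(s)")

# With only one iteration per VU, no new samples are produced between the
# first VU finishing and the second VU starting at minute 20.
print(f"Idle gap with no new metrics: ~{step_interval:.0f} minutes")
```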

Environment

Release : SAAS

Component : BLAZEMETER PERFORMANCE TESTING

Resolution

After all of the test data is uploaded and the Summary page refreshes, the collected test metrics will be displayed and the entries in the Summary Panel will no longer display N/A.

This type of test configuration is not recommended for a BlazeMeter Performance test.

BlazeMeter Performance tests are designed to let users verify that their application server can handle the full load of users performing various actions all at once as soon as the application goes live.  BlazeMeter assumes that Performance tests are actively collecting metrics for the duration of the test.  With the current design, BlazeMeter will prematurely stop a test that has not actively collected metrics for a certain period of time.

Small numbers of iterations and long intervals between ramp-up steps can result in BlazeMeter prematurely stopping a test.

For example, a test configured for 24 users with 24 ramp-up steps over a period of 24 hours will end after the first hour, because no data is uploaded for a full hour (the maximum time that BlazeMeter will wait for data to be uploaded before forcing the termination of a test).
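A minimal sketch of the same arithmetic, assuming the one-hour limit described above acts as a simple maximum idle time between data uploads:

```python
# Minimal sketch of the 24-user example; assumes the one-hour limit acts as a
# simple maximum idle time between data uploads.
MAX_IDLE_MINUTES = 60          # longest BlazeMeter waits for new data
RAMP_UP_MINUTES = 24 * 60      # 24-hour ramp-up
RAMP_UP_STEPS = 24             # one new user per step

step_interval = RAMP_UP_MINUTES / RAMP_UP_STEPS  # 60 minutes between steps

if step_interval >= MAX_IDLE_MINUTES:
    print(f"No data for {step_interval:.0f} minutes between steps; "
          "the test is terminated after the first hour")
else:
    print("Data arrives often enough; the test keeps running")
```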

This behavior is by design.