The basic grid view includes the following information areas and options:
The bar across the top of the display (just below the top-level Test Analysis tabs) displays:
- Saved view selector - if you have any pre-configured views, select one to display the current data based on that view's configured selections.
- Active filters - lists all of the currently selected filters and groupings for the list.
- Search by test name field (see next section) - filters the list to only those results whose test name matches the search term (if any exist).
When there is a wide variety of tests listed in the Report Library grid, you can use the Search by Name field to isolate the set of tests you are interested in:
- Start typing the name (or any sub-string of the name). Smart Reporting presents a list of suggested available test names.
- Either select the test name from the list of suggestions or finish typing the name, then activate the search.
- The list refreshes to display only the tests whose name matches the search term.
- If no tests match the search term, the Report Library will notify you that there were no matches.
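The search behavior described above can be sketched as a case-insensitive substring match over the available test names. This is a conceptual illustration only; the test names are made up and the matching rule is an assumption, not Perfecto's actual implementation.

```python
def suggest(test_names, query):
    """Return the test names containing the query as a case-insensitive substring."""
    q = query.lower()
    return [name for name in test_names if q in name.lower()]

# Hypothetical test names for illustration.
names = ["Login - iOS", "Login - Android", "Checkout - Web", "Search - Web"]

print(suggest(names, "login"))    # both Login tests
print(suggest(names, "web"))      # both Web tests
print(suggest(names, "payment"))  # no matches -> empty list
```

An empty result corresponds to the "no matches" notification in the Report Library.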
Just below the heading bar, there is a section that includes two statistical views of the listed tests:
- Statistical Overview - general statistics of the currently listed tests, split into the number of passed, failed, blocked, and other tests.
- Run History - overview of dates when tests were run - color coded to show the number of passed, failed, blocked, and other tests for that particular date.
This section can be hidden/displayed by toggling the Hide charts/Show charts button displayed on the top right of the section.
Use the Statistical Overview statistics to narrow the list of reports to focus on the reports with the selected final status:
- Click on one of the statistic counters (Failed, Passed, Blocked, Unknown) to filter the list to show only those reports with the selected status.
- Click on the Tests counter to list all test reports.
For example, clicking the 26 Blocked counter in the list below filters the list to display only those reports whose status is Blocked.
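The Statistical Overview counters and the click-to-filter behavior can be sketched as follows. The report records and field names here are hypothetical, chosen only to mirror the statuses listed above.

```python
from collections import Counter

# Hypothetical report records; the statuses mirror those in the overview.
reports = [
    {"name": "Login", "status": "Passed"},
    {"name": "Checkout", "status": "Failed"},
    {"name": "Search", "status": "Blocked"},
    {"name": "Profile", "status": "Passed"},
]

# The Statistical Overview: a count of reports per final status.
counters = Counter(r["status"] for r in reports)
print(counters)

def filter_by_status(reports, status=None):
    """status=None corresponds to clicking the overall Tests counter."""
    if status is None:
        return reports
    return [r for r in reports if r["status"] == status]

print(filter_by_status(reports, "Blocked"))  # only the Blocked reports
```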
Test details table
The table, in the center of the Report Library View, lists all of the test reports with the following information in the columns:
- Report Name - the script name as supplied in the PerfectoReportiumClient testStart() method.
- The column includes a checkbox, used to select test reports for the Cross Report view.
- Status - the final status reported by the specific test run. The test status is one of: Passed, Failed, Blocked, Unknown.
- Test-set indicator - indicator that the test report is included in a test-set.
- History - indicates when this test run was executed relative to other runs of the same test. (See below for more detail.)
- Failure/Blocked reason - the reason the test failed, detected either automatically by Smart Reporting heuristics or supplied by the testers (see Implement the reporting framework). For more information, see below.
- Platform - indicates whether tests were executed on Mobile or Web devices. May include an indication of the number of devices the test was executed on.
- Form factor - identifies (in icon format) if the testing device was a Mobile or Desktop Web device.
- Device - list of the devices used in the test, separated by commas.
- Device ID - for mobile devices provides the ID number of the testing device. (not displayed by default)
- Browser - browser version for Web devices.
- OS - List of operating system versions, coordinated with devices list.
- Resolution - List of device resolutions, coordinated with devices list. (not displayed by default)
- CI Job Information - details of the CI Job reported for this test. (not displayed by default)
- Name - Job name as reported in the execution context. (not displayed by default)
- Number - Job number as reported in the execution context. (not displayed by default)
- Branch - Job branch as reported in the execution context. (not displayed by default)
- Time - Provides details of when test was run and duration.
- Start - the start time of the test run.
- Duration - the duration of the test run.
- Tags - Indicates number of tags associated with the test run.
- Lab - indicates if test was run in a Perfecto CQ Lab. (not displayed by default)
- Automation Framework - indicates the automation framework used by the automation script on the device. (not displayed by default)
- Additional columns may be displayed containing the values of the custom fields associated with the tests.
The history graph shows a series of "nodes" where each node represents a single run of the test. The test run whose details are described on this row of the grid is displayed as a double-ring in the history. This makes it easier to identify when this test run was executed relative to other test runs:
- The latest test run is always represented by the right-most node.
- No more than five nodes appear in the history graph. Therefore, if the specific run is older than the five latest runs, the graph shows a break, represented by three dots (...).
- The color of the node represents the test result status for that particular run, where green means 'passed' and red means 'failed'.
- Hovering over a node displays a tooltip providing details for that run.
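The windowing rule described above can be sketched as follows. This is a conceptual model only: the node symbols, the `history_nodes` function, and the double-parenthesis notation for the current run are all invented for illustration.

```python
def history_nodes(run_statuses, current_index, window=5):
    """Sketch of the history graph: run_statuses lists the results of all
    runs of one test, oldest first. Returns node symbols for the latest
    `window` runs, with the current run drawn as a double ring (double
    parentheses here). If the current run is older than the window, it is
    shown before a break ("...")."""
    colors = {"Passed": "green", "Failed": "red"}
    latest = run_statuses[-window:]
    start = len(run_statuses) - len(latest)
    nodes = []
    if current_index < start:
        # The current run is older than the five latest runs: show it,
        # then the break represented by three dots.
        nodes.append(f"(({colors[run_statuses[current_index]]}))")
        nodes.append("...")
    for i, status in enumerate(latest, start=start):
        color = colors[status]
        nodes.append(f"(({color}))" if i == current_index else f"({color})")
    return nodes

runs = ["Passed", "Failed", "Passed", "Passed", "Failed", "Passed", "Passed"]
print(history_nodes(runs, 6))  # current run is the latest: right-most double ring
print(history_nodes(runs, 1))  # current run predates the window: break shown
```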
Smart Reporting is designed not only to provide the test data from your test runs, but also to be a tool that allows you - the tester or the test manager - to better understand the results of your tests. If your test ends with a success status, then you know that all is well. However, when a test fails, Reporting Zoom may analyze the test data and provide a failure reason that indicates what caused the test to fail.
The Smart Reporting system provides functionality that supports generating this failure reason classification, either:
- Manually by the test script, based on the stage in the script execution when the test is determined to have failed.
- Automatically. Smart Reporting analyzes the entire test information and generates the reason based on a heuristic classification.
In either case, the failure reason is color-coded and displayed in the grid of the test reports, in the Failure Reason column to allow a quick overview of the different failed tests.
The Smart Reporting system identifies instances when the test failed before it was able to start. When these failures occur, the report is marked with a blocked failure reason (displayed in yellow in the Failure Reason column) and the Status column shows a Blocked status. These blocked failure reasons (both the failure reason text and the color) are completely controlled by the Smart Reporting system. Some examples of blocked failure reasons include:
- Device in use
- Device not connected
- Device not found
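A classification of this kind can be sketched as a lookup from error-message patterns to blocked reasons. The patterns and the fallback reason below are illustrative assumptions; Perfecto's actual heuristics are internal to Smart Reporting.

```python
# Hypothetical message patterns mapped to the blocked reasons listed above.
BLOCKED_PATTERNS = {
    "device in use": "Device in use",
    "not connected": "Device not connected",
    "device not found": "Device not found",
}

def classify(error_message):
    """Return (status, reason). A blocked reason means the test never started."""
    msg = error_message.lower()
    for pattern, reason in BLOCKED_PATTERNS.items():
        if pattern in msg:
            return ("Blocked", reason)
    # Anything else is an ordinary failure; "Unclassified" is a placeholder.
    return ("Failed", "Unclassified")

print(classify("ERROR: device in use by another session"))
print(classify("AssertionError: element not visible"))
```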
By hovering over the Tags value for a specific test, a tooltip listing all the tags associated with the test is displayed.
This tooltip is an active tooltip - clicking one of the tags filters the list of tests to all tests that have that tag associated with them.
Note: Selecting a tag from the tooltip replaces any existing tag filter with the single tag that was selected.
The following shows a filtered list, based on the "Regression" tag; note the indications of the filtering.
Use the tags defined by the different context classes to better focus the analysis of the test results by filtering the tests to the subset of interest. See the Test analysis: best practices document for how to optimize your test tags for efficient test analysis.
Use the sidebar to:
- Filter the list of displayed test runs, to limit the list to those that comply with the selected parameters or time-frame.
- Create groups of test runs to display in the groups overview.
For more information on the sidebar