
The Report Library presents a detailed overview of the tests run within a selected period of time, including:

  • Statistical information for all listed tests
  • Historical breakdown of when tests failed
  • Table of all selected tests, including information on the parameters of the test run

For information on how to filter and group the view, see Focus on what's important to you.

This figure shows the Report Library grouped by device model.


Overview

The following sections explain how to read and use the different parts of the Report Library view.


Header bar

The header bar across the top of the view (just below the top-level Test Analysis tabs) displays:

  • Saved view selector - If you have any pre-configured views, select one to display the current data based on that view's configured selections.
  • Active filters - Lists all of the currently selected filters and groupings for the list.
  • Search field - Lets you search for keywords to display only tests with names that match the entered term.

Statistic charts

The section just below the header bar includes statistical views of the listed tests:

  • Statistical Overview shows general statistics of the currently listed tests, split into the number of passed, failed, blocked, and other tests.
  • Run History provides an overview of dates when tests were run. The view is color-coded to show the number of passed, failed, blocked, and other tests for a particular date.

Use the Hide charts/Show charts link above the charts to toggle the display.

Test details table

The table in the Report Library view lists the following information for all test reports.

| Column | Sub-column | Description | Displayed by default |
|--------|------------|-------------|----------------------|
| Report Name | | The script name as supplied in the PerfectoReportiumClient testStart() method. The column includes a checkbox that you can use to select test reports for the Cross Report view. | Yes |
| Status | | The final status reported by the specific test run. The test status is one of: Passed, Failed, Blocked, Unknown. | Yes |
| Cross Report | | Indicates if the test report is part of a cross-report. | Yes |
| History | | Indicates when this test run was executed relative to other runs of the same test (see Test history information below). | Yes |
| Failure/Blocked Reason | | Indicates a reason why the test failed, detected either automatically by Smart Reporting heuristics or by the testers (see Implement the reporting framework). For more information, see Failure/blocked reason below. | Yes |
| Platform | | Indicates if tests were executed on Mobile or Web devices. May include an indication of the number of devices executed on. | Yes |
| Form factor | | Icon that identifies whether the testing device was a Mobile or Desktop Web device. | Yes |
| Device model | | List of devices used in the test, separated by commas. | Yes |
| Device ID | | For mobile devices, provides the ID number of the testing device. | No |
| Browser | | Browser version for Web devices. | Yes |
| OS | | List of operating system versions, coordinated with the devices list. | Yes |
| Resolution | | List of device resolutions, coordinated with the devices list. | No |
| Job | | Details of the CI job reported for this test. | No |
| | Name | Job name as reported in the execution context. | No |
| | Number | Job number as reported in the execution context. | No |
| | Branch | Job branch as reported in the execution context. | No |
| Time | | Provides details of when the test was run and its duration. | Yes |
| | Start | Start time of the test run. | Yes |
| | Duration | Duration of the test run. | Yes |
| Tags | | Indicates the number of tags associated with the test run. | Yes |
| Lab | | Indicates if the test was run in a Perfecto CQ Lab. | No |
| Automation Framework | | Indicates the automation framework supported by the device used by the automation script. | No |
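
Several of these columns are populated by the reporting client in the test script itself. The following sketch assumes the Java Reportium client; the project, job, test, and tag names are hypothetical. It shows where the Report Name, Job, and Tags values come from:

```java
import com.perfecto.reportium.client.ReportiumClient;
import com.perfecto.reportium.client.ReportiumClientFactory;
import com.perfecto.reportium.model.Job;
import com.perfecto.reportium.model.PerfectoExecutionContext;
import com.perfecto.reportium.model.Project;
import com.perfecto.reportium.test.TestContext;
import org.openqa.selenium.remote.RemoteWebDriver;

public class ReportLibraryColumns {

    // Builds a Reportium client. The Job name, number, and branch set here
    // feed the Job columns; the context tags appear in the Tags column.
    // The driver is assumed to be a Perfecto RemoteWebDriver created
    // elsewhere with your cloud credentials.
    static ReportiumClient createClient(RemoteWebDriver driver) {
        PerfectoExecutionContext context = new PerfectoExecutionContext.PerfectoExecutionContextBuilder()
                .withProject(new Project("Sample Project", "1.0"))             // hypothetical
                .withJob(new Job("nightly-regression", 42).withBranch("main")) // hypothetical
                .withContextTags("sanity")
                .withWebDriverProvider(driver)
                .build();
        return new ReportiumClientFactory().createPerfectoReportiumClient(context);
    }

    static void startTest(ReportiumClient reportiumClient) {
        // The name passed to testStart() appears in the Report Name column.
        reportiumClient.testStart("LoginTest", new TestContext("ios", "regression"));
    }
}
```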

Test history information

The history graph in the History column shows a series of nodes. Each node represents a single run of the test.


The History mechanism defines test similarity by test name and execution capabilities. If there is a difference in either the name or the capabilities (for example, the osVersion capability is included in one test run but not in another), the tests are considered 'not similar'. As a result, they are not connected in history.
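
For example, the following sketch (using Selenium's DesiredCapabilities; the device values are hypothetical) shows two runs of the same test that the History mechanism would treat as 'not similar':

```java
import org.openqa.selenium.remote.DesiredCapabilities;

public class HistorySimilarity {
    public static void main(String[] args) {
        // Run 1 of "LoginTest": includes the osVersion capability.
        DesiredCapabilities run1 = new DesiredCapabilities();
        run1.setCapability("model", "iPhone-7");
        run1.setCapability("osVersion", "14.2");

        // Run 2 of "LoginTest": osVersion is omitted.
        DesiredCapabilities run2 = new DesiredCapabilities();
        run2.setCapability("model", "iPhone-7");

        // Because the capability sets differ, the two runs are considered
        // 'not similar' and are not connected in the History column, even
        // though the test name is identical.
    }
}
```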

The test run whose details are described in this report is displayed as a double ring in the history. This makes it easier to identify when this test run was executed relative to other test runs. When reading the history graph, keep in mind that:

  • The latest test run is always represented by the right-most node.
  • No more than five nodes appear in the history graph. If the specific run occurred prior to the five latest runs, the graph shows a break represented by three dots.
  • The color of each node represents the test result status for that particular run, where green means 'passed', yellow means 'blocked', and red means 'failed'.
  • Move the pointer over a node to display a tooltip with details for that run.

  • Arrows around an icon identify test runs with scheduled retries that have been collapsed into a single test report.

    Note: The Scheduled Retries feature is turned off by default. To turn it on in your cloud instance, contact Perfecto Support.

    For a test to be considered a retry, it must share the same parameters and CI job name and number or be part of the same execution. Perfecto does not list a test that is considered a retry in the table and does not take it into account when calculating statistics. Only the last test in a retry series makes it into the statistics. For more information, see STR.

Failure/blocked reason

Smart Reporting is designed not only to provide the test data from your test runs but also to help you, the tester or test manager, better understand the results of your tests. If your test ends with a success status, you know that all is well. When a test fails, however, Smart Reporting may analyze the test data and provide a failure reason that indicates what caused the test to fail.

The Smart Reporting system provides functionality that supports generating this failure reason classification, either:

  • Manually by the test script, based on the stage in the script execution when the test is determined to have failed.
  • Automatically. Smart Reporting analyzes the entire test information and generates the reason based on a heuristic classification.

In either case, the failure reason is color-coded (where green means 'passed', yellow means 'blocked', and red means 'failed') and displayed in the Failure/Blocked Reason column of the test reports table, allowing a quick overview of the different failed tests. If a test fails without reporting a failure reason, the Status column shows a red icon, but the Failure/Blocked Reason column is blank.
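
For the manual case, the test script attaches the failure reason when it stops the test. The following is a minimal sketch that assumes the Java Reportium client and a custom failure reason name ('Element not found', hypothetical here) that has already been defined in your cloud instance:

```java
import com.perfecto.reportium.client.ReportiumClient;
import com.perfecto.reportium.test.result.TestResultFactory;
import org.openqa.selenium.NoSuchElementException;

public class FailureReasonReporting {
    static void finishTest(ReportiumClient reportiumClient) {
        try {
            // ... test steps ...
            reportiumClient.testStop(TestResultFactory.createSuccess());
        } catch (NoSuchElementException e) {
            // The third argument names the failure reason to display in the
            // Failure/Blocked Reason column; it must match a reason defined
            // in the cloud instance.
            reportiumClient.testStop(
                    TestResultFactory.createFailure("Login button missing", e, "Element not found"));
        }
    }
}
```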

The Smart Reporting system automatically identifies instances where the test failed before it was able to start. When these failures occur, the report is marked with a blocked failure reason (shown in yellow in the Failure/Blocked Reason column), and the Status column shows a Blocked status. These blocked failure reasons, including both the failure reason text and the color, are completely controlled by the Smart Reporting system. Some examples of blocked failure reasons include:

  • Device in use

  • Device not connected

  • Device not found

Tags column

You can move the pointer over the tag icon in the Tags column for a specific test to display a tooltip that lists all tags associated with the test.

The tooltip is interactive: clicking a tag filters the table to show only tests associated with that tag.

(Video: tagTooltipAction.mp4, demonstrating the tag tooltip action.)

Sidebar

Use the sidebar on the right to:
  • Filter the list of displayed test runs to those that match the selected parameters or time frame.

  • Create groups of test runs to display in the groups overview.

For more information on the sidebar, read here.

Focus on what's important to you

The following sections explain how you can group and filter information in the Report Library to narrow your focus on the details you need to see.

Search reports by name

When the Report Library includes a wide variety of tests, you can use the Search field at the top to focus on the set of tests you are interested in.

To search reports:

  1. Start typing the name (or any substring of it). Smart Reporting presents a list of suggested test names.
  2. Do one of the following:
    • Select the test name from the list of suggestions.
    • Finish typing the requested name and press Enter.

The list refreshes to display only tests with names that match the search term. If no tests match the search term, a 'No reports found' message appears.

Filter by status

You can use the Statistical Overview area to focus on reports with a selected final status:

  • Click a counter (Failed, Passed, Blocked, or Unknown) to filter the list to show only those reports with the selected status.
  • Click Tests to list all test reports.

For example, if 26 tests have a status of Blocked, clicking the Blocked counter filters the list to display only the 26 reports that are currently blocked.

Configure columns

The table may show information that is not relevant at the moment. You can configure it to hide or show columns as needed.

To configure columns:

  1. Click the pliers icon above the table to display a menu of columns you can configure to appear in the Report Library table.

    Note: Custom fields defined in a test appear as column names at the end of the list (see the sketch after these steps).

  2. Select the check boxes of the columns you want to include in the table, or clear the check boxes of the columns you don't want to display.

    Note: The Report Name and Status columns are not configurable. They are always selected.
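
Custom fields and tags are attached to a test through the reporting client's test context. A minimal sketch, assuming the Java Reportium client (the field key and all values are hypothetical):

```java
import com.perfecto.reportium.model.CustomField;
import com.perfecto.reportium.test.TestContext;

public class CustomFieldContext {
    static TestContext buildContext() {
        // Each custom field key ("team" here) becomes an optional column at
        // the end of the column menu; execution tags feed the Tags column.
        return new TestContext.Builder()
                .withCustomFields(new CustomField("team", "payments"))
                .withTestExecutionTags("sanity", "ios")
                .build();
    }
}
```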

Filter by tags

Use the tags defined by the different context classes to focus your analysis on the subset of test results you are interested in. See Test analysis: best practices for guidance on optimizing your test tags for efficient test analysis.

To filter by tags, do one of the following:

  • In the Tags column, click the tag icon. In the tooltip that appears, click the tag you want to filter by.
  • In the sidebar, expand the Tags section and select one or more relevant tags.

The following figure shows the table filtered by the tags iPhone 7 and XCTest. Note the indications of the filtering in the right pane.