Single test report (execution report)

The Test Analysis views provide a good overview of the trends and highlights of your test program. To get more specific information about a particular test, drill down to the Single Test Report (STR), also commonly referred to as an execution report or simply a report, by clicking a test report in the Report Library.

Overview

Clicking the name of an execution report in the Report Library opens the specific STR. The report displays a list of the logical test steps on the left and a visual area for screenshots, video, and text artifacts, if available, on the right. The visual area includes a video and vitals timeline at the bottom, with the timeline points corresponding to the logical steps on the left.

Restriction: Depending on the testing framework you use to run your tests, not all features described in this section may apply. For example, if you work with XCUITest or Espresso, you do not see individual test steps, you cannot add custom failure reasons from within tests, and the report may only include a single screenshot. For details on reporting limitations, see the dedicated framework documentation in the Automation testing section.

View report details

Use the Report Details button  in the upper right corner of the STR view to display the Report Details dialog box. This dialog box shows information related to the selected test, as follows:

  • EXECUTION tab: Displays data associated with the test run, including:

    • Basic execution data

    • Job information

    • Project information

    • Custom field names and values

    • Any tags associated with the run

  • DEVICE tab: Displays information about the device used for the test. (For multiple device tests, see Tests on multiple devices below.) Information includes:

    • Device name and manufacturer

    • Device ID

    • OS version and firmware

    • Resolution

    • Location

Access source code

Sometimes, a test run does not complete as expected and results in a status of Failed. In such a case, it is often easier to understand what went wrong, or what needs to be fixed in the test script, if you can view the source code. This functionality depends on the test run supplying the relevant information, as described below. Perfecto displays the source code in a new browser tab.

Links to source code are configurable via custom fields set by the test run.

  • The Open commit link displays if the test run sets the perfecto.vcs.commit custom field.
  • The Open source file link displays if the test run sets the perfecto.vcs.filePath custom field.
  • The Open source file and Open commit links display but appear inactive if the test run does not set either of these custom fields. Moving the pointer over an inactive link opens a tooltip encouraging you to set the custom fields for future test runs.

If the test run sets both custom fields, both links display (Open source file and Open commit).
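
If you drive Smart Reporting from your test framework through the Reportium Java client, these custom fields can be set when building the execution context. The following is a minimal sketch, assuming the Reportium Java SDK; the project name, commit hash, and file path are placeholders, and package or builder names may differ between SDK versions.

    // Minimal sketch: setting the VCS custom fields so the STR can render the
    // "Open commit" and "Open source file" links. Values below are placeholders.
    import com.perfecto.reportium.client.ReportiumClient;
    import com.perfecto.reportium.client.ReportiumClientFactory;
    import com.perfecto.reportium.model.CustomField;
    import com.perfecto.reportium.model.PerfectoExecutionContext;
    import com.perfecto.reportium.model.Project;
    import org.openqa.selenium.remote.RemoteWebDriver;

    public class VcsCustomFieldsExample {

        public static ReportiumClient createClient(RemoteWebDriver driver) {
            PerfectoExecutionContext context =
                    new PerfectoExecutionContext.PerfectoExecutionContextBuilder()
                            .withProject(new Project("My Project", "1.0")) // placeholder project
                            .withCustomFields(
                                    // Enables the "Open commit" link in the STR
                                    new CustomField("perfecto.vcs.commit", "9fceb02a1b2c3d4e"),
                                    // Enables the "Open source file" link in the STR
                                    new CustomField("perfecto.vcs.filePath", "src/test/java/LoginTest.java"))
                            .withWebDriver(driver)
                            .build();
            return new ReportiumClientFactory().createPerfectoReportiumClient(context);
        }
    }

Because the fields are part of the execution context, they apply to every test reported through that client instance.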

To open the source code from the STR:

  • If the STR displays an error message, do the following:
    1. Open the full error message. 
    2. Below the message, click Open source file.
  • For all STRs:
    1. Click the Report Details button .
    2. At the bottom of the Report Details dialog box, click Open source file.

Download artifacts

Use the download button  at the top right to download any of the following resources to your local machine:

  • A full PDF report and an assertions report that include the basic test execution information displayed in the STR. For details, see Formatted PDF reports.

  • Video of the test run. Perfecto records a video of the whole execution.

  • Network files (HAR files). For information on setting up HAR files, see Generate and analyze HAR files.

  • Vitals. For information on the vitals collected during manual testing, see Collect device or application vitals.

  • Device logs with data that can help you understand execution problems. Device logs for manual and automation tests are available as .zip files. For information on viewing the device log during manual testing, see View the device log.

  • Textual files that were uploaded during test execution.

  • Cypress logs if you work with the Cypress framework.
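
If you prefer to retrieve these artifacts programmatically rather than through the download button, the Perfecto Reporting public API exposes test executions together with links to their artifacts. The sketch below assumes a Java 11+ environment; the host name, endpoint path, and PERFECTO_AUTHORIZATION header reflect the Reporting API documentation but should be verified against your cloud, and the cloud name and token variable are placeholders.

    // Minimal sketch: querying the Reporting public API export endpoint. The JSON
    // response lists recent test executions with links to their artifacts
    // (video, device logs, HAR files, and so on).
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ExportTestExecutions {

        public static void main(String[] args) throws Exception {
            String cloud = "mycloud";                                        // placeholder cloud name
            String securityToken = System.getenv("PERFECTO_SECURITY_TOKEN"); // placeholder env var

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://" + cloud
                            + ".reporting.perfectomobile.com/export/api/v1/test-executions"))
                    .header("PERFECTO_AUTHORIZATION", securityToken)
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // Inspect the artifact URLs in the JSON payload and download them as needed.
            System.out.println(response.body());
        }
    }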

Application crash report

There are times when a test fails because the application under test crashes. This is actually a case where the test succeeded in uncovering an application fault. When this occurs, the mobile operating system generates a crash log that includes information that the application developer can then use to identify the cause of the crash.

Perfecto identifies these situations, retrieves the crash log from the device, and notifies the Smart Reporting analytics of the status. Smart Reporting identifies the failure reason for this type of test as Application crashed. It adds the crash log as an artifact to the test report.

This is supported on all iOS versions, Samsung devices running Android 7.0 and later, and other Android devices running Android 6.0 and later. 

To retrieve the crash report for the test:

  1. In the upper right, click the download button  for the test report.
  2. From the menu, select App-Crash-Report to download the log file.

Tests on multiple devices

When a Perfecto Native Automation script allocates multiple devices to run the test, the reporting system gathers artifacts (screenshots, video) from all of the devices involved. At the completion of the test execution, it generates a single report that includes the artifacts from all devices.

Restriction: Displaying a single report for multiple devices is available only for Perfecto Native Automation.