Last updated: Nov 12, 2019 16:20
The Test Analysis views provide a good overview of the trends and highlights of your test program. To get more specific information about a particular test, drill down to the Single Test Report (STR) by clicking the test report in the Report Library.
Single Test Report overview
Clicking the name of a test report in the Report Library opens the specific Single Test Report (STR), as shown in the following figure.
The left panel of the STR View shows a list of the logical steps that comprise the test (as named in the testStep() method).
Reports for native automation executions that activate nested scripts include the steps of both the main script and the nested script. The commands of the nested scripts are identified by a special symbol ("</>").
Clicking a logical step reveals a view of the artifacts (video, screenshots, expected vs. actual values) associated with the particular command/step.
Click a command to display detailed information about the command execution, including:
- Timer information: Displays the Perfecto Timer and UX Timer values when the command was executed.
- Parameter information: Identifies the following:
- The device used for the command
- The UI element (if the command accessed a UI element)
- The text sent to the UI element (if the command inserted text)
Note: If the text was sent as a Secured String, the text value appears as: "***"
- Other information, such as parameters for visual analysis, assertion information, or UI element attribute values
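The Secured String masking rule can be illustrated with a short sketch. The `render_text_parameter` helper below is hypothetical; the actual rendering is internal to Smart Reporting:

```python
def render_text_parameter(text: str, secured: bool) -> str:
    """Return the value shown in the STR for a text parameter.

    Secured Strings are never echoed back; the report shows a
    fixed "***" mask instead of the real value.
    """
    return "***" if secured else text

# A plain string is shown as-is; a Secured String is masked.
print(render_text_parameter("hello", secured=False))     # hello
print(render_text_parameter("p@ssw0rd", secured=True))   # ***
```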
Visual artifact area
The right panel of the STR view presents visual artifacts, for example screenshots or videos from the test run.
When you view the video, the timeline includes indicators that highlight the times at which the logical steps occurred. Moving the pointer over any of these points displays a tooltip that identifies the corresponding logical step.
If the test script displayed in the STR generated an error or failure message, the message is displayed at the top of the visual artifact area.
Initially, only the header line of the error message is displayed on a red background.
To see the full message, together with a stack dump (if relevant), click the down arrow below the error message.
STR header area
The Single Test Report header includes the top two rows of the STR. The header shows the following:
- The top line includes:
- Back to Report Grid button: Reverts the display to Report Library View, regardless of any navigation to other test report views.
- Name of the current test.
- The second line includes:
- Test status - shows the status of the test run of this STR.
- History graph - shows five runs, similar to the history graph in the Report Library view. Selecting one of the nodes of the graph navigates to the STR of the selected run. The tooltip on each node provides information on that run.
The History mechanism defines test similarity by test name and execution capabilities. If either the name or the capabilities differ (for example, the osVersion capability is included in one test run but not in another), the tests are considered 'not similar'. As a result, they are not connected in history.
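The similarity rule can be summarized as a small predicate. This is an illustrative sketch (the `similar` helper and the run dictionaries are hypothetical; the real History mechanism is internal to Smart Reporting):

```python
def similar(run_a: dict, run_b: dict) -> bool:
    """Two test runs are connected in history only when both the
    test name and the full set of execution capabilities match."""
    return (run_a["name"] == run_b["name"]
            and run_a["capabilities"] == run_b["capabilities"])

run1 = {"name": "LoginTest",
        "capabilities": {"platformName": "Android", "osVersion": "9"}}
run2 = {"name": "LoginTest",
        "capabilities": {"platformName": "Android"}}  # osVersion omitted

# Same name but different capability sets: the runs are considered
# 'not similar' and are not connected in the history graph.
print(similar(run1, run2))  # False
```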
- Run information - Start time and duration information of the test's run.
- Device information - information on the device or devices used for the test run.
- Activate interactive session icon - Opens the device in a Perfecto Lab interactive session. If the device is not available, the Perfecto Lab prompts you to select another device.
- Tags - list of tags associated with the test run.
- JIRA bug reporting icon - appears if Smart Reporting is integrated with JIRA. Supports entering bug reports directly as a JIRA issue.
- Report Details button - displays detailed information on the test run data, and device(s) data.
- Open Support Case - Connects directly to Perfecto Support to allow you to open a new incident.
- Download button - supports accessing and downloading the artifacts (video, log files) associated with the test run.
Failure Reason display
When the test script or the Smart Reporting heuristics have identified a Failure reason, it appears in the second line next to the Test status (only if the status is Failed). The Failure reason is one of those configured for the CQ Lab by the administrator.
If no Failure reason was identified for a failed test, the Failure reason is displayed as Add failure reason. This allows you to update the test report with a valid Failure reason (see below).
Add (update) the failure reason
If no Failure reason was assigned to the failed test, add a new Failure reason (or update the current one) as follows:
- Click the Add failure reason button (or the button with the current Failure reason). A menu showing all configured Failure reasons is displayed.
- Select the correct Failure reason from the list. The Failure reason is updated.
Notes on updating the Failure reason:
- If a Failure reason is assigned already, the menu will include an entry (Clear failure reason) that removes the assigned Failure reason (leaving the test report without any Failure reason).
- If the assigned reason is a Blocked reason, you cannot update it.
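These rules can be sketched in a few lines of code. The `build_failure_reason_menu` helper is hypothetical; the configured reasons and the blocked flag come from the CQ Lab administrator settings:

```python
def build_failure_reason_menu(configured, assigned=None, blocked=False):
    """Return the menu entries offered when updating a Failure reason.

    - A Blocked reason cannot be updated, so no menu is offered.
    - If a reason is already assigned, a 'Clear failure reason'
      entry is added so the assignment can be removed.
    """
    if blocked:
        return []  # assigned reason is a Blocked reason: not updatable
    menu = list(configured)
    if assigned is not None:
        menu.append("Clear failure reason")
    return menu

reasons = ["Element not found", "Application crashed", "Environment issue"]
print(build_failure_reason_menu(reasons))  # no reason assigned yet
print(build_failure_reason_menu(reasons, assigned="Environment issue"))
print(build_failure_reason_menu(reasons, assigned="Blocked", blocked=True))  # []
```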
View report details
Click the Report Details button in the upper right corner of the STR view (see the figure above). This displays a popup window with detailed information related to the test run.
The popup includes two (or more) tabs of information:
- EXECUTION tab - displays the data associated with the test run including:
- Basic execution data
- Job Information (see above)
- Project Information
- Custom Field names and values (see above)
- Any Tags associated with the run.
- DEVICE tab - displays information regarding the device used for the test. (For multiple device tests see next section.) Information includes:
- Device name and manufacturer
- Device ID
- OS version and firmware
Access source code from the STR
Sometimes a test run does not complete as expected and ends with a failure status. In many of these cases, it is easier to understand what went wrong, or what needs to be fixed in the test script, if you can view the source code. This functionality depends on the tester supplying the information described in the related article.
The source code will be displayed in a new browser tab. There are two access points for the source code:
- If the STR displays an error message, open the error message; at the bottom there are links to open the source file display (see below).
- For any STR, click the Report Details button; at the bottom of the Report Details popup window there are links to open the source file display (see below).
Access source information links
There are three configured links to access source code information:
1. Open commit link - Displayed if the perfecto.vcs.commit custom field was set by the test run.
2. Open source file link - Displayed if the perfecto.vcs.filePath custom field was set by the test run.
3. Open source file and commit link - Displayed if neither the perfecto.vcs.commit nor the perfecto.vcs.filePath custom field was set by the test run.
If both custom fields were set by the test run, both links 1 and 2 are displayed.
Hovering over link 3 displays a tooltip encouraging the user to set the custom fields for future test runs.
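The link-display rules above can be sketched as a short function (the `source_links` helper is hypothetical; only the two perfecto.vcs custom field names come from the documentation):

```python
def source_links(custom_fields: dict) -> list:
    """Return the source-access links shown in the STR, based on
    which perfecto.vcs custom fields the test run set."""
    has_commit = "perfecto.vcs.commit" in custom_fields
    has_file = "perfecto.vcs.filePath" in custom_fields
    links = []
    if has_commit:
        links.append("Open commit")
    if has_file:
        links.append("Open source file")
    if not has_commit and not has_file:
        # Link 3 is shown only when neither custom field was set.
        links.append("Open source file and commit")
    return links

print(source_links({"perfecto.vcs.commit": "abc123"}))   # ['Open commit']
print(source_links({}))  # ['Open source file and commit']
print(source_links({"perfecto.vcs.commit": "abc123",
                    "perfecto.vcs.filePath": "src/LoginTest.java"}))
```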
The links are displayed in either of the following locations:
- In the error message, at the bottom of the display, when you pull down to see the full error message (as shown above).
- At the bottom of the Report Details window.
Display the source code
When you click the Open commit or Open source file link, Smart Reporting opens a new browser tab and navigates directly to the commit or the source file in the VCS.
Download the test report and artifacts
You can download a PDF version of the test report to your local workstation through the Download menu, by clicking the download button.
Test reports include the basic information on the test execution displayed in the STR, parts of which may be downloaded to your local workstation using the Download menu. In addition, the test may attach additional log files - of device activity, network activity, or other vitals information - as artifacts.
Application crash report
There are times when a test "fails" because the application under test crashes - actually a case where the test succeeded in uncovering an application fault. When this occurs, the mobile operating system generates a crash log that includes information that the application developer can use to identify the cause of the crash.
The Perfecto system identifies these situations, retrieves the crash log from the device, and notifies the Smart Reporting analytics of the status. Smart Reporting identifies the failure reason for this type of test as Application crashed and adds the crash log as an artifact to the test report.
This is supported on all iOS versions, on Samsung devices running Android 7.0 and above, and on other Android devices running Android 6.0 and above.
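The support matrix can be expressed as a small check. This is an illustrative sketch (the `crash_log_supported` helper is hypothetical; only the version thresholds come from the documentation):

```python
def crash_log_supported(platform: str, manufacturer: str = "",
                        os_version: str = "") -> bool:
    """Check whether crash-log retrieval is supported for a device.

    Supported on all iOS versions, on Samsung devices running
    Android 7.0 and above, and on other Android devices running
    Android 6.0 and above.
    """
    if platform.lower() == "ios":
        return True
    if platform.lower() == "android":
        major = int(os_version.split(".")[0])
        minimum = 7 if manufacturer.lower() == "samsung" else 6
        return major >= minimum
    return False

print(crash_log_supported("iOS", os_version="12.1"))     # True
print(crash_log_supported("Android", "Samsung", "6.0"))  # False - Samsung needs 7.0+
print(crash_log_supported("Android", "Google", "6.0"))   # True
```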
To retrieve the crash report for the test:
- Click the download menu button for the test report.
- Click the App-Crash-Report entry to download the log file.
Tests on multiple devices
When a Perfecto Native Automation test script allocates multiple devices to run the test, the reporting system gathers artifacts (screenshots, video) from all of the devices involved. At the completion of the test execution, it generates a single report that includes the artifacts from all devices.
Multiple devices in the Report Library view
Test runs that activated multiple devices are listed in the Report Library view with the following indications that multiple devices were involved:
- Platform type column - indication of number of devices used.
- Device column - list of all devices used.
- OS column - list of all OS versions used, corresponding to the devices listed.
- Resolution column - list of the device resolution for each device used.
Multiple devices in the Single Test Report view
When drilling down to the STR of a test that activated multiple devices:
- The device button on the test status line will indicate the number of devices involved in the test run. Hovering over the button will open a tooltip that indicates the names and OS version of the devices involved.
- Clicking on the button will display the Report Details window, with tabs for each of the devices involved in the test run.
- The video shows all devices involved in the test run. Control which devices to display (or hide) using:
- The Show devices menu - checked devices are displayed, unchecked devices will not be displayed.
- The Remove device button - to stop displaying the individual device.
- Screenshots are available from all of the devices.