Chapter 5: Test Automation Reporting and Metrics
Selection of TAS Metrics
External TAS metrics
Automation benefits
Effort to build automated tests
Effort to analyze automated test incidents
Effort to maintain automated tests
Ratio of failures to defects
Time to execute automated tests
Number of automated test cases
Number of pass and fail results
Number of false-fail and false-pass results
Code coverage
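Several of the external metrics above can be computed directly from a run's result records. The sketch below is a minimal illustration, assuming a hypothetical per-test result structure (the field names `defect_found` and `false_fail` are illustrative, not prescribed by any standard): the ratio of failures to defects and the false-fail rate fall out of simple counting.

```python
from dataclasses import dataclass

# Hypothetical result record; field names are illustrative only.
@dataclass
class TestResult:
    name: str
    passed: bool
    defect_found: bool   # failure traced to a real SUT defect
    false_fail: bool     # failure caused by the TAS/environment, not the SUT

def summarize(results):
    failures = [r for r in results if not r.passed]
    defects = sum(1 for r in failures if r.defect_found)
    false_fails = sum(1 for r in failures if r.false_fail)
    return {
        "total": len(results),
        "passed": len(results) - len(failures),
        "failed": len(failures),
        "failure_to_defect_ratio": len(failures) / defects if defects else None,
        "false_fail_rate": false_fails / len(failures) if failures else 0.0,
    }

results = [
    TestResult("login", True, False, False),
    TestResult("search", False, True, False),
    TestResult("checkout", False, False, True),
]
print(summarize(results))
```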
Internal TAS metrics
Tool scripting metrics
Automation code defect density
Speed and efficiency of TAS components
Trend metrics
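A trend metric compares results across builds rather than within one run. As a minimal sketch (the per-build history tuples here are invented sample data), the pass rate per build can be tracked over time to see whether the TAS is improving or degrading:

```python
def pass_rates(runs):
    """runs: list of (passed, total) per build, oldest first."""
    return [round(passed / total, 2) for passed, total in runs]

# Invented sample history: three consecutive builds.
history = [(80, 100), (85, 100), (90, 100)]
rates = pass_rates(history)
print(rates, "improving" if rates[-1] > rates[0] else "declining")
```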
Implementation of Measurement
Features of automation that support measurement and report generation
Integration with third-party tools (spreadsheets, XML, documents, databases, reporting tools)
Visualization of results (dashboards, charts, graphs)
Logging of the TAS and the SUT
TAS logging
Test case start and end time
Status of the test case execution
Details of test logs, including timing
Dynamic information about the TAS, e.g., memory leaks
A counter should be logged, e.g., for reliability/stress testing
Random numbers/choices should be logged (if test cases have random parameters)
All actions a test case performs should be logged, to enable playback
Screenshots
Crash dumps and stack traces should be saved by the TAS to a safe location
Use of color: errors in red, progress in green
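The TAS logging points above can be combined in a small harness. This is a minimal sketch using Python's standard `logging` module (the test body and names are placeholders): it records start/end times and status, an iteration counter for stress runs, and the random seed so a failing randomized run can be replayed.

```python
import logging
import random
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("TAS")

def run_test(name, iteration):
    # Log the seed so any random choices in the test can be reproduced.
    seed = random.randrange(2**32)
    random.seed(seed)
    log.info("START test=%s iteration=%d seed=%d", name, iteration, seed)
    start = time.monotonic()
    try:
        value = random.randint(1, 100)  # stand-in for a randomized test step
        log.info("ACTION test=%s input=%d", name, value)  # logged for playback
        status = "PASS"
    except Exception:
        log.exception("test crashed")   # stack trace goes to the log
        status = "FAIL"
    log.info("END test=%s status=%s duration=%.3fs",
             name, status, time.monotonic() - start)
    return status

# In reliability/stress testing, the iteration counter identifies which run failed.
for i in range(3):
    run_test("sample_test", i)
```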
SUT logging
Include date and time stamps, source location of the issue, error messages
Log all user interactions
Information about the SUT should be logged
Different software/firmware versions
Configuration of the SUT
Configuration of the operating system
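Environment details like these can be captured automatically and attached to the SUT logs. A minimal sketch using Python's standard `platform` module (the SUT's own software/firmware versions would come from the SUT's API, which is not shown here):

```python
import platform

def sut_environment():
    # Configuration details worth recording alongside SUT logs.
    return {
        "os": platform.platform(),
        "os_release": platform.release(),
        "machine": platform.machine(),
        "python": platform.python_version(),
    }

print(sut_environment())
```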
Test Automation Reporting
Content of the reports
Overview of the execution results
The system being tested
Test environment information
Publishing the reports
Upload to a website
Send by email
Use a test management tool
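Publishing by email can be automated from the run summary. A minimal sketch using the standard `email.message` module (addresses and server name are placeholders; the actual send via `smtplib` is shown commented out since it needs a real mail server):

```python
from email.message import EmailMessage

def build_report_email(summary, recipients):
    # Build a plain-text execution report email from a results summary.
    msg = EmailMessage()
    msg["Subject"] = f"Test run: {summary['passed']}/{summary['total']} passed"
    msg["From"] = "tas@example.com"          # placeholder sender
    msg["To"] = ", ".join(recipients)
    msg.set_content("\n".join(f"{k}: {v}" for k, v in summary.items()))
    return msg

summary = {"total": 10, "passed": 9, "failed": 1}
msg = build_report_email(summary, ["team@example.com"])
print(msg["Subject"])

# Actual sending would look like:
# import smtplib
# with smtplib.SMTP("mail.example.com") as s:
#     s.send_message(msg)
```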