6. TOOL SUPPORT FOR TESTING
TEST TOOL CLASSIFICATION
tool support for test execution and logging:
test execution tools (e.g. to run regression tests), coverage tools, test harnesses (D), unit test framework tools (D)
tool support for performance measurement and dynamic analysis:
performance and load testing cannot be done effectively manually; performance testing tools, monitoring tools, dynamic analysis tools (D)
tool support for test design and implementation
: help in creating test cases, test procedures and test data; test design tools, model-based testing tools, test data preparation tools, ATDD and BDD tools, test-driven development tools (D)
tool support for static testing
: tools that support reviews, static analysis tools (D)
tool support for specialized testing needs
: data quality assessment, data conversion and migration, usability testing, accessibility testing, localization, security, portability
tool support for management of testing and testware
: applied to any activity throughout the software development lifecycle; test management tools, application lifecycle management tools, requirements management tools, defect management tools, configuration management tools, CI tools (D)
PURPOSES
: improve the efficiency of test activities by automating repetitive tasks or tasks that cannot be executed manually, or by supporting manual test activities; improve quality through more consistent testing and a higher level of defect reproducibility; increase the reliability of testing
probe effect
- the consequence of using intrusive tools, i.e. tools that may affect the actual outcome of the test,
e.g. a different response time is measured
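A toy illustration of the probe effect (all names below are invented for the example): the same computation is timed once without and once with intrusive per-iteration logging, and the measured response times differ.

```python
import logging
import time

logging.basicConfig(filename="trace.log", level=logging.DEBUG)

def work(n=10_000):
    total = 0
    for i in range(n):
        total += i * i
    return total

def work_instrumented(n=10_000):
    total = 0
    for i in range(n):
        total += i * i
        logging.debug("i=%d total=%d", i, total)   # intrusive monitoring inside the loop
    return total

start = time.perf_counter()
work()
plain = time.perf_counter() - start

start = time.perf_counter()
work_instrumented()
instrumented = time.perf_counter() - start

# The instrumented run is noticeably slower: the tool itself changed the
# response time that was being measured (probe effect).
print(f"without tool: {plain:.4f}s  with tool: {instrumented:.4f}s")
```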
MAIN PRINCIPLES FOR TOOL SELECTION
is it available for a free trial period?
evaluation of the vendor or support for open source tools
evaluation of the tool against clear requirements
internal requirements for coaching and mentoring in the use of the tool
understanding of the technologies used by the test object, to choose a proper and compatible tool
evaluation of training needs
identification of opportunities for an improved test process supported by tools
pros and cons of licensing models (commercial vs open source)
maturity of the organization
estimation of a cost-benefit ratio
a proof-of-concept evaluation should be done
SPECIAL CONSIDERATIONS
TEST EXECUTION TOOLS
keyword-driven testing
- a generic script processes keywords describing the actions to be taken, which then calls keyword scripts to process the associated test data
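A minimal sketch of the idea; the keywords (open_page, enter_text, click) and the step table are invented for illustration, and a real keyword-driven tool would map each keyword onto calls into a test automation library.

```python
# Each keyword maps to a small script (here: a function) that performs one action;
# the generic script just walks the keyword table.
def open_page(url):
    print(f"opening {url}")

def enter_text(field, value):
    print(f"typing '{value}' into {field}")

def click(element):
    print(f"clicking {element}")

KEYWORDS = {"open_page": open_page, "enter_text": enter_text, "click": click}

# The test itself is just data: keywords plus their associated test data.
test_steps = [
    ("open_page", ["https://example.test/login"]),
    ("enter_text", ["username", "alice"]),
    ("enter_text", ["password", "secret"]),
    ("click", ["login_button"]),
]

def run(steps):
    """Generic script: interpret each keyword and pass it the test data."""
    for keyword, data in steps:
        KEYWORDS[keyword](*data)

run(test_steps)
```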
Model-Based Testing (MBT) tools
- the specification is captured in the form of a model, e.g. an activity diagram, created by the system designer; the MBT tool interprets the model to create test case specifications
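A rough sketch of the idea; the tiny login state model and the depth-limited walk below are assumptions made for illustration, not how any particular MBT tool works.

```python
# Model: a small state-transition view of a login feature,
# expressed as (state, action) -> next state.
MODEL = {
    ("logged_out", "enter_valid_credentials"): "logged_in",
    ("logged_out", "enter_invalid_credentials"): "error_shown",
    ("error_shown", "enter_valid_credentials"): "logged_in",
    ("logged_in", "log_out"): "logged_out",
}

def derive_test_cases(model, start, depth):
    """Walk the model and emit action sequences (test case specifications)."""
    cases = []

    def walk(state, path):
        if len(path) == depth:
            cases.append(path)
            return
        outgoing = [(a, dst) for (src, a), dst in model.items() if src == state]
        if not outgoing:              # dead end: emit the partial sequence
            cases.append(path)
            return
        for action, dst in outgoing:
            walk(dst, path + [action])

    walk(start, [])
    return cases

for i, case in enumerate(derive_test_cases(MODEL, "logged_out", depth=3), 1):
    print(f"TC{i}: {' -> '.join(case)}")
```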
data-driven testing
- test inputs and expected results are stored in a spreadsheet; a generic test script reads the data and is executed repeatedly with the different data values
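A small data-driven sketch using Python's csv module; the data columns and the login() stand-in for the system under test are invented for the example.

```python
import csv
import io

# In practice this would be a spreadsheet / CSV file maintained by testers;
# an in-memory string keeps the sketch self-contained.
TEST_DATA = io.StringIO(
    "username,password,expected\n"
    "alice,correct-pw,success\n"
    "alice,wrong-pw,failure\n"
    ",correct-pw,failure\n"
)

def login(username, password):
    """Stand-in for the system under test."""
    return "success" if username == "alice" and password == "correct-pw" else "failure"

# Generic test script: one script body, executed once per data row.
for row in csv.DictReader(TEST_DATA):
    actual = login(row["username"], row["password"])
    verdict = "PASS" if actual == row["expected"] else "FAIL"
    print(f"{verdict}: login({row['username']!r}, {row['password']!r}) -> {actual}")
```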
TEST MANAGEMENT TOOLS
These tools need to interface with other tools: to produce information in a format that fits the needs of the organization, to maintain consistent traceability to requirements in the requirements management tool, and to link to test object version information in the configuration management tool
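An illustrative sketch of the kind of consolidated traceability view such interfacing enables; all requirement and test IDs below are made up.

```python
# Requirement IDs as they might come from a requirements management tool.
requirements = {"REQ-1": "User can log in", "REQ-2": "User can reset password"}

# Test results as the test management tool records them, each linked to a requirement.
test_results = [
    {"test": "TC-10", "requirement": "REQ-1", "status": "passed"},
    {"test": "TC-11", "requirement": "REQ-1", "status": "failed"},
    {"test": "TC-12", "requirement": "REQ-2", "status": "passed"},
]

# Consolidated traceability view: which requirements are covered and how they stand.
for req_id, title in requirements.items():
    linked = [r for r in test_results if r["requirement"] == req_id]
    statuses = ", ".join(f"{r['test']}={r['status']}" for r in linked) or "no tests"
    print(f"{req_id} ({title}): {statuses}")
```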
BENEFITS AND RISKS OF AUTOMATION
BENEFITS
greater consistency and repeatability
more objective assessment
reduction in repetitive manual work = saving time
easier access to information about testing (graphs, rates)
RISKS
an open source project may be suspended
possibly no clear ownership of the tool (e.g. no mentoring, poor response to update requests)
time / cost / effort for initial introduction, training, need for changes
a new platform or technology may not be supported by the tool
unrealistic expectations of the tool, relying on it too much
version control of test assets may be neglected
OBJECTIVES FOR USING PILOT PROJECTS TO INTRODUCE TOOLS
deciding on standard ways of using, managing, storing and maintaining the tool and the test assets, e.g. naming conventions, libraries
check whether the benefits will be achieved at reasonable cost
evaluating how the tool fits with processes
understanding the metrics, configuring the tool to ensure these metrics can be captured and reported
gaining in-depth knowledge about the tool
SUCCESS FACTORS
gathering usage information from actual use of the tool
defining guidelines for the use of the tool (internal standards)
monitoring tool use and benefits
providing training, coaching, mentoring for users
providing support to the users of a given tool
adapting and improving processes
gathering lessons learned from all users
gradually rolling out the tool across the organization