9.2.4 - Testing & Evaluating Software Solutions
Testing the software
comparison of the solution with the design specifications
generating relevant test data for complex solutions
comparison of actual with expected output
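Comparing actual with expected output can be sketched as a simple test table. This is a minimal illustration only; the `calculate_discount` function and its expected values are hypothetical stand-ins for a real module and its design specifications.

```python
# Hypothetical module under test: applies a 10% discount to totals over $100.
def calculate_discount(total):
    return round(total * 0.9, 2) if total > 100 else total

# Relevant test data paired with expected output, including a boundary value.
test_cases = [
    (50.00, 50.00),     # below threshold: no discount
    (100.00, 100.00),   # boundary value: no discount
    (200.00, 180.00),   # above threshold: 10% off
]

# Compare actual output against expected output for each case.
for data, expected in test_cases:
    actual = calculate_discount(data)
    result = "PASS" if actual == expected else "FAIL"
    print(f"input={data} expected={expected} actual={actual} {result}")
```

The PASS/FAIL column from a run like this is exactly the kind of documentation referred to under "Reporting on the testing process" below.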
Levels of testing
Module
Test that each module and subroutine functions correctly
use of drivers
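A driver is a small throwaway program written solely to call a module in isolation and report its behaviour, so the module can be tested before the rest of the program exists. A minimal sketch, assuming a hypothetical `validate_mark` subroutine:

```python
# Hypothetical module under test: validates that a mark is an integer 0-100.
def validate_mark(mark):
    return isinstance(mark, int) and 0 <= mark <= 100

# Driver: exercises the module with boundary, valid, and invalid values,
# standing in for the program that will eventually call it.
def driver():
    for mark in [-1, 0, 50, 100, 101, "50"]:
        print(f"validate_mark({mark!r}) -> {validate_mark(mark)}")

if __name__ == "__main__":
    driver()
```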
Program
test that the overall program (including incorporated modules and subroutines) functions correctly
System
test that the overall system (including all programs in the suite) functions correctly, including the interfaces between programs
acceptance testing
Use of live test data to ensure that the testing environment accurately reflects the expected environment in which the new system will operate
Large file sizes
Mix of transaction types
Response times
Volume of data (load testing)
Effect of the new system on the existing systems' environment
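Volume (load) testing with a mix of transaction types can be sketched as below. The `process_transaction` routine here is a hypothetical stand-in for the system under test; the point is generating a large volume of mixed live-style data and measuring response times.

```python
import random
import time

# Hypothetical routine standing in for the system under test.
def process_transaction(txn):
    return sum(ord(c) for c in txn["type"]) + txn["amount"]

# Generate a large volume of test data with a mix of transaction types.
types = ["deposit", "withdrawal", "transfer", "query"]
transactions = [
    {"type": random.choice(types), "amount": random.randint(1, 1000)}
    for _ in range(100_000)
]

# Measure total and average response time under load.
start = time.perf_counter()
for txn in transactions:
    process_transaction(txn)
elapsed = time.perf_counter() - start
print(f"{len(transactions)} transactions in {elapsed:.2f}s "
      f"({elapsed / len(transactions) * 1e6:.1f} µs each)")
```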
Reporting on the testing process
documentation of the test data and output produced (see Course Specifications document)
use CASE tools
communication with those for whom the solution has been developed, including:
test results
comparison with the original design specifications
Evaluating the software solution
verifying the requirements have been met appropriately
Quality assurance
assess the new software solution to ensure that it meets the specified quality assurance criteria
assess the performance of the new software solution against the criteria specified by the benchmark
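Assessing performance against a benchmark can be sketched by timing the new solution and comparing the result to the specified criterion. The sort routine and the 2-second figure here are assumptions for illustration, not values from any actual specification.

```python
import random
import time

# Hypothetical new solution: sort a large dataset.
def new_solution(data):
    return sorted(data)

data = [random.random() for _ in range(1_000_000)]

BENCHMARK_SECONDS = 2.0  # assumed criterion from the design specifications

# Time the solution and judge it against the benchmark.
start = time.perf_counter()
new_solution(data)
elapsed = time.perf_counter() - start

verdict = "meets" if elapsed <= BENCHMARK_SECONDS else "fails"
print(f"Elapsed {elapsed:.3f}s: {verdict} the {BENCHMARK_SECONDS}s benchmark")
```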
Post implementation review
facilitation of open discussion and evaluation with the client
client sign-off process