3. Static testing techniques
Value of Static testing
Detect defects at the earliest stage
Identify defects that cannot be detected by dynamic testing
Provide the ability to evaluate the quality of work products and to build confidence in them
Improve communication
Reduce the cost, time and effort that would otherwise be spent fixing defects later in the project
Code defects can be detected more efficiently by static analysis than by dynamic testing
Differences between Static Testing and Dynamic Testing
Static Testing
Finds defects; applied to non-executable work products
Includes:
Review (manual)
Static analysis (e.g. compiler, Jenkins)
Typical defects
Review:
Defects in requirements
Design defects
Incorrect interface specifications
Gaps or inaccuracies in test basis coverage
Static analysis (tool; see the example below):
Coding defects
Deviations from standards
Specific types of security vulnerabilities
Maintainability defects
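A minimal sketch in C of the kinds of defects listed above; the file name is made up and the tool invocations are only examples. A compiler or static analysis tool (e.g. gcc -Wall -Wextra -fanalyzer, clang-tidy, or a commercial analyzer) can typically report defects like these without executing the code:

    /* example.c - deliberately defective snippet (illustration only) */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int divide(int a, int b) {
        int unused = 42;            /* deviation from coding standards: unused variable          */
        return a / b;               /* possible division by zero when b == 0                     */
    }

    int main(void) {
        char buf[4];
        strcpy(buf, "overflow");    /* security vulnerability: buffer overflow (9 bytes into 4)  */
        int *p = malloc(sizeof(int) * 10);
        p[10] = divide(10, 0);      /* coding defect: out-of-bounds write (valid indices 0..9)   */
        printf("%d\n", p[0]);       /* read of a value that was never initialized                */
        return 0;                   /* p is never freed: resource/maintainability defect         */
    }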
Dynamic Testing
Finds failures; can only be applied to executable work products
Includes:
Techniques to design test cases, test data, test inputs and expected results
Retesting, regression testing, test automation, dynamic analysis
Typical defects:
Failures
Poor non-functional quality (performance, security)
Code coverage
Memory leaks (dynamic analysis; see the sketch below)
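By contrast, a memory leak is usually confirmed at run time. A small sketch (file and function names are illustrative): each loop iteration below allocates a buffer that is never freed, and a dynamic analysis tool such as Valgrind (valgrind --leak-check=full ./a.out) or LeakSanitizer (compile with -fsanitize=address) reports the leaked memory while or after the program runs:

    /* leak.c - illustration of a memory leak found by dynamic analysis */
    #include <stdlib.h>
    #include <string.h>

    static char *copy_message(const char *msg) {
        char *buf = malloc(strlen(msg) + 1);  /* allocate a copy of the message ...            */
        if (buf != NULL)
            strcpy(buf, msg);
        return buf;                           /* ... ownership passes to the caller            */
    }

    int main(void) {
        for (int i = 0; i < 1000; i++) {
            copy_message("hello");            /* return value discarded: 6 bytes leak per call */
        }
        return 0;                             /* ~6 KB reported as "definitely lost" by Valgrind */
    }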
Review process
Review process & responsibilities
Planning:
Defining:
-- scope, purpose
-- the work product to be reviewed
-- quality characteristics to be evaluated
-- areas to focus on
-- exit criteria
-- supporting information such as standards
-- effort and timeframes
Roles:
-- Management: decides what is to be reviewed, provides resources & time
-- Facilitator (Moderator): creates the review plan, leads the review
Initiate Review (Kick-off meeting)
The goal: everyone and everything involved is prepared to start the review
-- Everyone has access to the work product under review
-- Everyone understands their roles and responsibilities
-- Everyone receives everything needed to perform the review
Individual Review (preparation)
Main tasks:
-- Self-review
-- Identify and log anomalies, recommendations and questions by applying one or more review techniques
Roles:
-- Scribe (recorder): collects anomalies from reviewers
Issue communication and analysis (Review meeting)
Main tasks:
-- Analyze and discuss all anomalies
-- Make decisions on the status, ownership and required actions for each anomaly
-- Evaluate the quality level of the reviewed work product
-- Decide whether follow-up actions are required
Roles:
-- Scribe (recorder): records review information (decision, new anomalies)
Fixing (Rework)
Main tasks:
-- A defect report should be created
-- Corrective actions can be followed up
Roles:
-- Author: updates the document
Reporting (Follow-up)
Main tasks:
-- Once the exit criteria are reached, the work product can be accepted
-- The review results are reported
Review Types
Informal Review (Pair review)
Purpose
Detect anomalies
A cheap way to find defects
Leader: none
Review process
No defined process
Does not require a formal documented output
Formal Review
Walkthrough (Demo)
Purpose
Educate reviewers
Generate new ideas
Leader: Author
Review process
Follows the review process
Individual preparation is optional
Technical Review (Peer review)
Purpose
Gain consensus
Make decisions regarding a technical problem
Leader: Moderator
Review process
Follows the review process
Management participation is optional
Inspection
Purpose
Find the maximum number of anomalies
Leader: a trained moderator
Review process
Follows the most formal review process
Mandatory:
-- Metrics are collected and used to improve the SDLC
-- The author cannot act as the review leader or recorder
Review Techniques