ISTQB - Testing
CHAPTER 1: Fundamentals of testing
Why necessary?
Software defects
Caused by people
Also by other things: environmental conditions, damage, attackers
Typical scenarios
Defect in requirements/design specifications
Defects introduced when implementing or repairing code, metadata or documentation
Software is not perfect
People are fallible
Quality is important
Terms
Error (mistake)
Defect (bug, fault)
Failure
Quality
False-fail result/False-positive result
Risk
False-pass result/False-negative result
Test design specification
Test control
Test case
Test objective
Testing
Requirements
Review
Debugging
Confirmation testing (re-testing)
Test strategy
Test execution
Test approach
Test plan
Test monitoring
Test condition
Test basis
Test data
Coverage (test coverage)
Test procedure specification (test procedure, test script, manual test script)
Test suite
Testware
Regression testing
Exit criteria
Test summary report
Error guessing
Independence of testing
Test policy
Role of testing and its effect on quality
Reduce the level of risk
Find defects
Testing -> Debugging -> Repairing affects quality
Provide ways to measure quality
Provide learning chances
Avoid similar defects
How much testing is enough?
Infinite test cases but finite time
Select most valuable tests
The project's time and budget are considered
What should be covered, what can be covered, what is necessary
Not just to reduce risks but also to meet contractual or legal requirements, or industry-specific standards
What is testing?
Activities involved in testing (the test process)
Test planning
Test control
Test analysis
Test design
Test implementation
Test execution
Checking results
Evaluating exit criteria
Test results reporting
Test closure
Static testing
Dynamic testing
Some objectives
Finding defects
Gaining confidence in the level of quality
Providing information for decision-making, such as satisfaction of entry or exit criteria
Preventing defects
The assessment of the quality of the software
Assessing system characteristics such as reliability, security, performance or availability
Debugging is not a testing activity
Seven testing principles
Testing shows the presence of defects
Exhaustive testing is impossible
Early testing
Defect clustering
Pesticide paradox
Testing is context dependent
Absence-of-errors fallacy
Fundamental test process
Planning and control
Planning the test process, time, people, ...
Controlling the process, deadlines, changes and risks; reporting status and monitoring progress
Test analysis and design
Review test basis
Evaluate the testability
Identify and prioritize specific test conditions
Design and prioritize high level (i.e. abstract or logical) test cases
Identify the necessary test data
Design the test environment
Create traceability between the test basis documents and the test cases
Test implementation and execution
Test implementation
Identify and create specific test data
Develop and prioritize test procedures
If automation is to occur, prepare test harnesses and write automated test scripts
Verify that the test environment has been set up correctly
Verify and update the bi-directional traceability prepared previously
Finalize, implement and prioritize the test cases
Test execution
Execute the test procedures manually or automatically
Compare actual results with expected results
Log the outcome of test execution
Analyze the incidents in order to establish their cause (a sketch of the execute/compare/log loop follows this block)
Regression testing
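A minimal sketch of the execute/compare/log loop mentioned above (the system under test, the test cases and the log format are all hypothetical):

```python
# Hypothetical: each test case pairs an input with an expected result.
test_cases = [
    {"id": "TC-01", "input": 4,  "expected": 16},
    {"id": "TC-02", "input": -3, "expected": 9},
]

def system_under_test(x: int) -> int:
    return x * x  # stand-in for the real component

# Execute each procedure, compare actual vs. expected, log the outcome.
for tc in test_cases:
    actual = system_under_test(tc["input"])
    verdict = "PASS" if actual == tc["expected"] else "FAIL"
    print(f'{tc["id"]}: expected={tc["expected"]} actual={actual} -> {verdict}')
```

A FAIL entry becomes an incident whose cause must then be analyzed.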
Evaluating exit criteria and reporting
Check the test logs against the exit criteria
Assess if more tests are needed or if the exit criteria specified should be changed
Write a test summary report for stakeholders
Test closure activities
Check which deliverables were delivered and which were missed
Ensure incidents are resolved and log change requests for any that remain open
Document the acceptance of the system
Finalize and archive testware. Handover testware (if necessary)
Analyze lessons learned
Use the information gathered to improve test maturity
Psychology of Testing
Traits
Curiosity
Professional pessimism
A critical eye
Attention to detail
Experience
Good communication skills
Finding defects in order to build confidence -> the role of testing
Independence of testing
Tested by the work product's developer -> low level of independence
Tested by someone else within the development team -> still a low level of independence
Tested by a separate organizational group (independent test team or tester) -> high level of independence
Tested by another organization or company (e.g. a certification body) -> highest level of independence
Basics for good communication
Be teammates, not just colleagues
Everyone takes pride in their work
Put yourself in the place of the work product's stakeholders
Beware of cognitive dissonance: make sure the other person has understood before continuing, and acknowledge what they did well before delivering bad news
Code of ethics
PUBLIC
CLIENT AND EMPLOYER
PRODUCT
JUDGMENT
MANAGEMENT
PROFESSION
COLLEAGUES
SELF
CHAPTER 4: Test Design Techniques
Test development process
Formality
Formal
Documented
Well controlled
Depends on budget, culture and other factors
Informal
Usually no documentation
Often done individually
Test analysis
Based on the test basis
Define test conditions
Many test design techniques
Must have traceability
Horizontal
Vertical
Template
Test design
Specifying test cases
Template
Test implementation
Specifying test procedures or scripts
Produce test scripts
Put into the test execution schedule
Template
Terms
Test case specification
Test design technique
Traceability
Horizontal traceability
Vertical traceability
Test script
Test execution schedule
Experience-based
Equivalence partitioning
Boundary value analysis
Decision table testing
Decision table
State transition testing
State diagram
State table
Use case testing
Test coverage
Decision coverage
Fault attack (attack)
Exploratory testing
Categories
Static testing techniques
Dynamic testing techniques
Specification-based (black-box)
When a specification exists
Structure-based (white-box)
All test levels
Experience-based
Time pressure
No specification
Good if tester is professional
SPECIFICATION-BASED OR BLACK-BOX TECHNIQUES
Equivalence partitioning (see the sketch after this list)
Boundary value analysis
Decision table testing
State transition testing
Use case testing
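A minimal sketch of the first two techniques, assuming a hypothetical input field that accepts integers from 1 to 100 (the field, its range and the accepts() helper are illustrative, not from the syllabus):

```python
# Equivalence partitioning: split the input domain into partitions that
# the system should treat the same way; one value per partition suffices.
partitions = {
    "below range (invalid)": 0,    # any value < 1
    "in range (valid)":      50,   # any value in 1..100
    "above range (invalid)": 101,  # any value > 100
}

# Boundary value analysis: pick values at and next to each boundary,
# where off-by-one defects tend to cluster.
boundary_values = [0, 1, 2, 99, 100, 101]

def accepts(value: int) -> bool:
    """Assumed behaviour under test: accept integers from 1 to 100."""
    return 1 <= value <= 100

for v in sorted(set(partitions.values()) | set(boundary_values)):
    print(f"input={v:>4} -> accepted={accepts(v)}")
```

Any one value is assumed to represent its whole partition, which is what keeps the test set small.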
STRUCTURE-BASED OR WHITE-BOX TECHNIQUES
Measure coverage and design tests
Test coverage
Statement coverage and statement testing
Decision coverage and decision testing
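A small sketch of the difference between the two coverage levels, using a hypothetical apply_discount function (the function and the flat member discount are assumptions for illustration):

```python
def apply_discount(price: float, is_member: bool) -> float:
    # A single decision with no else branch.
    if is_member:
        price = price - 10.0
    return price

# Statement coverage: one test with is_member=True executes every
# statement (100% statement coverage) but never takes the False
# outcome of the decision.
assert apply_discount(100.0, True) == 90.0

# Decision coverage: every decision must take both outcomes, so a
# second test with is_member=False is also required.
assert apply_discount(100.0, False) == 100.0
```

100% decision coverage guarantees 100% statement coverage, but not the other way around.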
EXPERIENCE-BASED TECHNIQUES
Error guessing and fault attacks
Exploratory testing
CHOOSING TEST TECHNIQUES
Factors
Models used
Tester knowledge/experience
Likely defects
Test objective
Documentation
Life cycle model
Risk
Customer/contractual requirements
Type of system
Regulatory requirements
Time and budget
CHAPTER 5: Test Management
TEST ORGANIZATION
Independent and integrated
Independent
Testing only
No stake in the development outcome
Not embedded with the development team
Integrated
Works in a team with developers
Test leader
planning
monitoring
control
devise test objectives, test strategies and test plans
Tester
contribute to test plans
analyze, review and assess requirements and design specifications
involved in test design
set up the test environment
execute tests and log the results
automate tests and monitor test execution
Skills
Application or business domain
Technology
Testing
Terms
Test management
Test manager (test leader)
Failure rate
Defect density
Configuration management
Configuration control (version control)
Product risk
Risk-based testing
Project risks
Incident management
Incident logging
Defect report
Defect detection percentage
Incident report
Priority
Severity
Root cause
TEST PLANNING AND ESTIMATION
The purpose and substance of test plans
Guides our thinking
Helps us remember the important challenges
A vehicle for communicating with other team members
Helps manage change
Serves as a baseline
Planning test
Template
Questions
Test strategies
Entry criteria
Exit criteria
What is involved and how much it costs
Estimation techniques
Consult the people who will do the work and other experts
Analyze data from past projects and other available metrics
Factors affecting test effort
Document
Complexity
Importance
Size and rate of change
Tools
Time pressure
Dev life cycle
People
Approaches and strategies
Analytical
Risk-based
Requirement-based
Model-based
Diagram
Procedure
Methodical
checklist
pre-planned
Process- or standard-compliant
Dynamic
Consultative or directed
Regression-averse
Automated
Success factors
Risks
Skills
Objectives
Regulations
Product
Business
TEST PROGRESS MONITORING AND CONTROL
Monitoring the progress of test activities
feedback
Results
Measure
Data for measure test efforts
Test log template
Metrics
completion (see the worked example after this list)
test coverage
status
economics of testing
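As an illustration with made-up numbers: if 180 of 200 planned test cases have been executed, completion stands at 180/200 = 90%; if 45 of 50 identified test conditions have been exercised, test coverage is likewise 90%.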
Reporting test status
Test objective
Approaches
Effectiveness
Test control
Test summary report template
CONFIGURATION MANAGEMENT
Version
Changes
Tracking
TEST ITEM TRANSMITTAL REPORT TEMPLATE
RISK AND TESTING
Risks and levels of risk
Product risks/quality risks
Risk-based testing
Early
Reduce risk
Find likelihood
Priority/impact
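The level of a product risk is typically assessed as a combination of the likelihood of the failure occurring and its impact if it does; a likely, high-impact failure mode is tested earliest and most thoroughly.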
Project risks
Lack of staff
Tight deadlines
Changing requirements
Problems with technical solutions
Mitigate
Contingency
Transfer
Ignore
risk management
Assess
Analyze
Test
Manage
Report
INCIDENT MANAGEMENT
Incident reports
Incident logging / defect reporting
DDP (defect detection percentage; see the worked example after this list)
report event
priority
severity
root cause
Template
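A worked DDP example with illustrative numbers: if the test team finds 90 defects and users report another 10 after release, DDP = 90 / (90 + 10) = 90%.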
Life cycle
Reported
Opened
Rejected
Deferred
Reopened
Assigned
Fixed
Closed
CHAPTER 2: Testing throughout the software life cycle
Software development models
V-Model
4 levels
Component testing
Integration testing
System testing
Acceptance testing
Early testing principle
Parallel with development activities
Iterative life cycles
Many smaller, self-contained life cycle phases
New functionality added in each phase
Regression testing and integration testing become more important with each phase
RAD, RUP, Agile (Scrum), prototyping, ...
Rapid Application Development
Mini components
Fast delivery
Rapid response and feedback
Regular changes and updates
Agile development
Scrum, Extreme Programming (XP)
User/business stories instead of detailed requirements
Form new representatives (teams) each iteration
Changes and growth are welcome
Code sharing between teams and close inclusion of testers
Test-Driven Development
Testing is important and regular, but in short cycles
Many benefits and also challenges
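A minimal sketch of the test-first (red-green-refactor) cycle; the add function and its test are hypothetical:

```python
# Step 1 (red): write a small failing test before any production code.
def test_add():
    assert add(2, 3) == 5  # would fail while add() does not yet exist

# Step 2 (green): write just enough code to make the test pass.
def add(a: int, b: int) -> int:
    return a + b

# Step 3 (refactor): clean up while keeping the test green, then repeat
# the cycle with the next small piece of behaviour.
test_add()
```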
Testing within a life cycle model
For every development activity there is a corresponding testing activity
Each test level has test objectives specific to that level
Analysis and design of tests should happen in parallel with development
Testers should be involved in reviewing documents early
Terms
Verification
Validation
V-Model
Test level
COTS (commercial off-the-shelf)
Incremental development model
Iterative development model
Agile software development
Agile manifesto
Component testing
Stub
Driver (test driver)
Test-driven development
Integration testing
System testing
Test environment (test bed)
Acceptance testing
Maintenance
Alpha testing
Beta testing (field testing)
Test type
Functional testing
Black-box (specification-based) testing
Functional testing types
Performance testing
Load testing
Stress testing
Usability testing
Maintainability testing
Reliability testing
Portability testing
Characteristics
Black-box test design techniques
White-box testing
Code coverage
White-box test design techniques
Maintenance testing
Impact analysis
Test levels
Component testing
Tests individual units, modules, ...
Usually done by the programmer
Defects are found and fixed early
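A minimal sketch of how a test driver and a stub support component testing; the checkout component and the GatewayStub stand-in are hypothetical:

```python
# Component under test: depends on a payment gateway that is not yet
# available (or not wanted) at component-test time.
def checkout(total: float, gateway) -> str:
    return "paid" if gateway.charge(total) else "declined"

# Stub: a minimal stand-in for the called component, returning canned,
# predictable answers.
class GatewayStub:
    def charge(self, amount: float) -> bool:
        return amount <= 1000.0

# Driver: test code that invokes the component under test directly.
if __name__ == "__main__":
    assert checkout(500.0, GatewayStub()) == "paid"
    assert checkout(2000.0, GatewayStub()) == "declined"
    print("component tests passed")
```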
Integration testing
Interactions between components, objects
Interactions between systems
The cause of a failure can be hard to localize
Incremental testing
Top-down
Bottom-up
Functional incremental
System testing
Tests the integrated system
Checks whether it satisfies the requirements specification, functional specifications, use cases, ...
Both functional and non-functional; black-box and white-box
Acceptance testing
Before release or wider integration
By users and stakeholders
Checks that user and system requirements are satisfied
Contract and regulation acceptance testing
Alpha testing and Beta testing
Test types
Functional testing
Test behaviours
Black-box testing
Can be requirements-based
Can be business-process-based
Techniques: specification-based / experience-based
Non-functional testing
Testing of ‘how well’ the system works
Includes (among others):
Performance
Load
Stress
Usability
Maintainability
Reliability
Portability
Characteristics
Functionality
Reliability
Usability
Efficiency
Maintainability
Portability
Structural testing
White-box, Glass-box testing
Tests what is inside the component under test
Measures code coverage
Structure-based (white-box) test design techniques
Usually at lower levels, such as component and integration testing
Testing related to changes
Confirmation testing (re-testing): re-run tests that failed, to confirm the fix
Regression testing: re-run previously passing tests, to check that nothing else broke
Executed when the software changes
Maintenance testing
During the software maintenance life cycle
Often carried out when a modification, migration or retirement of software happens
Impact analysis and regression testing
Test the changes
Test the impact of changes
Triggers for maintenance testing
Modifications, migrations, retirement
Planned modifications
Perfective
Adaptive
Corrective
Ad-hoc corrective modifications
Require immediate solutions
Test approach varies in many different ways
Tests prepared for specific situations
CHAPTER 3: Static techniques
Static techniques and test process
Performed manually or with a set of tools
Without execution
Type of defects
Deviations from standards
Missing requirements
Design defects
Non-maintainable code
Inconsistent interface spec
Advantages
Early feedback
Low cost for rework
Productivity increases
Exchanging, interacting
Increase awareness of quality issues
Terms
Static testing
Dynamic testing
Informal review
Formal review
Moderator (inspection leader)
Entry criteria
Reviewer (inspector)
Metric
Scribe
Walkthrough
Technical review
Peer review
Inspection
Static analysis
Compiler
Control flow
Data flow
Review process
Formal
Phases
Planning
Steps
Defining the review criteria
Selecting the personnel
Allocating roles
Defining the entry and exit criteria for more formal review types
Selecting which parts of documents to review
Checking entry criteria (for more formal review types).
Kick-off
Steps
Distributing documents
Explaining the objectives, process and documents to the participants.
Kick-off meeting
Preparation
Steps
Preparing for the review meeting by reviewing the document(s)
Noting potential defects, questions and comments
Logging forms
Checklist
Annotated documents
Checking rate
Review meeting
Steps
Discussing or logging
Severity
Critical
Minor
Major
Noting, making recommendations, decisions
Examining, evaluating and recording issues
Rework
Steps
Recording updated status of defects (formal)
Fixing defects found (done by author)
Follow-up
Steps
Checking that defects have been addressed
Gathering metrics
Checking exit criteria (for more formal review types)
Informal
Role and responsibilities
Moderator
Reviewer
Author
Scribe
Manager
Types of review
Walkthrough
Step-by-step walkthrough
Led by the author
Attended by stakeholders
Technical review
Peer review
Focuses on technical content
Ensures that technical concepts are used correctly
Inspection
Most formal
Success factors for reviews
Find a ‘champion’
Pick things that really count
Pick the right techniques
Explicitly plan and track review activities
Train participants
Manage people issues
Follow the rules but keep it simple
Continuously improve process and tools
Report results
Use testers
Just do it!
STATIC ANALYSIS BY TOOLS
Compiler
Coding standards
Naming conventions
Layout
Model
Code metrics
Complexity
Cyclomatic complexity (see the sketch at the end of this section)
Code structure
Control flow
Data flow
Data structure
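As an illustration of one of these metrics: the cyclomatic complexity of a single function can be computed as the number of binary decisions plus one (equivalently, edges - nodes + 2 over the control flow graph). A hypothetical example:

```python
# Two binary decisions, so cyclomatic complexity is 2 + 1 = 3:
# three independent paths through the control flow graph, which is
# also the number a static analysis tool would report.
def classify(age: int, member: bool) -> str:
    label = "standard"
    if age < 18:   # decision 1
        label = "junior"
    if member:     # decision 2
        label += " (member)"
    return label
```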