CTFL - Chapter 4: Test Analysis and Design
Categories
White-box
Know architecture, detailed design, internal structure, or the code of the test object
The test cases are
dependent
on how the software is designed
Structure-based technique
Experience-based
Stakeholder's experience of similar systems and general experience of testing
Depend heavily on the tester's skills
Black-box
Behavior-based technique
Concentrate on the
inputs
and
outputs
of the test object without reference to its internal structure
The test cases are
independent
of how the software is implemented, detected gaps between the requirements and the implementation of the requirements
Black-box Test Techniques
Boundary value analysis
Characteristics
Works hand-in-hand with equivalence partitioning technique
The minimum and maximum value of a partition are its boundary values
Types
2-value BVA
3-value BVA
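The difference between the two variants can be sketched in Python. This is a minimal illustration (the partition 20-50 is borrowed from the age example used later in these notes, and the helper names are made up):

```python
# Hypothetical helpers: which values each BVA variant selects for one
# valid partition [low, high].

def two_value_bva(low, high):
    """2-value BVA: each boundary plus its closest neighbour outside the partition."""
    return sorted({low - 1, low, high, high + 1})

def three_value_bva(low, high):
    """3-value BVA: each boundary plus both of its neighbours."""
    return sorted({low - 1, low, low + 1, high - 1, high, high + 1})

print(two_value_bva(20, 50))    # [19, 20, 50, 51]
print(three_value_bva(20, 50))  # [19, 20, 21, 49, 50, 51]
```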
Decision table testing
Characteristics
List all the input conditions at the top of the table and all the actions of the system at the bottom of the table
Combinatorial testing techniques
Coverage
The common minimum coverage standard for decision table testing is to have at least one test case per decision rule in the table
Measured as the number of decision rules tested by at least one test case, divided by the total number of decision rules, normally expressed as a percentage
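The coverage formula above can be sketched directly. The ATM rules here are invented for illustration; each dictionary entry stands for one column (decision rule) of the table:

```python
# Hypothetical decision table: conditions + resulting action per rule.
rules = {
    "R1": {"valid_card": True,  "correct_pin": True,  "action": "dispense_cash"},
    "R2": {"valid_card": True,  "correct_pin": False, "action": "reject_pin"},
    "R3": {"valid_card": False, "correct_pin": None,  "action": "reject_card"},
}

tested_rules = {"R1", "R3"}  # rules exercised by at least one test case

# Coverage = tested rules / total rules, as a percentage.
coverage = len(tested_rules & set(rules)) / len(rules) * 100
print(f"{coverage:.0f}%")  # 67%
```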
Equivalence partitioning
Minimum number of test cases = 1 (combined valid partitions) + number of invalid partitions. Ex: a password must have 1-6 characters, begin with A-Z, and contain only [A-Z, 0-9] -> minimum number of test cases is 1 + 4 =
5
Example: an age field accepts values in the range 20 to 50 -> Partitions: (age < 20); (20 <= age <= 50); (age > 50)
Characteristics
Each value must belong to
one and only one
equivalence partition
Valid/Invalid values should be accepted/rejected by the component or system
Invalid equivalence partitions should be tested
individually
Coverage
Is measured as
the number of equivalence partitions
tested by at least
one value
, divided by
the total number of identified equivalence partitions
, normally expressed as a percentage.
Example: Password test cases: AB36P (partitions 1, 4, 6), VERYLONG (partition 3) => Coverage = 4/7 x 100% = 57%
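The 57% figure follows directly from the formula. A small sketch (the partition numbering 1-7 is assumed, as in the password example):

```python
# Equivalence-partition coverage = partitions hit by at least one test
# value / total identified partitions, as a percentage.

total_partitions = 7
test_values = {
    "AB36P":    {1, 4, 6},  # hits partitions 1, 4 and 6
    "VERYLONG": {3},        # hits partition 3
}

covered = set()
for partitions in test_values.values():
    covered |= partitions

coverage = len(covered) / total_partitions * 100
print(f"{coverage:.0f}%")  # 57%
```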
State transition testing
Characteristics
Exhibit a different response depending on current conditions or previous history
Show the possible software states, how the software enters, exits, and transitions between states.
Coverage
Measured as the number of visited states divided by the total number of states & is expressed as a percentage
To achieve 100% all transitions coverage, test cases must exercise all the valid transitions & attempt to execute invalid transitions
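A toy state machine makes both coverage measures concrete. The login/lock-out states below are invented for illustration; invalid transitions are rejected, as the all-transitions criterion requires them to be attempted:

```python
# Hypothetical state machine: (current state, event) -> next state.
transitions = {
    ("logged_out", "login_ok"):   "logged_in",
    ("logged_out", "login_fail"): "locked",
    ("logged_in",  "logout"):     "logged_out",
}
ALL_STATES = {"logged_out", "logged_in", "locked"}

def run(events, state="logged_out"):
    visited, exercised = {state}, set()
    for event in events:
        key = (state, event)
        if key not in transitions:        # invalid transition: must be rejected
            raise ValueError(f"invalid transition {key}")
        exercised.add(key)
        state = transitions[key]
        visited.add(state)
    return visited, exercised

visited, exercised = run(["login_ok", "logout", "login_fail"])
print(len(visited) / len(ALL_STATES) * 100)        # all-states coverage, %
print(len(exercised) / len(transitions) * 100)     # all-transitions coverage, %
```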
White-box Test Techniques
Statement Testing and Coverage
Coverage
Test every "executable" statement; when all of them are exercised, we call this full or 100% statement coverage
The weakest white-box technique
Determined by the number of executable statements covered by executed test cases divided by the number of all executable statements in the code under test.
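A hand-instrumented sketch (not a real coverage tool) shows how the ratio is computed; the `grade` function and its three statements are invented for illustration:

```python
# Each executable statement records itself when it runs.
executed = set()

def grade(score):
    executed.add("if")           # statement 1: the decision itself
    if score >= 50:
        executed.add("pass")     # statement 2
        return "pass"
    executed.add("fail")         # statement 3
    return "fail"

grade(70)  # exercises only the score >= 50 path

total_statements = 3
coverage = len(executed) / total_statements * 100
print(f"{coverage:.0f}%")  # 67% - the "fail" statement never ran
```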
Branch Testing and Coverage
Characteristics
Test every possible branch in the control flow graph of a program
Coverage
Each IF statement has 2 branches representing true & false, so we multiply the total number of IF statements by 2 to get total number of branches
The number of all decision outcomes covered by test cases divided by the number of all possible decision outcomes in the code under test
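The same idea for decision outcomes, as a sketch with an invented one-decision function (1 IF x 2 outcomes = 2 outcomes in total):

```python
# Record which decision outcomes (True/False per IF) each test takes.
outcomes = set()

def can_withdraw(balance, amount):
    if amount <= balance:          # decision D1: outcomes True and False
        outcomes.add("D1:True")
        return True
    outcomes.add("D1:False")
    return False

total_outcomes = 2                 # 1 IF statement x 2 outcomes

can_withdraw(100, 30)              # True outcome only
print(len(outcomes) / total_outcomes * 100)   # 50.0

can_withdraw(100, 300)             # now the False outcome as well
print(len(outcomes) / total_outcomes * 100)   # 100.0
```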
The Value of Statement & Branch Testing
Statement coverage helps to find defects in code that were not exercised by other tests
Branch coverage helps to find defects in code where other tests have not taken both true and false outcomes
Statement testing may provide less coverage than branch testing
100% branch coverage guarantees 100% statement coverage, but not vice versa
The Value of White-box Testing
Used as a static testing technique (A dry run)
Review code that is not ready to run
Useful when requirements are poor but the tester has knowledge of the structure of the software
Coverage measurement technique
Experience-based Test Techniques
Exploratory testing
Characteristics
Learn
more about the component or system
at hand
and create tests for the areas that may
need more testing
Session-based testing
Relies on the
tester's experience
to test the software without going through the cycle of writing test conditions, test cases and test procedures
Test charter
contains
test objectives
to guide the testing, helping to focus on the most critical areas and on what kind of defects to look for
Benefit
Few or inadequate specifications
Significant time pressure
Checklist-based testing
Characteristics
Design, implement & execute tests to cover those test conditions found in a checklist
Can be generic or specialized
Generic checklists could be used for all types of software to verify any software or component properties
Specialized checklists also exist, e.g. for testing database applications or testing websites
Example
Checking for image uploading
Check for image uploading with different extensions such as JPEG or BMP
Checking for image uploading path
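The image-upload checklist above can be expressed as a small runner. The check functions are stubs invented for illustration; in practice each would drive the application under test:

```python
# Hypothetical specialized checklist for image uploading, as check
# functions executed in sequence.

def check_jpeg_upload():  return True   # stub: uploading sample.jpg succeeds
def check_bmp_upload():   return True   # stub: uploading sample.bmp succeeds
def check_upload_path():  return False  # stub: stored path is wrong

checklist = [
    ("upload image with JPEG extension", check_jpeg_upload),
    ("upload image with BMP extension",  check_bmp_upload),
    ("verify image upload path",         check_upload_path),
]

for name, check in checklist:
    print(f"{'PASS' if check() else 'FAIL'}: {name}")
```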
Error guessing
Characteristics
What types of mistakes do the developers tend to make?
Failures that have occurred in other applications. "Depends on the tester's experience" (Ex: a fault attack)
How has the application worked in the past?
Collaboration-based Test Approaches
Characteristics
Prevent people from making errors that lead to defects slipping into the test object, by communicating & working together as a team
Collaborative User Story Writing
Characteristics
Small-sized, understandable chunks of business functionality
Format: As a <Role>, I want <Functionality>, so that <Business Benefit>.
Written from the point of view of the user
Technique
INVEST
Negotiable
Discuss user stories with business representatives & make tradeoffs based on cost & function
Valuable
Clearly understood business benefits
Independent
Avoid dependencies between stories where possible. Ex of a dependency: withdrawing money from the ATM depends on the user story for logging in to the ATM.
Estimatable
Estimate the effort of a user story
Small
Easier to estimate & test than large ones
Testable
Important for tracking progress
3 C's
Conversation
Having a conversation - collaborating to define the requirements & understand the value
Explain how the software will be used
Confirmation
Confirm the story is done
To confirm a story as done, the defined acceptance criteria should be tested & shown to be satisfied
Agree on the acceptance criteria so that you know when you are done
Card
Identify requirement, expected development, test duration, the acceptance criteria for that story
Write down the requirement instead of a heavy-weight document
Acceptance Criteria
Characteristics
Criteria should be defined in collaboration between business representatives, developers, tester
A team considers a task finished when a set of acceptance criteria has been satisfied
Part of the user story
Uses
Reach consensus among the stakeholders
Describe both positive & negative scenarios
Define the scope of the user story
Serve as a basis for the user story acceptance testing
Allow accurate planning & estimation
Scope
Quality characteristics
Performance, reliability, usability, etc.
Business rules
Activities that can only be performed in the system under certain conditions defined by outside procedures & constraints
Scenarios
A sequence of steps or actions between an external actor & the system to accomplish a specific goal or business task
External interfaces
Descriptions of the connections between the system to be developed & the outside world. Ex: user interface, interface to other systems
Functional behavior
The externally observable behavior with user actions as input operating under certain configurations
Summary
Two common formats for documenting acceptance criteria
Rule-oriented
e.g. a bullet-point verification list
Scenario-oriented
e.g. the given/when/then format used in BDD
Acceptance Test-Driven Development (ATDD)
Characteristics
In this process, the user story is analyzed, discussed & written by developers, testers & business representatives collaboratively
Any incompleteness, ambiguities, or errors in the user story are fixed during this process
A
test-first approach
where test cases are created before implementing the user story
Test cases based on the acceptance criteria
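A minimal ATDD sketch, reusing the password rules from the equivalence-partitioning example earlier in these notes (the validator and its regex are assumptions for illustration): the acceptance criteria are written as executable checks first, and the implementation is then made to satisfy them.

```python
import re

# Implemented *after* the acceptance criteria below were agreed:
# 1-6 characters, begins with A-Z, contains only [A-Z, 0-9].
def is_valid_password(pw):
    return bool(re.fullmatch(r"[A-Z][A-Z0-9]{0,5}", pw))

# Acceptance criteria, written first by developers, testers & business
# representatives together (positive & negative scenarios):
assert is_valid_password("AB36P")         # valid: 5 chars, starts with A-Z
assert not is_valid_password("VERYLONG")  # invalid: more than 6 characters
assert not is_valid_password("1ABC")      # invalid: does not start with A-Z
assert not is_valid_password("AB#1")      # invalid: illegal character
print("all acceptance criteria satisfied")
```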