ISTQB Advanced Test Analyst
- Testing Process (300 min) ❗
- Test Management: Responsibilities for the Test Analyst (90 min) 🔸
- Test Techniques (825 min) ‼
- Testing Software Quality Characteristics (120 min) ⚠
- Reviews (165 min) ⚠
- Defect Management (120 min) ⚠
- Test Tools (45 min) 🔸
1.2 Testing in the Software Development Lifecycle
TA-1.2.1 (K2) Explain how and why the timing and level of involvement for the Test Analyst varies when working with different lifecycle models
1.3 Test Monitoring, Planning and Control
1.4 Test Analysis
1.5 Test Design
1.6 Test Implementation
1.7 Test Execution
1.8 Evaluating Exit Criteria and Reporting
1.9 Test Closure Activities
TA-1.3.1 (K2) Summarize the activities performed by the Test Analyst in support of planning and controlling the testing
TA-1.4.1 (K4) Analyze a given scenario, including a project description and lifecycle model, to determine appropriate tasks for the Test Analyst during the analysis and design phases
TA-1.5.1 (K2) Explain why test conditions should be understood by the stakeholders
TA-1.5.2 (K4) Analyze a project scenario to determine the most appropriate use for low-level (concrete) and high-level (logical) test cases
TA-1.6.1 (K2) Describe the typical exit criteria for test analysis and test design and explain how meeting those criteria affects the test implementation effort
TA-1.7.1 (K3) For a given scenario, determine the steps and considerations that should be taken when executing tests
TA-1.8.1 (K2) Explain why accurate test case execution status information is important
TA-1.9.1 (K2) Provide examples of work products that should be delivered by the Test Analyst during test closure activities
2.2 Test Progress Monitoring and Control
2.3 Distributed, Outsourced and Insourced Testing
2.4 The Test Analyst's Tasks in Risk-Based Testing
TA-2.2.1 (K2) Explain the types of information that must be tracked during testing to enable adequate monitoring and controlling of the project
TA-2.3.1 (K2) Provide examples of good communication practices when working in a 24-hour testing environment
TA-2.4.1 (K3) For a given project situation, participate in risk identification, perform risk assessment and propose appropriate risk mitigation
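As an illustration of TA-2.4.1, the sketch below shows one common way of expressing a risk assessment: each identified product risk is rated for likelihood and impact, and their product gives a risk level used to prioritize and sequence testing. The risk items, the 1-5 scales and the mitigation notes are hypothetical assumptions, not prescribed by the syllabus.

```python
# A minimal sketch of risk assessment for TA-2.4.1. Risk level is computed
# here as likelihood x impact on 1-5 scales; the risk items and ratings are
# hypothetical examples.
from dataclasses import dataclass


@dataclass
class ProductRisk:
    description: str
    likelihood: int  # 1 (unlikely) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (severe)
    mitigation: str

    @property
    def level(self) -> int:
        return self.likelihood * self.impact


risks = [
    ProductRisk("Incorrect interest calculation", 3, 5,
                "Early, thorough boundary value and decision table testing"),
    ProductRisk("Slow report generation", 4, 2,
                "Add performance checks to the regression suite"),
    ProductRisk("Help text missing on new screens", 2, 1,
                "Cover via checklist-based review"),
]

# Highest risk level is addressed first (risk-based test prioritization).
for risk in sorted(risks, key=lambda r: r.level, reverse=True):
    print(f"level {risk.level:2d}: {risk.description} -> {risk.mitigation}")
```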
3.2 Specification-Based Techniques
3.3 Defect-Based Techniques
3.4 Experience-Based Techniques
TA-3.2.1 (K2) Explain the use of cause-effect graphs
TA-3.2.2 (K3) Write test cases from a given specification item by applying the equivalence partitioning test design technique to achieve a defined level of coverage
TA-3.2.3 (K3) Write test cases from a given specification item by applying the boundary value analysis test design technique to achieve a defined level of coverage (see the sketch after this list of objectives)
TA-3.2.4 (K3) Write test cases from a given specification item by applying the decision table test design technique to achieve a defined level of coverage
TA-3.2.5 (K3) Write test cases from a given specification item by applying the state transition test design technique to achieve a defined level of coverage
TA-3.2.6 (K3) Write test cases from a given specification item by applying the pairwise test design technique to achieve a defined level of coverage
TA-3.2.7 (K3) Write test cases from a given specification item by applying the classification tree test design technique to achieve a defined level of coverage
TA-3.2.8 (K3) Write test cases from a given specification item by applying the use case test design technique to achieve a defined level of coverage
TA-3.2.9 (K2) Explain how user stories are used to guide testing in an Agile project
TA-3.2.10 (K3) Write test cases from a given specification item by applying the domain analysis test design technique to achieve a defined level of coverage
TA-3.2.11 (K4) Analyze a system, or its requirement specification, in order to determine likely types of defects to be found and select the appropriate specification-based technique(s)
TA-3.3.1 (K2) Describe the application of defect-based testing techniques and differentiate their use from specification-based techniques
TA-3.3.2 (K4) Analyze a given defect taxonomy for applicability in a given situation using criteria for a good taxonomy
TA-3.4.1 (K2) Explain the principles of experience-based techniques, and the benefits and drawbacks compared to specification-based and defect-based techniques
TA-3.4.2 (K3) For a given scenario, specify exploratory tests and explain how the results can be reported
TA-3.4.3 (K4) For a given project situation, determine which specification-based, defect-based or experience-based technique should be applied to achieve specific goals
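As a worked illustration of TA-3.2.2 and TA-3.2.3, the sketch below derives concrete test cases using equivalence partitioning and two-value boundary value analysis. The "valid ages are 18-65" rule and the accepts_age() function are hypothetical examples chosen for the sketch, not taken from the syllabus.

```python
# A minimal sketch for equivalence partitioning (TA-3.2.2) and boundary value
# analysis (TA-3.2.3). The age rule and accepts_age() are hypothetical.
import pytest


def accepts_age(age: int) -> bool:
    """Hypothetical system under test: accept applicants aged 18-65."""
    return 18 <= age <= 65


# Equivalence partitioning: one representative value per partition.
# Partitions: below 18 (invalid), 18-65 (valid), above 65 (invalid).
EP_CASES = [(10, False), (40, True), (70, False)]

# Two-value boundary value analysis: values on and just outside each boundary.
BVA_CASES = [(17, False), (18, True), (65, True), (66, False)]


@pytest.mark.parametrize("age,expected", EP_CASES + BVA_CASES)
def test_age_acceptance(age, expected):
    # Each tuple is one concrete (low-level) test case derived from the rule.
    assert accepts_age(age) is expected
```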
4.2 Quality Characteristics for Business Domain Testing
TA-4.2.1 (K2) Explain by example which testing techniques are appropriate to test the accuracy, suitability, interoperability and compliance characteristics
TA-4.2.2 (K2) For accuracy, suitability and interoperability characteristics, define the typical defects to be targeted
TA-4.2.3 (K2) For the accuracy, suitability and interoperability characteristics, define when the characteristics should be tested in the lifecycle
TA-4.2.4 (K4) For a given project context, outline the approaches that would be suitable to verify and validate both the implementation of the usability requirements and the fulfilment of the user's expectations
5.1 Introduction
TA-5.1.1 (K2) Explain why review preparation is important for the Test Analyst
5.2 Using Checklists in Reviews
TA-5.2.1 (K4) Analyze a use case or user interface and identify problems according to checklist information provided in the syllabus
TA-5.2.2 (K4) Analyze a requirements specification or user story and identify problems according to checklist information provided in the syllabus
6.2 When Can a Defect be Detected?
TA-6.2.1 (K2) Explain how phase containment can reduce costs
6.3 Defect Report Fields
TA-6.3.1 (K2) Explain the information that may be needed when documenting a non-functional defect
6.4 Defect Classification
TA-6.4.1 (K4) Identify, gather and record classification information for a given defect
6.5 Root Cause Analysis
TA-6.5.1 (K2) Explain the purpose of root cause analysis
7.2 Test Tools and Automation
TA-7.2.1 (K2) Explain the benefits of using test data preparation tools, test design tools and test execution tools
TA-7.2.2 (K2) Explain the Test Analyst's role in keyword-driven automation (sketched below)
TA-7.2.3 (K2) Explain the steps for troubleshooting an automated test execution failure
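As an illustration of TA-7.2.2, the sketch below shows the division of work in keyword-driven automation: the Test Analyst specifies a test as a table of keywords and arguments, while mapping each keyword onto executable code is automation work. The Calculator class, the keyword names and the test table are illustrative assumptions; real frameworks such as Robot Framework provide this mechanism in full.

```python
# A minimal sketch of keyword-driven automation for TA-7.2.2. The keywords,
# the Calculator class and the test table are hypothetical examples.


class Calculator:
    """Hypothetical system under test."""

    def __init__(self):
        self.value = 0

    def add(self, n):
        self.value += n

    def clear(self):
        self.value = 0


def _verify(actual, expected):
    assert actual == expected, f"expected {expected}, got {actual}"


def run_keyword_test(steps):
    """Execute a test expressed as (keyword, argument) rows.

    The Test Analyst writes the rows; the mapping of each keyword onto
    executable code (the dict below) is test automation work.
    """
    sut = Calculator()
    keywords = {
        "enter_number": lambda n: sut.add(int(n)),
        "clear": lambda _: sut.clear(),
        "verify_total": lambda n: _verify(sut.value, int(n)),
    }
    for keyword, arg in steps:
        keywords[keyword](arg)


# Keyword table as a Test Analyst might specify it (no coding knowledge needed).
run_keyword_test([
    ("enter_number", "5"),
    ("enter_number", "7"),
    ("verify_total", "12"),
    ("clear", None),
    ("verify_total", "0"),
])
```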