Developing Assessment Tools, Guide for RTO, 2015 - Coggle Diagram
Developing Assessment Tools
Introduction
Assessment
It means the process of collecting evidence and making judgements on whether competency has been achieved, to confirm that an individual can perform to the standard required in the workplace, as specified in a training package or a vocational education and training (VET) accredited course.
An Assessment System
It is a coordinated set of documented policies and procedures (including assessment materials and tools) that ensure assessments are consistent and are based on the Principles of Assessment and the Rules of Evidence.
An Assessment Tool
tasks to be administered to the student
an outline of evidence to be gathered from the candidate
evidence criteria used to judge the quality of performance
context and conditions of assessment
Developing Assessment Tools
Planning
Design and Development
Quality Checks
Developing Assessment Tools
Step 2: Design & Development
1. Context and conditions of assessment
Clarify the target group and purpose of the tools
Characteristics of the learner cohort
Other learner needs that require reasonable adjustment
Conditions
Equipment or material requirements
Contingencies
Specifications
Physical conditions
Relationships with team members and supervisors
Relationships with clients/customers
Timeframes for completion.
2. Tasks to be administered to the student
Outline the task(s) through which a learner can demonstrate competency
The task information should clarify whether assessment is conducted in real time or in a simulated environment. The learner needs to understand the tasks clearly
Prompt the learner to say, do, write or create something.
3. An outline of the evidence to be gathered from the candidate
Explain to learners what evidence they need to provide in response to the tasks.
What to include as evidence?
How to submit the evidence?
How to present the evidence?
Observation of behaviour/skill
Observation checklist
4. Evidence criteria used to judge the quality of performance
Assessment decision-making rules
Make judgements about whether competency has been achieved
Evidence criteria
Checking evidence quality (i.e. the rules of evidence)
Judging how well the learner performed according to the standard expected, and
Collating evidence from multiple sources to make an overall judgement
5. Administration, recording and reporting requirements
Retain sufficient data to be able to reissue AQF certification documentation for a period of 30 years.
Learners need to be informed of the administration, recording and reporting requirements related to that assessment tool.
Ensure that the retained evidence has enough detail to demonstrate how the judgements were made.
Each assessment tool should require an assessor to provide feedback to the learner. (This demonstrates fairness and the ability to justify judgements.)
Step 3: Quality Checks
Industry Consultation
Clarity
Content accuracy
Relevance
Appropriateness of language
Other trainers and assessors
Address all requirements
Appropriate level of difficulty
Effective collection of evidence
Clear instructions
Trial run
Cost-effectiveness
Engaging
Valid and reliable evidence
Step 1: Planning
Consider how a learner will
Demonstrate the task
Know what they need to do to complete task, and why
Demonstrate the ability to perform the task in different contexts and environments
Component
To identify requirements a learner needs to demonstrate competency
Performance evidence
Knowledge evidence
Performance Criteria
Assessment conditions
Elements
May require multiple and varied assessment methods
Industry Consultation
Assists in identifying appropriate assessment methods
Ensures assessment tools align with current industry methods, technologies, products and performance expectations
Assessment methods
Consider
Learner cohort
Learner's individual needs
General needs of cohort
Portfolio as evidence for learners who are upskilling/employed
Who?
Collect evidence
Guides the instructions that must accompany the assessment task
Remember! Assessor determines competence, not evidence collector
E.g. Teachers collect tests, students compile portfolio, reflection
Where?
Conduct assessment
Based on requirements of training/course
At workplace, at labs, simulated conditions, etc.
Types
Direct observation
Product-based method
Portfolio
Questioning
Third-party evidence
Summary
ensure the assessment is valid, reliable and flexible.
understand the capacity of the tools they use and adapt these tools to meet their requirements.
Design and development—How does each component of an assessment tool come together?
Quality Checks—How to review a tool prior to implementation?
Planning—What are the assessment requirements of the training package / accredited course? What does feedback from industry identify? What assessment methods are most appropriate for your learners?
Guide for RTO, 2015
Clause 1.8
Principles of assessment
Fairness
Flexibility
Validity
Reliability
Rules of evidence
Validity
Sufficiency
Authenticity
Currency
Clause 1.11
Independent validator
Industry expert
Clause 1.9
Ongoing systematic validation
when validation occurs
which training products are the focus of validation
who will lead and participate in validation
how the outcomes of validation are documented and will be acted upon
Clause 1.10
Frequency of validation exercise
Each training product validated at least once every five years
At least 50% of products validated within the first three years of each cycle
Consider relative risk of training products
Clause 1.12
Recognition of prior learning (RPL)