VETNexus has been doing a significant amount of assessment validation and review recently. This has been driven by a combination of factors: Registered Training Organisations (RTOs) needing to have validated 100% of their training products prior to 1 April 2020, others preparing for audits, and ongoing monitoring of assessment practices. We see a huge range of variance in levels of compliance and quality of assessment. At one end of the spectrum are validations where there is little to comment on other than that the requirements are fully met; at the other end, recommendation reports run to many pages. This has prompted me to put together a list of things to consider when developing quality, compliant assessment tools.
Consideration of the Australian Qualifications Framework (AQF)
Regardless of the ongoing debate about whether units of competency have an AQF level, we do need to consider the level of expectation assigned to the qualification we are delivering and assessing in relation to the type of evidence we are gathering. The AQF describes graduates at levels 1 and 2 as working with basic factual, technical and procedural knowledge, so we want to see that they can list the steps required to complete a task and then actually follow those steps. At higher qualification levels, listing the steps isn't going to be sufficient. For example, in the unit of competency TAEASS502 Design and develop assessment tools, knowledge of the principles of assessment isn't demonstrated by a 'list and describe' question – that is a copy-and-paste exercise from the learner guide or the Standards for RTOs 2015. What we are looking for is the ability to analyse information, such as examining a case study and identifying how the principles have been applied, or describing how the candidate is applying them in the development of their own tools.
Key point: think about the types of questions that are being asked at the various levels to ensure they are appropriate to the overall criteria for the qualification level.
Coverage of the unit requirements
What needs to be covered in the assessment? Easy – everything in the unit of competency and the assessment requirements. This is referenced in the Standards for RTOs 2015 in Clause 1.8, Table 1.8-1 under Validity:
- “assessment against the unit/s of competency and the associated assessment requirements covers the broad range of skills and knowledge that are essential to competent performance”
- “judgement of competence is based on evidence of learner performance that is aligned to the unit/s of competency and associated assessment requirements.”
These two dot points from the Standards therefore show us that we need to assess:
- Elements/Performance Criteria;
- Foundation Skills;
- Performance Evidence;
- Knowledge Evidence; and
- Assessment Conditions (which must be met).
While the Standards don't refer to requiring a mapping document, this tool serves two purposes:
- quality assurance for the RTO to know they have sufficiently covered requirements;
- evidence for any third party (such as an auditor or consultant) of how the requirements of the unit have been met, and an easy reference guide.
There are more advantages to quality mapping processes but that is an article in itself.
Key point: ensure you have a thorough mapping document and full coverage of all components of the unit of competency and associated assessment requirements.
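As an illustration only (the Standards don't prescribe any particular mapping format), a mapping document is essentially a coverage matrix from unit components to assessment tasks, and its quality-assurance purpose is catching requirements that nothing covers. The sketch below shows that idea in Python; all component and task names are hypothetical examples, not taken from any real unit.

```python
# Illustrative sketch only: a mapping document as a coverage matrix.
# Component and task names are hypothetical, not from any real unit.

# Each unit requirement mapped to the assessment task(s) claimed to cover it.
mapping = {
    "PC 1.1": ["Task 1"],
    "PC 1.2": ["Task 1", "Task 2"],
    "Foundation skill: oral communication": ["Task 2"],
    "Performance evidence item 1": ["Task 3"],
    "Knowledge evidence item 1": [],  # gap: nothing claims to cover this
}

# Quality-assurance check: flag any requirement with no assessment coverage.
gaps = [req for req, tasks in mapping.items() if not tasks]
if gaps:
    print("Unmapped requirements:", ", ".join(gaps))
else:
    print("All mapped requirements have at least one assessment task.")
```

In practice this matrix usually lives in a spreadsheet rather than code, but the check is the same: every row must point to at least one assessment task before the tool goes anywhere near a student.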
Industry relevance
A unit of competency uses generic language to describe the standards to be met to award competence. While some areas of units will need to be changed or updated to meet current industry requirements, the unit itself will not be specific, and it is up to the RTO to ensure industry consultation occurs around the content being delivered and the way the unit is assessed. For example, a unit may state that current technology used in the hospitality industry must be applied. The RTO must then consult with hospitality organisations about what they are using, ensure the RTO's equipment is up to standard, and train students in current practices.
Key point: ensure regular engagement with industry and incorporate suggestions where possible – ensure this is documented.
Clear instructions
Assessment tools need instructions for the student, instructions for the assessor, instructions for third parties and workplace supervisors, and instructions for the RTO (perhaps not in the tool itself but in the assessment system policies and procedures). Make it clear who has to do what – don't leave people guessing. Once an assessment has been developed, have someone who is not familiar with it look over it and provide feedback on how they interpret it. Also keep it concise and streamlined: have all the information about a task in one place for the students. They shouldn't have to refer to multiple places to find out what they need to do to complete a task.
Key point: make sure other people understand the instructions and keep it simple and easy to follow.
While there are lots more items to consider, this is enough for one article! Writing quality, compliant assessment tools isn't easy, and it takes a lot of practice. An important part of the process is getting feedback from others, and doing so before using the tools with students.