How to get your Assessment Cooking with G.A.S.!


I was recently invited to write an article about developing effective assessments. In fact, I was asked to discuss how to ‘develop assessment tools that even an auditor would be proud of’. Now, it is great if an auditor loves your work, but ultimately the true test of a successful assessment tool is whether it is a good fit for the benchmark standards, the candidate/s and the workplace in which the competency is to be applied. Just because it looks good and ticks all the auditor’s boxes doesn’t necessarily make it right for all occasions.

According to the TAE10 Training Package v2.0 (p101), “There is no set format or process for the design, production or development of assessment tools”; however, there are key things that successful ones share. I have therefore sought to identify some of the best-practice features that support the development of robust assessment tools.

Essentially, there are six steps in the assessment design process:

Step 1: Familiarise yourself with benchmark standards
Step 2: Identify evidence requirements
Step 3: Select appropriate assessment methods
Step 4: Develop assessment tool/s
Step 5: Collect evidence
Step 6: Trial and refine tools

Traditional assessment focused on the testing part rather than the learning part. In effect, we were trying to catch the student failing rather than capturing them in moments of success. These days, however, we concentrate much more on verifying success (competence) than confirming failure, and as such it is critical that we develop evidence-gathering tools that help us best support a candidate’s claim for competence.

Like all professionals, assessors need to understand the capacity of the tools they use and be able to adapt them to meet the particular requirements of the task and the expectations of the workplace in which it is performed. Rather than simply regurgitating the old mantra of ‘we have always done it this way’, they should continuously strive to enhance their tools to meet the changing needs of their unique settings.

So how does it all fit together? The relationship between the six steps in the process may look something like this (examples taken from the SFL10 unit of competency SFLDEC201A Assemble floristry products):

  • Benchmark (The Standards against which a candidate is assessed) e.g. Assemble hand tied flower and plant materials.
  • Evidence Requirements (The information that when matched to the benchmarks, show that a candidate is competent) e.g. Ability to construct multiple and diverse fundamental floristry products.
  • Assessment Methods (Techniques used to gather different types of evidence) e.g. Observation of actual performance, production of items
  • Assessment Tools (The instruments and instructions for gathering and interpreting evidence) e.g. Observation checklist and review of portfolios of evidence and third-party workplace reports of on-the-job performance by the candidate.
  • Evidence Produced (The information on which the assessment judgement is made) e.g. Demonstration of multiple hand tied flower arrangements
  • Trialing and Refining (Checking fitness for purpose and seeking improvement options) e.g. Trialing your tools before they are used formally with candidates helps make the format more user-friendly and the instructions clearer.

So what is the G.A.S. all about? Put simply, Gathering appropriate evidence Assures Success! If you continue to follow the six simple steps of assessment tool design, your assessment practice will be more effective, you will gather better evidence, outcomes in the workplace will be more consistent, and you will be cooking with gas as an assessor!

Marc Ratcliffe
CEO, MRWED Group
Follow me on Twitter: @MRWED_CEO