
Developing a tool to map assessment literacy

Thursday 4 July: Conference day two, 4:45pm – 5:15pm parallel session

 

Venue

Room 9 – 3030-G23 MLT1

 

Presenters

Dr Nirma Samarawickrema
Monash University, Australia
nirma.samarawickrema@monash.edu

Ester Villvanathan
Monash University, Australia

 

Background/context

Assessment literacy is defined as students’ understanding of the purpose and process of assessment, together with the ability to judge and evaluate their responses to assessments, identify strengths and weaknesses, and thereby devise strategies to improve their work [1]. Students’ engagement with activities that foster assessment literacy enhances their learning potential [2]. Monash University’s assessment framework [3] has three overlapping domains:

  1. knowledge and understanding;
  2. skills and competencies; and
  3. attributes and professionalism.

This ensures that students develop and demonstrate Monash Graduate Attributes necessary for work and lifelong learning.

In this ongoing study, we investigate how Monash’s assessment framework is evidenced in our courses through an evaluation of course materials and student and staff perspectives. This presentation demonstrates a tool we developed for this purpose.

 

Initiative

Our tool maps the assessments of core units in several of the university’s courses to determine the extent to which they evidence the Monash Graduate Attributes. The process has two parts. First, does a task satisfy the key aspects of an assessment: does it align with learning outcomes, evidence their achievement, describe the process comprehensibly, state clear assessment criteria, and provide relevant feedback to improve performance? Second, does the task fall within Monash’s assessment framework of the three overlapping domains above and its three themes: assessment for preparation, for learning and for demonstration?
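As a minimal sketch of how such a two-part mapping record might be kept, the snippet below uses hypothetical names (AssessmentTask, KEY_ASPECTS, DOMAINS, THEMES) that are not part of the actual tool; it simply illustrates the two questions the process asks of each task.

    # Illustrative sketch only: hypothetical names, not the Monash tool itself.
    from dataclasses import dataclass, field

    KEY_ASPECTS = {
        "aligns with learning outcomes",
        "evidences achievement of outcomes",
        "describes the process comprehensibly",
        "states clear assessment criteria",
        "provides feedback to improve performance",
    }

    DOMAINS = {"knowledge and understanding",
               "skills and competencies",
               "attributes and professionalism"}

    THEMES = {"preparation", "learning", "demonstration"}

    @dataclass
    class AssessmentTask:
        name: str                                        # e.g. "Week 6 quiz"
        aspects_met: set = field(default_factory=set)    # subset of KEY_ASPECTS
        domains: set = field(default_factory=set)        # subset of DOMAINS
        themes: set = field(default_factory=set)         # subset of THEMES

        def satisfies_key_aspects(self) -> bool:
            """Part one: does the task meet every key aspect of an assessment?"""
            return KEY_ASPECTS <= self.aspects_met

        def within_framework(self) -> bool:
            """Part two: is the task located in at least one domain and one theme?"""
            return bool(self.domains) and bool(self.themes)

    quiz = AssessmentTask(
        name="Week 6 quiz",
        aspects_met=set(KEY_ASPECTS),
        domains={"knowledge and understanding"},
        themes={"learning"},
    )
    print(quiz.satisfies_key_aspects(), quiz.within_framework())  # True True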

 

Intended outcome

The tool was systematically trialled in one course by applying it to all assessments in all of its core units. Surveys and interviews were conducted to explore how teachers fostered, and students perceived, assessment literacy in the same course. Emerging data suggest the tool is effective in mapping the developing sophistication of assessments across advancing AQF levels. It highlighted the range of assessments in the course and identified gaps and areas for improvement. The tool revealed that quizzes, which favour assessing content knowledge, were the most common assessments, while assessments that engage students in making evaluative judgements (self- and peer-assessment) were fewer, highlighting areas that need strengthening.

 

References

1. Nicol, D. 2009. “Assessment for learner self-regulation: Enhancing achievement in the first year using learning technologies.” Assessment & Evaluation in Higher Education 34(3): 335–352. https://www.tandfonline.com/doi/abs/10.1080/02602930802255139
2. Smith, C.D., K. Worsfold, L. Davies, R. Fisher and R. McPhail. 2013. “Assessment literacy and student learning: The case for explicitly developing students’ assessment literacy.” Assessment & Evaluation in Higher Education 38(1): 44–60. https://www.tandfonline.com/doi/abs/10.1080/02602938.2011.598636
3. Monash University. Monash Assessment Vision: Better Teaching, Better Learning. http://www.intranet.monash/learningandteaching/learningandteachingquality/assessment-vision

 

Presentation topic

Students – Future Graduates
