Exams as we know them are outdated, but what should replace them?

By Mathew Hillier, Senior Lecturer, Monash Education Academy, Monash University.

Some argue that we should do away with examinations as a means of assessing student capability. This argument has some merit. When was the last time you faced a page of multiple choice questions as your job for the day, or had to present complex information in a long handwritten response whilst locked in a room cut off from the outside world?

Most people now make use of information, computing and communications technologies when exploring, solving problems, building, creating and presenting findings. Students regularly use computers to construct their responses to assignment and project work, and Internet technology is now deeply embedded in social and family life. Most professional employment contexts now expect sophisticated software tools and data sources to be used as part of the complex problem-solving tool kit. We can therefore argue that a productive member of society, and a competent practitioner engaged in problem solving, must be able to deploy a range of digital literacies and a high degree of application, analysis, synthesis and evaluation of multiple data and information sources. Thus the paper-based exam appears increasingly at odds with the modus operandi of the 21st century.

We must ask: can institutions continue to claim that they are accrediting graduates as fit for practice in the 21st century when a large portion of their summative assessment involves students sitting in isolation, feverishly scribbling away on paper? We have observed that this realisation is increasingly in the minds of university leaders, but uncertainty remains as to what to do about it.

The use of alternative assessment modalities and techniques is certainly worth considering. Multiple factors need to be weighed if the alternatives are to be acceptable and doable within the constraints and expectations of contemporary higher education. Three key concerns of pedagogic efficacy/authenticity, integrity (security) of the assessment process, and scalability/affordability have often been in conflict when it comes to assessment design.

Authentic learning and assessment have been deployed in work integrated learning in a manner that also carries a relatively high degree of assurance (i.e. identity verification and contribution attribution); however, we run into practical limits when scaling these experiences to a large number of students. On-campus laboratories are another example where authenticity can be high but scalability limited. Complex project work carried out over an extended period beyond the classroom can be designed to exercise higher order capabilities and include a good dose of authenticity. Unsupervised assessments are also reasonably scalable (at least more so than work integrated learning experiences) and so are frequently used in higher education. However, because the work is carried out unsupervised, there is a risk that the task may have been ‘outsourced’ to someone else, threatening the integrity of the education system.

The traditional response has been to set a written, time limited, invigilated examination to balance the unsupervised portion. Well run examinations are generally regarded by most stakeholders as having a high degree of process integrity and are scalable to large numbers of students, with universities in Australasia each running in the region of 100,000 to 300,000 exam sittings annually (Roach, 2017). Yet we return to the earlier argument: examinations as we tend to know them, as pen-on-paper affairs, are increasingly divergent from contemporary work and life experience. It would therefore appear that assessment designers are currently stuck with two out of three with respect to authenticity, integrity and scalability.

To counter an abolitionist stance, there are a number of reasons why the invigilated examination is not going away any time soon. These include social and governmental expectations, the need to rank students and meet accreditation requirements, and the desire to test a student’s mettle under pressure, to ensure that they ‘know their stuff’ and are not just paraphrasing the result of a web search. The relatively high degree of control and assurance that the student did the work is an important consideration in light of increased awareness of contract cheating. Exams also provide the ability to assess a large number of students in a short time at the end of study modules. So it looks like we are probably stuck with exams for some time to come. Just as we have made great strides in improving the student learning experience in many areas of education, we can certainly do better in the exam room. We argue that a more pragmatic approach is to rethink what is possible within the exam room context given the availability of modern ICTs.

Enter the ‘Transforming Exams’ project, funded by the Australian Government Department of Education and Training. A primary motivation has been enabling authentic assessment within the invigilated exam room context. We argue that for technology to be deployed in the exam room, it must enable pedagogical progress, ultimately meaning the redefinition (Puentedura, 2006) of assessment tasks towards targeting higher order thinking (Krathwohl, 2002), rather than merely replicating paper-based question formats in digital form. This means providing a holistic digital ‘authentic assessment’ (Crisp, 2009) environment to all candidates, one that enables complex tasks to be set and allows students to demonstrate their capabilities via complex, constructed responses using contemporary e-tools of the trade. In our work we use the term ‘e-Exam’ (eExam) to refer specifically to a “timed, supervised, summative assessment conducted using each candidate’s own computer running a standardised operating system”. This contrasts with many existing computerised testing systems that use single applications or web pages providing a limited ‘form’-based environment for questions and responses. Such approaches add little to the design of exam-based assessments because the rich affordances of complex software as mind tools (Jonassen, 1991) for problem solving are not available to task designers or students. We argue that providing a whole system environment with a range of sophisticated software applications opens up the ‘pedagogical landscape’ of the exam room.

Our authenticity-first approach has not ignored assessment integrity, reliability and scalability, but these have been built around the core ideal of authenticity. We have previously set out a set of requirements for an approach to e-Exams, presented at conferences and other forums since around 2013 (e.g. Hillier & Fluck, 2013). Our work in the intervening years has sought to deliver on these requirements, informed by key stakeholder groups including students, academics and examination administrators, and this has resulted in what we believe is a very different approach.

We have indeed observed that education institutions are increasingly cognisant that there must be change in the exam room, but the question still remains for many: how to move forward? Mistakes made now could lock an institution into a pedagogic bind, rob it of access to rich streams of performance data and send it down a technological cul-de-sac. The choices made in how examinations are modernised will drive substantial investment over the medium term and will certainly establish the pedagogic framework for high-stakes assessment for years to come. To help institutions make an informed decision, the project team has established a national roadshow (now in progress) and will be hosting an e-Exams Symposium on Saturday 24 November 2018 in Melbourne, just prior to the ASCILITE conference.

Further information on our e-Exams work is available from our website: http://transformingexams.com.
Further information on the e-Exams Symposium and registration is available at: http://eexamsymposium.eventbrite.com.au

e-Exam Symposium: Saturday 24 Nov 2018, Melbourne

Discuss e-Exams and network with colleagues from across Australasia in this full-day symposium (Monash University, Caulfield campus), where we will examine research findings across a range of e-Exam issues, including the case for authentic e-exams, the pedagogy of authentic e-exams, students’ perspectives and experiences of doing e-exams, policy and equity dimensions, and the logistics and technology of enabling authentic e-exams.

Prof Geoff Crisp (PVC Education, UNSW) will chair the proceedings. Geoff is well known for his work on authentic e-assessment in higher education.

We have secured two international guest keynote speakers:

  • The Head of the e-Learning Service at Alpen-Adria University, Klagenfurt, Austria, a university with 40% uptake of e-exams.
  • A senior specialist in e-exams at the national Matriculation Examinations Board of Finland, speaking about the roll-out of their national e-exams project.

There is a small at-cost registration fee of AUD $195 for the symposium. Register now at: http://ta.vu/eexam_symposium_reg.

References

Crisp, G. (2009) Towards authentic e-assessment tasks. In Proceedings of EdMedia: World Conference on Educational Media and Technology (pp. 1585–1590). Honolulu, HI, USA. http://www.editlib.org/p/31689/

Hillier, M. & Fluck, A. (2013) Arguing again for e-exams in high stakes examinations. In H. Carter, M. Gosper, & J. Hedberg (Eds.), Electric Dreams (pp. 385–396). Macquarie University. Retrieved from http://www.ascilite.org.au/conferences/sydney13/program/papers/Hillier.pdf

Jonassen, D. H. (1991) What are cognitive tools? In P. A. M. Kommers, D. H. Jonassen, & J. T. Mayes (Eds.), Cognitive tools for learning (pp. 1–6). Berlin, Germany: Springer-Verlag. https://doi.org/10.1007/978-3-642-77222-1_1

Krathwohl, D. R. (2002) A Revision of Bloom’s Taxonomy: An Overview. Theory Into Practice, 41(4), 212–218. https://doi.org/10.1207/s15430421tip4104_2

Puentedura, R. R. (2006) Transformation, Technology, and Education, Hippasus. Retrieved February 26, 2018, from http://hippasus.com/resources/tte/

Roach, A. (2017) Exams Network 2017 Benchmarking Survey, Unpublished report.

 
