Drawing on the education, information technology and analytics literature, we argue that a combination of e-exams and learning analytics could be a powerful tool in enhancing our understanding of student behaviour, strategies and performance in examinations.
The rapid rise in the use of learning analytics has been highlighted by the work of Dawson and McWilliam (2008) and the recent report by Siemens, Dawson and Lynch (2013), which draws on the 'big data' available from a plethora of sources within the institution. The potential for insight gained by drawing in data from student information and learning management systems means that a great deal can be learnt about the way students engage with, and perform on, formative assessments during the semester. A wide range of digital learning activities is now available for formative assessment, including digital patients (Newby et al. 2011), virtual microscopy (Kumar et al. 2009), simulated conversations (Nelson & Dawson 2014), virtual history excursions (Matthews & Agutter 2014), simulated practicums (Gregory et al. 2013) and virtual immersive foreign language learning (Grant et al. 2013), all of which can provide data for analysis. However, the typical paper-based exam yields very few data points that can contribute to such analysis. Further, it is currently rare for students to receive comprehensive feedback on how they have performed in examinations beyond a 'pass or fail', and similarly very little analysis of the exam questions themselves is performed. E-exams offer a way to 'fill the data gap' with respect to our knowledge of student performance and behaviour in high stakes assessments.
The potential of combining learning analytics with computerised high stakes exams is currently an untapped resource for assessing, evidencing and evaluating graduate capabilities. The contemporary workplace and social sphere are characterised by the high availability of information and a sophisticated range of ICT tools. Contemporary ICT-enhanced formative learning activities do provide a means to approximate the 'real world'; however, we are still left with the problem of authentication. The paper-based, high stakes exam remains the primary means by which universities can be confident that the work submitted belongs to the student being assessed. However, a paper-based medium no longer reflects the problem solving environment that graduates will encounter in professional practice. This results in a disconnect between the set of attributes graduates will be expected to hold and our ability to validate and assess their capabilities. Further, the use of learning analytics in higher education is increasing, both as a means of understanding student learning behaviour, progress, engagement and success (or otherwise), and as a means of improving learning delivery. However, the paper-based exam represents a significant information gap with regard to the data required to complete a more comprehensive picture.