By Dr Danny Liu, SRES team (University of Sydney)
At the University of Sydney, two academics in the Faculty of Science started tinkering in 2012 with an alternative, practical approach to learning analytics. Now, in 2017, our learning analytics platform is used by over 300 academics and professional staff across 17 departments in over 100 units of study, reaching over 20,000 students, and is being piloted at four other Australian universities. Its spread has been driven mainly by academics seeing and hearing colleagues talk about its impact on their students and their teaching. What kind of learning analytics are academics so passionate about?
Mark Nichols recently wrote an insightful TELall blog article on keeping analytics 'do-able': "Start with the data you already have, and the support functions you already have; ensure support functions have usable access to data, and systematise data use." Our approach is exactly this, but we recognise that while teachers are best placed to know what data they want, they rarely have those data in a place and form where they can actually be used. Typically, the data used (or at least warehoused) for learning analytics include demographics, academic background, LMS clickstreams, library usage, and even WiFi logs.
But these data are typically neither what teachers need nor what they want when it comes to figuring out whether their students are engaging and succeeding in their course. Teachers want to know if their students are turning up, what their tutors think about them, how they are working in class, how they are performing in assessments, and what feedback they need. Armed with these data, teachers are yearning to provide personalised learning and support to their (often ballooning) cohorts, using data that they select, to target support and feedback as they see fit. Learning analytics was canonically defined as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs". Who better to understand and optimise learning than the teachers who design the environments and interact with their students at the chalkface?
At least that's what we've seen and heard from the teachers who are using our platform, the Student Relationship Engagement System (SRES). This mouthful of a name emphasises that this learning analytics platform is about empowering teachers to foster positive relationships with their students in order to enhance engagement and learning, an impact supported by a substantial body of literature. The SRES does this by putting teachers in control of the whole data lifecycle: getting meaningful engagement and success data in, analysing those data, and personalising, customising, and targeting support and feedback at scale based on them.
A case in point: academics in Chemistry built a mobile-friendly grading interface in the SRES so that tutors could grade and provide rapid feedback to students in class on their mobile devices (image, A). These data would usually be recorded on scraps of paper, or forgone entirely because collecting them was too difficult. At the same time, data about the student in the SRES could be surfaced via the web app, so tutors could ensure students knew safety protocols before starting experiments, while protecting student privacy by supporting individual conversations. Armed with data collected live from the session, the coordinator could pre-compose highly customised messages for students based on the grades and feedback saved by the tutor (image, B). The coordinator can use each student's name, drop in data from the SRES database, and even conditionally show or hide parts of the message depending on the data. Students then receive timely emails that are totally personalised and targeted to their learning needs (image, C).
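To make the conditional messaging idea concrete, here is a minimal sketch of how a message might be assembled from a student record, with parts shown or hidden depending on the data. This is an illustration only, not the SRES implementation; the field names, threshold, and record structure are all invented for the example.

```python
# Hypothetical sketch of conditional message personalisation.
# Field names ('name', 'lab_grade', 'tutor_comment') and the grade
# threshold are invented for illustration; the SRES interface differs.

def compose_message(student):
    """Build a personalised email body from one student's record."""
    lines = [f"Hi {student['name']},"]
    # Conditionally include encouragement or targeted feedback
    if student["lab_grade"] >= 7:
        lines.append(
            f"Great work in this week's lab (grade: {student['lab_grade']}/10)."
        )
    else:
        lines.append(
            f"Your lab grade this week was {student['lab_grade']}/10. "
            "Your tutor noted: " + student["tutor_comment"]
        )
    lines.append("See you next week!")
    return "\n".join(lines)

student = {"name": "Alex", "lab_grade": 5,
           "tutor_comment": "revise the titration steps before next week."}
print(compose_message(student))
```

In practice the coordinator composes such templates through the SRES interface rather than in code, but the underlying logic (merge fields plus conditional sections) is the same.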
Because the SRES is so customisable, academics have surprised us with the creative ways in which they collect and use data that they consider important for their students and their learning. Many use it to capture attendance, using the SRES web app and a mobile or tablet camera to scan student cards, or an electronic roll within the SRES. When interacting with students in class, a tutor can bring up their engagement and success data from within the SRES web app and use that to inform the conversation. A number also use it for grading clinical assessments or in-class presentations by building a rubric and comment fields in the SRES web app. Some use the visualisation features to build customised dashboards of class performance. Most use the messaging feature to personalise and target pastoral and pedagogical support for select groups of students. Many set up SRES 'web portals' to provide personalised feedback and support in the form of a web page that can be embedded in the LMS. Some are even experimenting with the predictive analytics engine powered by machine learning.
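The "select groups of students" idea above amounts to filtering a cohort on teacher-chosen data before sending targeted messages. The sketch below shows the shape of that step under invented data; the field names and thresholds are hypothetical, not drawn from the SRES.

```python
# Hypothetical sketch of selecting a subgroup for targeted support.
# The records, field names, and cut-offs are invented for illustration.

students = [
    {"name": "Alex", "attendance": 0.45, "quiz_avg": 52},
    {"name": "Sam",  "attendance": 0.90, "quiz_avg": 78},
    {"name": "Jo",   "attendance": 0.60, "quiz_avg": 49},
]

# Flag students with low attendance OR low quiz averages for a check-in email
at_risk = [s for s in students
           if s["attendance"] < 0.5 or s["quiz_avg"] < 50]

print([s["name"] for s in at_risk])  # → ['Alex', 'Jo']
```

In the SRES this filtering is done through the web interface on whatever columns the teacher has chosen to collect; the point is that the teacher, not a central data warehouse, decides which data define the group.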
From speaking with academics who use the SRES, we think that this approach, as simple as it is, might be one answer to what they have been looking for in learning analytics: a platform that is teacher-driven and student-focussed, allowing them to connect meaningfully with more students at scale.
Find out more about how the SRES works and how you can get involved by attending the ASCILITE Live! webinar on Wednesday 28 June. Full details on the webinar are available here.