Monash University and collaborators bring learning analytics to life
Tue, 14th Jul 2020

A new model for learning analytics has been developed by Monash University researchers to help creators and innovators better understand the demands and requirements of educational technology.

The model was developed following a systematic literature review of learning analytics dashboards.

The researchers state they were driven to better understand the role of technology in educational settings, from the widespread use of learning management systems to social media, interactive simulations, and learning games, as well as the growing capacity for capturing data.

According to the researchers, although learning analytics dashboards are frequently developed by education systems and technology vendors with the goal of supporting self-regulated learning, there is little understanding of how the current generation of dashboards is equipped to support the development of self-regulated learning.

As a result, a team from the Faculty of Information Technology (IT), in collaboration with colleagues from the University of Edinburgh and the University of South Australia, developed a model for a user-centred learning analytics system.

It consists of four interconnected dimensions: scientific research on learning and education, human-centred design, educational feedback, and evaluation.

By conducting an analysis of existing empirical studies about the use of learning analytics dashboards, Faculty of IT lead researcher Professor Dragan Gasevic and his team found that existing learning analytics dashboards are rarely grounded in recommendations established in educational research.

Gasevic says the new user-centred learning model emphasises critical properties of self-regulated learning by focusing on the metacognitive, cognitive, affective, and behavioural aspects of learning, and will guide the future work of developers, researchers, and adopters in creating better learning systems.

He says, “Despite the growing adoption of learning analytics dashboards, there are many limitations in the design of these systems, which our research has identified.

“Particularly, learners find it hard to interpret the data presented in dashboards and to make use of that feedback to inform future learning strategies.”

The results also showed that learning analytics dashboards cannot be said to empower students to advance the understanding of their own learning.

Current learning analytics dashboards often fail to offer advice to students and teachers on the use of effective learning tactics and strategies and have significant limitations in how their evaluation is conducted and reported, the researchers state.

Gasevic says, “Another major concern is that the impact of learning dashboards and recommendation systems on student learning and success is found to be relatively low.

“Feedback presented in learning analytics dashboards can also be difficult to translate into meaningful, actionable recommendations to guide students in their learning.”

Properly developed learning analytics tools can provide teachers with additional insights into student learning strategies and also provide students with personalised advice on their performance, the researchers state.

In conclusion, the researchers state that the value of learning analytics in supporting the development of self-regulated learning is voiced by many stakeholders.

It is especially relevant in times of digitalisation, in which policymakers and education leaders acknowledge self-regulated learning as essential for the future of life and work.