World insight: the global potential of learning analytics

Sharing data on students’ academic activities will bring the field to a tipping point, says Martin Hall

March 11, 2016
Digital footprint

Digital footprints provide an ever more accurate trace of what we do, how we behave and what we think. Learning is part of this. In one way or another, most students are digital. There’s opportunity here, as well as things to worry about. This is the focus of the new field of learning analytics, now coming of age as a community of practice across all forms of education, on a worldwide basis.

We are beginning to see how the careful and ethical use of digital data can improve learning and student opportunity. Comparisons based on large amounts of information, across different national higher education systems, can provide the quasi-experimental situations that are needed to find meaningful cause-and-effect relationships. If developed and used with care, and with an eye always on sound education principles, the next generation of learning analytics promises a closer alignment with students’ needs and aspirations.

What is learning analytics? Here’s a commonly accepted definition from SoLAR, the Society for Learning Analytics Research:

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.

This is not primarily about the technology or about institutional statistics. At its core, work in learning analytics is about improving our individual competences, empowering us to achieve our potential and our objectives in life. It’s about raising the level of education in general, to the good of society as a whole. While the day-by-day language of learning analytics is technical, these ultimate concerns go to long-standing questions about how learning happens and how it can be improved.

Where do the data for learning analytics come from? It’s sometimes assumed that this is a field concerned with wholly online courses, with Moocs. Of course, online courses generate significant datasets. But today, almost every student leaves a digital footprint.

Consider the most traditional of universities. Here, every student is in residence, all teaching takes place in classrooms, and there is a large and busy university library. This university will have a virtual learning environment, perhaps Blackboard or Moodle, that will hold the academic record of every student. The IT department will hold a record of when each student logs in, how long they stay online, and all the web pages they look at. The library IT system records each journal article that is opened and how long it stays open. Security records track movement around the campus: swipes in and out of buildings, face recognition from CCTV cameras. Smartcards record all purchases, from the bookshop to the students’ union bar.

There are clear ethical issues here. Should a university collect and store all this information? Has every student given informed consent? Is surveillance justified, and how is the information used? Does the university have appropriate safeguards to prevent these digital records falling into the wrong hands? Like other fields of practice, learning analytics requires its own codes of acceptable practice, a broad consensus that its objectives are to improve learning as a public good, and a community of interest that collaborates to mutual benefit.

The global opportunity for insights lies in aggregating data from individual institutions to the national level and then across different higher education systems. This will reveal consistent patterns that point to possible cause-and-effect relationships. For example, we know that there is a strong relationship between the quality of prior learning, socio-economic circumstances and levels of success at university. How do these relationships compare in different parts of the world, and what does this tell us about the efficacy of differing kinds of interventions?

How have earlier forms of learning analytics been used to improve learning? Studies by Jisc, the organisation that provides digital solutions for all Britain’s universities and colleges, have found that first-generation uses have focused on students at risk. Every student at every university leaves some digital trace in real time. If this trace is interrupted for more than a few days, it’s a reasonable assumption that something’s up. Universities have found that an automated alert sent to a tutor can be very effective. The tutor can, in turn, contact the student and offer support. Again, a global perspective serves to strengthen confidence in this approach; similar interventions have had beneficial outcomes when applied in Australia, the US and the UK.
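To make the idea concrete, here is a minimal Python sketch of the kind of rule such an early-warning system might apply. The function name, the five-day threshold and the data structure are illustrative assumptions, not a description of any university’s actual implementation.

```python
from datetime import date, timedelta

def students_at_risk(last_activity: dict[str, date],
                     today: date,
                     max_gap_days: int = 5) -> list[str]:
    """Return student IDs with no recorded digital activity for more than max_gap_days."""
    cutoff = today - timedelta(days=max_gap_days)
    return [student for student, last_seen in last_activity.items()
            if last_seen < cutoff]

# Example: last_activity maps a student ID to the most recent date on which
# any digital event (VLE login, library access, swipe-card use) was recorded.
activity = {
    "s001": date(2016, 3, 9),
    "s002": date(2016, 3, 1),   # quiet for ten days -> flagged
}
for student in students_at_risk(activity, today=date(2016, 3, 11)):
    print(f"Alert tutor: no recent digital trace for {student}")
```

In practice, the alert generated for each flagged student would be routed to a tutor, who can then contact the student and offer support, as described above.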

Building on this early evidence of efficacy, universities are increasingly interested in the benchmarks that learning analytics can provide. How do patterns of student attainment at one university compare with those at other universities that share a similar mix of facilities, teaching approaches and students’ socio-economic backgrounds? If a university can benchmark the digital footprints of its students against those of other institutions, then it can pace its improvements in learning outcomes.

Jisc is providing for this second generation of learning analytics by building a digital warehouse that has the capacity to store and analyse the digital footprints of all students in higher education in the UK – currently about 2.3 million people enrolled in 163 institutions. The scale of the warehouse allows both the protection of data and meaningful comparison; a participating university can compare its own data against an anonymised subset of similar institutions, course by course, and can be confident that its own data are not revealed to others.
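As a rough illustration of that benchmarking idea, the sketch below compares one course-level metric against an anonymised set of peer values. The metric, the peer figures and the function are invented for illustration; they do not describe Jisc’s actual warehouse or its interfaces.

```python
import statistics

def benchmark(own_value: float, peer_values: list[float]) -> dict[str, float]:
    """Compare a course-level metric (e.g. pass rate) with an anonymised peer group."""
    mean = statistics.mean(peer_values)
    stdev = statistics.stdev(peer_values) if len(peer_values) > 1 else 0.0
    return {
        "own": own_value,
        "peer_mean": mean,
        "gap": own_value - mean,
        "z_score": (own_value - mean) / stdev if stdev else 0.0,
    }

# Example: this university's pass rate on one course versus anonymised peers.
print(benchmark(0.78, [0.81, 0.75, 0.84, 0.79, 0.72]))
```

The point of the anonymised peer set is exactly the one made above: an institution learns where it stands, course by course, without any other institution’s data being revealed.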

The architecture of this system allows for innovation on an open access basis, for student consent for data records to be used, for teachers to have access to dashboards of information about their courses and for learners to be able to track their own performance against the general levels of attainment of their contemporaries. Such open innovation allows for interoperability with similar data warehouses for other national 中国A片 networks, to mutual and common benefit.

What will the next generation of learning analytics look like? Collaboration through pooling and sharing data, as the Jisc initiative is encouraging, will bring the field to an important tipping point. Rather than reporting retrospectively or looking for patterns in real time (as flagging students at risk does), we will be able to predict students’ learning preferences and future needs from their previous and current behaviour. This will allow for the personalisation of learning at a large scale: something that most universities cannot achieve without becoming unaffordable to all but a small elite of students.

Here’s an early approach with a good deal of promise. One of the long-established principles across higher education is that the successful outcome of a course in any disciplinary area will be the combination of specific subject knowledge and generic analytic skills. Put another way, we want our students to gain the confidence and ability to think for themselves, rather than slavishly repeating the content of the syllabus. A proxy for this higher-order skill development is the extent to which a learner moves from dependence on the teacher at the beginning of a course towards more autonomous interactions with fellow students. Good teachers have always known this: if your students are still hanging on your every word in the 10th week of the semester, rather than discussing and debating among themselves and with you, then the benefits of their studying with you have been pretty limited.

Snapp – Social Networks Adapting Pedagogical Practice – is a clever application that’s been around for a few years, developed under the leadership of the University of Wollongong. Snapp provides visualisation of the networks of interactions that result from online discussion forum posts and replies. Snapp diagrams reveal the key information brokers in a class, and how these relationships change through time. Increasing levels of student confidence will be revealed by progression from a hub-and-spoke pattern at the beginning of a course, where the lecturer is the dominant information broker, to an increasingly devolved pattern, in which a subset of learners emerges as information brokers in their own right.
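A minimal sketch of the underlying idea – not Snapp itself, which is a visualisation tool – might build a graph from forum reply pairs and ask who the dominant information broker is at different points in the course. The networkx library and the sample data below are assumptions made purely for illustration.

```python
import networkx as nx

def most_central(replies: list[tuple[str, str]]) -> str:
    """Return the participant with the highest degree centrality in the reply network."""
    graph = nx.Graph()
    graph.add_edges_from(replies)          # edge (a, b): a replied to b
    centrality = nx.degree_centrality(graph)
    return max(centrality, key=centrality.get)

# Week 1: hub-and-spoke -- every student replies only to the lecturer.
week1 = [("alice", "lecturer"), ("bob", "lecturer"), ("carol", "lecturer")]
# Week 10: devolved -- students broker information among themselves.
week10 = [("alice", "bob"), ("bob", "carol"), ("carol", "alice"),
          ("dan", "bob"), ("lecturer", "carol")]

print("Week 1 broker:", most_central(week1))    # the lecturer dominates
print("Week 10 broker:", most_central(week10))  # a student has emerged as a broker
```

The shift from the lecturer to a student as the most central node is the same progression, from hub-and-spoke to devolved, that the Snapp diagrams are designed to make visible.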

Snapp is a good example of a tool that can be used within the field of learning analytics to build a deeper understanding of interventions and their predicted outcomes in improving learning. Such experiments become useful when they show regularities after many independent iterations across diverse environments. Done a few times within the same university, Snapp diagrams will have little predictive power; the characteristics of brokerage could be a result of particular personalities, or the nature of the curriculum content, or a host of other extraneous factors.

Such recurrent and persistent patterns become much more interesting when the analysis is carried out repeatedly and independently, across the kinds of very large multi-institutional datasets that Jisc is building, and through sharing information across national domains. By teasing out such commonalities, we will be able to focus the power of these very large datasets on the learning needs of individual students, anywhere.

Martin Hall is former vice-chancellor of the University of Salford.

Reader's comments (1)

Great article. I'm pleased to say that this chimes with our experience in deploying these 'second generation' #LearningAnalytics technologies with a number of universities. Allowing institutions to build their own understanding, using a tool that lets them personalise their own service delivery, is key. Every institution is different; in fact, most schools have nuances and learner demographics that mean it's important to interpret data relative to their circumstances. No individual is an average, after all.
