Limitations of Awareness Support in Education and Learning

Last week the SURF Academy organised a seminar on learning analytics. Hendrik nicely tweeted from the event, so I was able to follow it. After he posted a comment about measuring the performance of teachers, I needed to respond. My prime criticism is that the type of analytics he describes is not learning analytics but rather boring performance benchmarking, and that this, if done by the wrong people, might have legal implications beyond what Hendrik and Wolfgang outlined in their presentation.

I worked on the topic for several years, although I do not use the currently popular term "learning analytics" because it emphasizes the statistical procedures over the actual or potential use and usefulness of the resulting data. Instead, I prefer the term "awareness support" because it includes the purpose of how the data should be used and helps to focus on appropriate solutions. The entire topic is very new and needs some clarification. In this article I try to focus on my understanding of what "learning analytics" is about. 

What is Learning Analytics?

Learning analytics is a nice fuzzy new term that is currently in the making. I believe it is very important to define learning analytics very precisely in order to avoid confusion with other applications of analytics. I think Hendrik agrees.

My 140 character version of a learning analytics definition is 

The prime task of #learninganalytics has to be the support of those who are directly involved in the learning process. 

Ok, this is pretty fuzzy, and so Wolfgang Reinhardt asked who these directly involved actors are. Interestingly, his comment reflects a rather old perspective on educational and learning processes, one that involves teachers and students (note: teachers first). 

Consequently, answering this question requires a better understanding of what learning processes are. For my Ph.D. thesis I used the following definition of learning processes: 

Learning processes refer to those interactions of a person with an environment that result in changes of behaviour, knowledge, or attitude.

There are a few important things to note about this definition, besides the fact that it describes learning without mentioning teachers.

Firstly, learning processes are interactive processes. Unlike context-free stimulus-response sequences, interactive processes are context-sensitive action-feedback loops. This means that a learner's actions have an effect, and that this effect is observable by the learner as feedback on the initial action. Within this definition, feedback is fundamental to learning. 

Secondly, learning always happens in interaction with an environment. This can be a laboratory or a garden where the learner works alone; it can be the peer group or a class in which the learner's actions create socially grounded reactions; or it can be a (computer) game that is independent of real-world social or environmental constraints.

Thirdly, learning results in a change. The definition says "behaviour, knowledge, or attitude", but in the end it boils down to changed behaviour, because the other two dimensions refer to cognitive states that are not directly observable. Two primary types of behavioural change are widely referred to in the educational literature: effectiveness and efficiency. Effectiveness of learning means that the learner shows new behaviour; efficiency means, in this context, that the learner does something better than before. 

When I talk about "those who are directly involved in the learning process" I mean the learner and everybody in the learner's environment who can or should provide feedback to the learner. This can be peers, parents, teachers, managers, or colleagues. It is not the teacher's supervisor who criticizes teaching efficiency, and it is certainly not the provider of the virtual learning environment that happens to be used in a course. Two years ago we presented a paper at the ECTEL 2009 conference about who is in control of personalizing learning experiences. It gives a good introduction to this line of reasoning.

Returning to the original question: what is learning analytics? Learning analytics is not just about analyzing data but also about monitoring processes; it is crucial to understand that you cannot have one without the other. The word learning in the term indicates that the type of analysis is directly related to the action-feedback loop. Although the theory sounds simple, supporting action-feedback loops requires a great deal of situational awareness, both of the learner and of anybody who can support learners. Therefore, solutions for learning analytics aim to improve the situational awareness of the actors in learning processes, and tracked data can be highly valuable for that purpose. This is why I prefer to talk about awareness support instead of learning analytics. 

I distinguish two meta-types of analytical data in learning analytics:

  1. Data as feedback. This kind of analytics helps to generate meaningful responses that help learners to develop a better understanding of their learning process and support them in regulating their learning activities. Reflection-support widgets and recommender systems are good examples of this data-as-feedback approach.
  2. Data for feedback. This kind of analytics provides other actors or systems in a learner's environment with information they can use to give feedback or offer supportive activities. Teacher dashboards that help to identify learners who need support are a widely available solution of this type. 
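The distinction between the two meta-types can be sketched in a few lines of code. This is a minimal illustration, not an implementation from the thesis: the event schema, function names, and the 50% completion threshold are all assumptions made up for the example.

```python
from dataclasses import dataclass

# Hypothetical tracked event; the fields are illustrative assumptions,
# not a standard learning-analytics schema.
@dataclass
class ActivityEvent:
    learner: str
    activity: str
    completed: bool

def feedback_for_learner(events, learner):
    """Data *as* feedback: a reflection summary returned to the learner,
    supporting self-regulation of their own learning activities."""
    own = [e for e in events if e.learner == learner]
    done = sum(e.completed for e in own)
    return f"{learner}: {done}/{len(own)} activities completed"

def dashboard_for_teacher(events, threshold=0.5):
    """Data *for* feedback: flag learners whose completion rate suggests
    they may need support, so a teacher (or peer) can respond."""
    per_learner = {}
    for e in events:
        done, total = per_learner.get(e.learner, (0, 0))
        per_learner[e.learner] = (done + e.completed, total + 1)
    return sorted(l for l, (d, t) in per_learner.items() if d / t < threshold)
```

Note that both functions read the same tracked data; what differs is who receives the result and what kind of feedback it enables, which is exactly the point of the distinction.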

In my opinion, if an approach does not fall into these two categories, it is very likely that it is not about learning analytics but about analyzing something for a different purpose.

Limitations of and challenges for Learning Analytics

The biggest challenge of learning analytics is to provide appropriate (social) information for learning support on the one hand, and to ensure data protection on the other. When talking to educational practitioners, I found that (cyber) bullying has been a major concern. More recently, also prompted by inspiring research results from public social networks, educational practitioners have realised that their own personal data could be at stake as well (and potentially used against them). That these concerns do not come out of thin air is shown by Hendrik's comment on "educational business analytics". The problem at hand is that the line between learning analytics and educational business analytics is thin and easily crossed. 

In times of Google and Facebook tracking huge amounts of personal data, one might argue that this discussion will become obsolete sooner or later. Personally, I am not that optimistic, because of existing legislation on data tracking and employee monitoring. For example, Switzerland and Liechtenstein forbid systematic employee monitoring altogether and restrict such monitoring to law enforcement. Although no such legislation exists in Austria or Germany, permanent activity monitoring at the workplace is considered legal only in the case of strong suspicion of a criminal offence (see the basic overview of the German case). I have heard that similar legal views are shared in other European countries as well. 

This affects learning analytics because most educational practitioners are employees and most approaches to activity monitoring require systematic data collection. If learning analytics is not limited to the definition above, it goes beyond monitoring for production purposes, which is typically considered legal. 

The driving question is "who has access to what analytical data?"

The naïve approach to data protection is to restrict access to monitoring data altogether, both technically and legally. This ensures that nobody has access to potentially compromising data; as a consequence, nobody has access to potentially useful information either. This approach is very popular where legislation about monitoring data is very rigid or unclear (as in Austria).

A more relaxed variant, provided by many educational systems, is to give instructors monitoring data for their unit of learning (or course) and to limit access to this data to the duration of that unit. Although this solves some problems, it limits possible learning analytics approaches to the data-for-feedback type. Another drawback is that historic data is erased and lost.

Alternatively, I have proposed layered social perspectives for filtering and anonymising data. These perspectives are built directly into the analytical functions, which enable learners and teachers to access relevant information on the one hand, but restrict the data granularity depending on the role of the person who requests an analysis on the other. That way, even learners can get anonymous analytical information about their peers. 
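The idea of role-dependent granularity can be sketched as follows. This is a toy sketch under my own assumptions, not the actual perspective model from the thesis: the roles, the dictionary-based event records, and the specific aggregation rules are all invented for illustration.

```python
def perspective_view(events, requester, role):
    """Return the same tracked events at a granularity that depends on
    who asks (hypothetical roles: "learner", "teacher", anything else)."""
    if role == "learner":
        # Learners see their own records in full detail ...
        own = [e for e in events if e["learner"] == requester]
        # ... but only an anonymous aggregate about their peers.
        peers = [e for e in events if e["learner"] != requester]
        rate = sum(e["completed"] for e in peers) / len(peers) if peers else 0.0
        return {"own": own, "peer_completion_rate": round(rate, 2)}
    if role == "teacher":
        # Teachers see per-learner aggregates, not raw event streams.
        summary = {}
        for e in events:
            done, total = summary.get(e["learner"], (0, 0))
            summary[e["learner"]] = (done + e["completed"], total + 1)
        return {l: f"{d}/{t}" for l, (d, t) in summary.items()}
    # Anyone outside the learning process gets nothing.
    return {}
```

The design point is that the filtering lives inside the analytical function itself, so there is no code path that hands raw monitoring data to a requester whose role does not warrant it.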


It is important that future development of learning analytics builds on a strict definition for identifying approaches. I propose a relatively narrow, learning-process-centred definition that links to the concept of situational awareness. Having "educational business analysts" hijack the term would be highly undesirable, because the legal constraints of employee protection would then hinder exploiting the full potential of analytical approaches for learning support.