Learning Analytics

Perceptions of Progress: Learning Analytics and Social Learning Behaviors

Dr. Deborah Everhart

Slideshare presentation

Recorded presentation, Educause 2012


Learning analytics provide context for social interactions, helping learners visualize goals, progress, and achievements in ways that can positively influence the development of authentic learning communities. This short paper provides multidisciplinary context for applications of learning analytics visualizations, drawing on social exchange theory to understand how learners allocate their attention, and on motivational analysis of how reputation develops in learning communities.


Traditional teaching models, whether or not they use technology, generally provide students with inadequate ways of judging their learning progress. In courses that emphasize covering the content and then assessing whether or not students have consumed a specified set of materials, the focus is on the quantity of material the student has adequately digested rather than the quality of the learning. The rewards are typically grades or scores distributed infrequently and according to formulas that may be known only to the instructor.

This model, combined with the fact that most students need to learn how to learn, leads students to budget their time and attention based on what they can discern about grade distribution. Grades provide positive, negative, or neutral feedback periodically, often significantly delayed from the time the assignment or test was submitted. Furthermore, grades often do not provide adequate context for student understanding of progress: they are too fragmented, too few, and/or calculated in ways that are unclear to students. Many courses do not have clearly stated learning objectives, and/or the objectives are not tied to course activities in a way that helps students understand progress toward their grades, much less progress toward lifelong learning goals. This situation is exacerbated by the fact that instructors have limited capacity for more frequent or more detailed grading and feedback activities.

In this context, students have little opportunity to become active learners, particularly if they are adult learners with serious time constraints returning to school or others who may feel forced into grade/degree requirements that they cannot control. This model persists despite research that demonstrates that grades and other extrinsic motivators are generally less effective for cognitively challenging tasks than intrinsic motivations such as “autonomy, mastery, and purpose.” [1] Given little authority over their own learning, students in this model practice little meta-cognition and generally do not master the self-aware inquiry skills needed for most knowledge economy professions.

Furthermore, peer effects can be powerful motivators, but most courses are not designed to facilitate peer interaction, particularly not in a way that weds social learning with authentic learning objectives. Hence students are largely left on their own to determine how they are progressing through a course and whether or not they are learning anything valuable.


As learners make their own attempts to discern progress, sometimes reaching out to their peers for help, access to appropriate data and social interactions around that data can lead to critical inflection points in student behavior. Changes in student behavior can be motivated not only by additional useful inputs, such as data about participation, but also by enhanced communication about what that data means in a social learning context. This is where the emerging body of social learning analytics can be usefully applied, to “mak[e] visible, and in some cases potentially actionable, behaviors and patterns in the learning environment that signify effective process. In particular, the focus is on processes in which the learner is not solitary, and not necessarily ‘doing work to be marked’, but is engaged in ‘social’ activity either by virtue of interacting directly with peers… or using collaborative platforms.” [2]

The application of social learning analytics in environments where students are learning to learn can amplify the benefits of peer interactions by providing meaningful data and visualizations that learners respond to as they make judgments about the value of their learning investments.


Social exchange theory is based on the premise that relationships among people are based on an individual’s analysis of the value of the relationship, the rewards minus the costs. The time and effort it takes to develop trust, understand shared interests, and learn how to cooperate, for example, are balanced against the benefits of acceptance, support, and shared outcomes.

Students, like everyone else, have limited time and attention, and therefore decide where to “spend” these “assets” among interactions with the course materials, classmates, teachers, and advisors. Juggling priorities, they may decide that social interactions require too much investment. In their engagement with the course materials, they make brutally calculating decisions about how to apply their time and attention, for example whether to go to class or fast-forward through the lecture recording, or how little time to spend on test preparation to get the needed grade. In a teaching model where the rewards of the course are based on grades and the students have little motivation for authentic learning, calculations that minimize social interactions often appear to be the efficient path.

Since investments in social relationships often have higher costs than simply figuring out how to use the course materials to get the grade, social learning opportunities need to be readily available and clearly valuable in order for students to choose to take advantage of them. Even then, students are likely to be calculating in their choices: they want to find classmates who are at least as strong as they are when they form study groups; they choose to combine their notes to get a more comprehensive set; they ask their peers questions to put their own course participation and grades in context, often to find out if there’s an easier way to get the grade. Adult learners may make cost/benefit decisions based on the relevance of the learning experience to their work lives, in some cases a very literal investment in the short-term cost of learning time based on its longer-term benefit to salary and employment opportunities. These dynamics will not change simply because technology provides different social learning opportunities; the motivations for authentic learning must also become part of the mix to elevate the type of social learning that takes place.
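The core calculation of social exchange theory can be sketched in a few lines. This is a toy illustration, not a formal model: the reward and cost weights below are invented, and real learners weigh these factors implicitly rather than numerically.

```python
# Toy illustration of social exchange theory's core calculation:
# net value = rewards - costs. All weights below are invented.

def exchange_value(rewards, costs):
    """Net value of an interaction: total rewards minus total costs."""
    return sum(rewards.values()) - sum(costs.values())

# A study group offers shared notes and peer support, but building
# trust and coordinating schedules is expensive.
study_group = exchange_value(
    rewards={"shared_notes": 3, "peer_support": 2, "better_grade": 4},
    costs={"building_trust": 5, "scheduling": 3},
)

# Solo test cramming has fewer rewards but minimal coordination cost.
solo_cramming = exchange_value(
    rewards={"needed_grade": 4},
    costs={"prep_time": 2},
)

# A purely grade-driven learner picks the higher net value -- here,
# the solitary path, matching the pattern described above.
print(study_group, solo_cramming)  # 1 2
```

With these invented weights the solitary path wins, which is exactly the calculation the paper argues must be shifted by making social learning clearly valuable.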

If learners are motivated to form higher investment relationships, one path is to find or become collaborators and mentors. And as their learning networks grow, systems of reputation and recommendation form. These can provide benefits not only during a course, but also across one’s learning career. Authentic learning networks are increasingly important in the knowledge economies that are steadily displacing industrial economies. Whether learners are new to the workplace or re-tooling, to achieve professional goals they need to understand the benefits of collaboration and learn how to make cost/benefit decisions beyond the calculation of how to get a grade.


Unfortunately, today’s students are given little insight into what it takes to succeed. Even straightforward technologies for providing evidence of ongoing progress or comparisons with peers are largely missing. For example, even though researchers have demonstrated clear correlations between LMS activity and learning outcomes[3], most students do not have access to learning analytics data for their own or their peers’ LMS activity and outcomes.

At the University of Maryland, Baltimore County (UMBC), researchers have five years of data demonstrating a correlation between student performance as defined by grades and activity in the LMS. Students earning a D or F use the LMS on average 39 percent less than students earning higher grades.[4] Rather than keeping this data at an administrative level, UMBC provides a “Check My Activity” tool that lets students see their own participation data and compare it to that of others in the same course. Students can also see a grade distribution report that shows, anonymously, the average activity associated with each grade in the course, while the course is in progress, allowing them to make their own judgments and potentially change their behavior mid-course.
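The two views such a tool provides are conceptually simple. The sketch below is hypothetical: the records, field names, and metrics are invented for illustration and do not reflect UMBC's actual tool or data schema. It shows a student's activity compared to the course average, and average activity per grade band.

```python
# Hypothetical sketch of a "Check My Activity"-style comparison.
# Sample data and field names are invented, not UMBC's actual schema.

from statistics import mean

# (student_id, lms_sessions, grade) for one course -- invented sample data
records = [
    ("s1", 120, "A"), ("s2", 95, "B"), ("s3", 40, "D"),
    ("s4", 110, "A"), ("s5", 35, "F"), ("s6", 80, "C"),
]

def activity_vs_course(records, student_id):
    """Compare one student's LMS activity to the course average."""
    own = next(sessions for sid, sessions, _ in records if sid == student_id)
    course_avg = mean(sessions for _, sessions, _ in records)
    return own, course_avg, (own - course_avg) / course_avg * 100

def avg_activity_by_grade(records):
    """Anonymous grade-distribution view: mean activity per letter grade."""
    by_grade = {}
    for _, sessions, grade in records:
        by_grade.setdefault(grade, []).append(sessions)
    return {g: mean(v) for g, v in by_grade.items()}

own, avg, pct = activity_vs_course(records, "s3")
print(f"Your sessions: {own}, course average: {avg:.1f} ({pct:+.0f}%)")
print(avg_activity_by_grade(records))
```

Even this minimal comparison gives the student in question a concrete signal: her activity is well below the course average, and the grade bands show what activity levels tend to accompany higher grades.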

This type of exposure to learning analytics can be a powerful motivator. Students become more aware of their activities and time on task in their courses. And while the technology is not comprehensive (some courses have more online activities than others), once a student has crossed that important threshold of self-awareness, she can weigh the balance between her online and offline activities even without data on the offline ones.

Recent studies at UMBC show that students who have chosen to use the Check My Activity tool (as an optional way of understanding their progress) are twice as likely as other students to earn a C or higher. Currently this can only be demonstrated as a correlation, but research is underway to determine how and to what extent using Check My Activity changes student behavior in ways that improve grades. Additional dimensions of data are being analyzed to study factors such as overall GPA, comparisons to other students in the same major, and comparisons to students who took the same course from different faculty and/or in different modalities.

The UMBC research is also revealing important correlations between course activity and students’ ability to apply their learning beyond a single course. Students in case study courses that require a high level of activity later score higher on department-wide standardized tests. And students with high levels of activity in case study introductory prerequisite courses are not only more likely to pass those courses, but they also achieve on average half a letter grade higher in subsequent courses that require the prerequisite courses.

How can these achievement gains be amplified? It should not be a surprise that even simple data visualizations on basic data like comparative activity and grades are beneficial. When a student compares her activity and grades to those of others in the course, she can mentally fill in the blanks, understanding what the comparisons indicate. This is where social interactions can become very valuable, as the student discusses her progress in the course with others and combines the data visualizations with shared awareness of others’ experiences. Did others find that reviewing the lectures helped with the test? Do the discussions provide useful context for the course content? Are there others interested in group study? Am I making adequate progress in comparison to my peers?

Based on this increased understanding of the value of specific behaviors, students can make better informed decisions about how to use their limited time and whether or not to change their behaviors. More detailed data can help students make more granular decisions, such as whether to spend more time reading and contributing to discussion forums or more time reviewing lectures to prepare for a test, based in part on comparisons to what other students are doing—which of course is varying in real time. Even if changes in behavior are based on cost/benefit analysis of what it takes simply to achieve a grade, the availability of learning analytics can help students become much more informed and self-aware learners.

Visualizations of learning analytics can influence both student and instructor behavior. Instructors can identify disconnected students who may be at risk and/or change the way they are facilitating the course. Students can better understand their own role in the learning community of the course and benchmark their own performance. Providing learning analytics data visualizations for those who are shaping and participating in learning activities can facilitate best practices in teaching, promote self-aware learning, and shift the perspective from “black box” teaching and learning to more open dialogue about how learning happens. “When social learning analytics tools and their data are in the hands of learners, who may become at least as proficient if not more so at using them than their tutors or administrators, the balance of power has shifted significantly.”[5]


Taking learning analytics to the next level of value could be fast-tracked by leveraging social interactions and common human tendencies, without necessarily requiring complex technology solutions such as recommender engines or automated interventions. For example, the common human tendencies toward goal achievement, peer recognition, and competition are powerful motivators that can be catalyzed by contextualized learning analytics data and relatively simple mechanisms for responding to that data. Data visualizations combined with scaffolding for social reputation building can help learners make cost/benefit decisions based on authentic learning objectives and constructive learning communities, whether or not grades or calculated “marks” are also part of the mix.

For example, iSpot is an Open University collaborative field research site where users add their observations of birds, reptiles, plants, etc.[6] Contributors regardless of their level of expertise become part of a community that learns from each other through shared verification of identifications. The iSpot reputation system is carefully weighted so that novices can learn from experts and gradually earn a higher level of trust and expertise within the system, such that learners can grow to mentor each other. Contributors have access to detailed data about how others have responded to their identifications and how their level of expertise is increasing, achieving icons that show their level of reputation. Social structures are provided for connecting with others through shared interests, identifications, and other collaboration. This framework provides opportunities for learning networks to form around the authentic learning activities of spotting species in the field. Participants’ motivations are real, and their sense of progress is transparent.

A far-reaching example of reputation frameworks is provided by the Mozilla Open Badges for lifelong learning initiative, which provides a framework for identifying and rewarding learning accomplishments from non-traditional learning contexts. The framework includes scaffolding for defining badges that represent the achievement of certain competencies, methods for assessing learners’ accomplishments (including mentor and peer assessment), badge endorsement by experts and authorities, badge collection and display as part of learners’ identities, and the use of badges to evaluate learners’ capabilities for employment, education, and other opportunities. In this “connected learning ecology,” [7] the value of the badges is derived from their social context, including who grants, who receives, and who recognizes these badges.
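The anatomy of such a badge can be sketched as a simple record. The fields below are a simplified illustration of the kinds of information the framework describes (criteria, evidence, endorsement, identity); they are not the official Open Badges assertion schema, and all names and URLs are hypothetical.

```python
# Simplified, hypothetical sketch of a badge record, loosely modeled on
# the kinds of fields the Open Badges framework describes. This is NOT
# the official assertion schema; all names and URLs are invented.

import json

badge = {
    "name": "Field Identification: Intermediate",
    "criteria": "Correctly identify 25 species, confirmed by peer review",
    "issuer": "Example Learning Community",        # hypothetical issuer
    "recipient": "learner@example.org",
    "evidence": "https://example.org/observations/1234",  # link to the work
    "endorsements": ["mentor@example.org"],        # who vouches for it
    "issued_on": "2012-06-01",
}

# Badges are designed to be portable: serialized, displayed, and verified
# outside the system that issued them.
print(json.dumps(badge, indent=2))
```

Note how the social context the paper emphasizes is carried in the record itself: who issued the badge, what evidence backs it, and who endorses it.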

The badge infrastructure is an open, adaptable framework that can be applied to a range of learning opportunities. After-school programs, internships, online non-accredited courses, work experience, military experience, and other learning contexts can be the inputs to a structure where mentors and experts help learners understand and evaluate their progress toward the accomplishment of specific badges. In this context, what one has learned becomes literally, visibly, part of one’s identity. The personal enrichment benefits of learning provide motivations for further learning, as well as the extrinsic benefits provided by badges. This is a context where the “social knowledge-creation and negotiation infrastructures [are] built on quality relationships and conversations – not impersonal ‘transactions’.”[8]

For example, Carnegie Mellon University has launched a badge project as part of the Computer Science Student Network (CS2N), which provides a distributed learning infrastructure for computer science and STEM skills.[9] Learners participate in a scaffolded environment and work on achievements ranging from entry-level skills to industry certification. Badge pathways provide a clear view of progress, as learners can see directly how lower-level accomplishments lead to higher-level accomplishments. Creative competitions provide additional motivations and opportunities for peer review and learning from others’ work. Should they choose to, learners can progress to the levels of achievement that tie into industry-accepted certifications and entries to employment.

The system also includes learning opportunities for teachers, who can achieve badges demonstrating their pedagogical skill levels and ability to accurately certify the work of learners. This layer of the scaffolding helps scale the system and build reputation systems by which learners (including teacher learners) recognize each other’s accomplishments.

Badge projects are maturing very quickly in many arenas, from informal learning activities for K-12 students to formal systems for identifying and verifying the skills of veterans. Employers are beginning to recognize the value of the more detailed information provided by badges in contrast with the opaque, minimal information provided in college transcripts.

Although the badges movement is still in its infancy, it holds tremendous potential as learners, educational institutions, organizations, employers, and others enter this new “connected learning ecology” and shape it, adding tools, visualizations, processes, and social structures as it evolves. “A key lesson from the social web paradigm… is that when empowered with appropriately flexible tools, an ecosystem grows in which new roles are created for different kinds of user to customise their tools… in new, more effective configurations.”[10]

Helping learners understand their progress toward goals is just the beginning, but an important starting point in leveraging learning analytics and social interactions to facilitate authentic lifelong learning.


Special thanks to the following who have made significant contributions to this paper: John Fritz, Asst. VP, Instructional Technology & New Media, University of Maryland Baltimore County; Erin Knight, Sr. Director of Learning, Mozilla Foundation; and Jeanine Turner, Associate Professor, Georgetown University.



[1] Pink, Daniel H. Drive: The Surprising Truth About What Motivates Us.

[2] Buckingham Shum, p. 4

[3] See references for Arnold, Fritz, Macfadyen, and Morris

[4] Fritz, J.

[5] Buckingham Shum, p. 21

[6] iSpot, The Open University, http://www.ispot.org.uk/

[7] As defined by the MacArthur Foundation’s Digital Media and Learning Initiative, “‘Connected learning’ is: 1) participatory, demanding active social engagement and contribution in knowledge communities and collectives; 2) learner-centered, empowering individuals of all ages to take ownership of their learning linked across a wide range of settings — in school, at home, and informally with friends and peers; 3) interest-driven, propelled by the energies of learners pursuing their unique passions and specialties; and 4) inclusive, drawing in people from diverse backgrounds and walks of life across generational, socioeconomic, and cultural boundaries.” The Mozilla Foundation; The MacArthur Foundation (2012). Open Badges for Lifelong Learning, p. 13.

[8] Buckingham Shum, p. 20

[9] CS2N: Computer Science Student Network, Carnegie Mellon University and DARPA, https://www.cs2n.org/about

[10] Buckingham Shum, p. 21


Arnold, K. (2010). “Signals: Applying Academic Analytics.” EDUCAUSE Quarterly, Vol. 33, No. 1

Buckingham Shum, S. and Ferguson, R. (2011). Social Learning Analytics. Available as: Technical Report KMI-11-01, Knowledge Media Institute, The Open University, UK. http://kmi.open.ac.uk/publications/pdf/kmi-11-01.pdf

“The Computer Science Student Network Badge System.” Digital Media + Learning Competition 4. http://dmlcompetition.net/Competition/4/badges-projects.php?id=3213

CS2N: Computer Science Student Network. Carnegie Mellon University and the Defense Advanced Research Projects Agency (DARPA). https://www.cs2n.org/about

Duval, Erik (2011). “Attention Please! Learning Analytics for Visualization and Recommendation.” Proceedings of LAK11: 1st International Conference on Learning Analytics and Knowledge 2011. (pp. 9-17)

Emerson, Richard M. (1976). “Social Exchange Theory.” Annual Review of Sociology, Vol. 2 (1976), pp. 335-362

Fritz, J. (2011). “Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers.” Internet and Higher Education, Volume 14, Issue 2 (pp. 89-97)

Hagel, J.; Seely Brown, J.; Davison, L. (2010). The Power of Pull. Basic Books: NY

iSpot, your place to share nature. The Open University, UK. http://www.ispot.org.uk/

Macfadyen, L.; Dawson, S. (2010) “Mining LMS Data to Develop an ‘Early Warning System’ for Educators: A Proof of Concept.” Computers & Education, vol. 54, no. 2 (pp. 588–599)

Morris, L.; Finnegan, C.; Wu, S. (2005). “Tracking Student Behavior, Persistence, and Achievement in Online Courses.” The Internet and Higher Education, vol. 8, no. 3 (pp. 221–231)

The Mozilla Foundation; The MacArthur Foundation (2012). Open Badges for Lifelong Learning, https://wiki.mozilla.org/File:OpenBadges-Working-Paper_012312.pdf

Pink, D. (2011). Drive: The Surprising Truth About What Motivates Us. Riverhead Trade.

Siemens, G. (2012). “Leaping the Chasm: Moving from Buzzwords to Implementation of Learning Analytics.” EDUCAUSE Live! http://www.educause.edu/Resources/LeapingtheChasmMovingfromBuzzw/246001

Siemens, G. (2011). “Learning and Academic Analytics.” http://www.learninganalytics.net

This work is licensed under a Creative Commons Attribution 3.0 Unported License.
