#Change11 Use of Analytics in College and Higher Education Institutions

Thanks to Tony Bates for his post; here is an excerpt from Use of Analytics in College:

“At Rio Salado College, where all 41,000-plus students attend classes online, instructional priorities include a strong emphasis on personalization–helping nontraditional students reach their educational goals through programs and services tailored to individual needs. To achieve this personalization, the college has implemented weekly starts (in which students can choose to start a class at the beginning of any of 48 weeks throughout the year), 24/7 technology and academic hot lines, easy access to online advising, and now a Progress and Course Engagement (PACE) system for automated tracking of student progress–with intervention as needed.

Several institutions have developed learning analytics tied to their course management systems, specifically to provide early interventions that can help at-risk students. Detecting “at risk” behaviors requires a tracking system and sophisticated data modeling.”
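The excerpt above says that detecting "at risk" behaviours requires a tracking system and data modelling. As a minimal sketch of what the simplest such model might look like, here is an illustrative early-warning flag based on hypothetical weekly engagement metrics (the field names and thresholds are my assumptions, not Rio Salado's actual PACE system):

```python
# A minimal sketch of automated at-risk detection, assuming hypothetical
# activity metrics (logins, submissions, forum posts) pulled from a course
# management system. Field names and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    logins: int          # number of LMS logins this week
    submissions: int     # assignments submitted this week
    forum_posts: int     # discussion posts this week

def at_risk(activity: WeeklyActivity,
            min_logins: int = 2,
            min_submissions: int = 1) -> bool:
    """Flag a student for early intervention when weekly engagement
    falls below simple thresholds."""
    return (activity.logins < min_logins
            or activity.submissions < min_submissions)

# A student who logged in once and submitted nothing is flagged,
# even though they posted in the forums.
print(at_risk(WeeklyActivity(logins=1, submissions=0, forum_posts=3)))  # True
print(at_risk(WeeklyActivity(logins=5, submissions=2, forum_posts=0)))  # False
```

A real system would of course use far more sophisticated modelling than fixed thresholds, but even this toy version shows where the design choices (and the value judgements) creep in.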

In the future, Learning Analytics could help self-directed learners see how far they have progressed towards their learning goals, and help educators see the achievement of their students. (Rita's comments in Tony's post.)

A few farmers on the other side of the world had undertaken an AI course.

“The practical component of the exam required farmers to follow the hypothetical instructions from farmers and AI cows on five different farms.

Examiners watched their every move, including if they cleaned their boots between farms and even checked that the placement of the gun delivering the semen was correct.”

How would Learning Analytics be used in MOOCs – such as the AI course (the open and official standard course, with a link to Project 1 here), the ML course, Change11, or the CCK courses?

As Rita mentions here:

2. I like the idea that analytics might make it possible for student support services to be better matched to student needs, but coming from a background in adult education and widening access to Higher Education, I have seen my fair share of problems with using the deficiency model to support learners. I feel more comfortable with
3. the analytics model promoted by Erik Duval, who runs analytics on student activities and shows the students the results. This seems more empowering to learners, as it involves a need for reflection on their learning.
4. Analytics can also be run as a research tool, so teaching staff might get a better understanding of the learner experience and the problems learners might come across, in order to better match their teaching. Caroline Haythornthwaite showed us some of her visualisations of communication and group forming, which highlighted the insights that analytics might provide into the ties between learners in learning settings.

I have posted my views on Learning Analytics here and here.

I am reposting it here:

I don’t think my learners would like to be “observed” under those lenses, honestly, and as David mentioned, the interpretation of the findings could become the science.  This reminded me of the Quality Assurance and Improvement and Total Quality Management movements, which emphasise management by facts and data.  In an institutional setting, improvement efforts are plan-driven, and there is a need for control and intervention exercised by management, managers, and workers, based on a “scientific approach” using statistical analysis and control, together with a range of quality tools to improve and innovate in an organisation.

This may be a perfect solution when learning analytics are applied within an LMS and integrated learning environment, where institutional control is of critical importance. However, would this work in an autonomous to semi-autonomous online learning environment such as PLENK?  Would it provide the diagnosis promised by the learning analytics approach?

To a great extent, I think there is still a gap between conducting and interpreting Learning Analytics and understanding its significance from both an educator’s and a learner’s perspective – in particular the ethical dimensions and the privacy issues.  To what extent would participants like to be “analysed” under such a system?  If the learners are interested in learning, the analytics would reveal that some of them are actively participating in and contributing to the networked learning.  However, what about those who are not creating or participating that much, but are self-organised learners following their own learning pathways?  Surely these learners may appear as the outliers identified in the social network graphs, analysis, and statistics.  How would lurkers identified by such analysis be acted upon?  What would an educator do based on the findings of such Learning Analytics?  Intervention?  Reinforcement of learning? Review of teaching practice? More support for the learners? Or a change in the course/network design?
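To make the "outlier" worry above concrete: in a social network graph built from visible interactions, a self-organised learner who reads but rarely posts simply has no edges, and so falls out at the margins. A minimal sketch, using hypothetical reply data (the names and interactions are invented for illustration):

```python
# A minimal sketch of how lurkers surface as outliers in a learning
# network, assuming a hypothetical list of reply/comment interactions.
# The participants and edges are invented data for illustration only.

from collections import Counter

interactions = [  # (from, to) pairs of visible interactions
    ("amy", "ben"), ("ben", "amy"), ("amy", "cho"),
    ("cho", "ben"), ("ben", "cho"), ("amy", "ben"),
]
participants = {"amy", "ben", "cho", "dee"}  # dee enrolled but never posted

# Degree = number of interactions a participant appears in.
degree = Counter()
for src, dst in interactions:
    degree[src] += 1
    degree[dst] += 1

# Zero-degree participants are the "lurkers" such a graph would flag.
lurkers = sorted(p for p in participants if degree[p] == 0)
print(lurkers)  # ['dee']
```

The analytics can only ever count what is visible; whether dee is disengaged or quietly learning along her own pathway is exactly what the graph cannot tell us.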

I could, however, see the value of Learning Analytics in PLENK, in that it provides a picture and pattern of the learning of the network and community.

Here are a few recent resources on Learning Analytics.

George Siemens’ slides on Learning Analytics:

I like Jon’s slide here, where he highlights the hard and soft parts of technology and the use of analytics: