#Change11 #CCK12 #LAK12 Educational Challenges and Learning Analytics

The notion that a scientist needs to think like a scientist, an artist like an artist, a doctor like a doctor, and an educator like an educator is prevalent.  The same holds for students, who might base their behavior on what they can share and learn with their more knowledgeable others: their professors, and the professionals they admire within their domains.  This kind of modelling is rooted in an educational philosophy that has persisted throughout the past decades, Vygotsky’s zone of proximal development.

The present challenges that most of us as educators are facing include:

1. How do we ensure that our education is aligned with the needs and expectations of both our stakeholders and our students?  Our stakeholders could include the government, education authorities, our employers, administrators, and the students’ parents.  Since each stakeholder has expectations that may conflict or be in tension with the others’, educators and learners need to understand why their urges or needs are not easily fulfilled.  An example is the current demand for more customized education and learning, as shared in my previous post on learning with MOOCs and future education.

2. How do we ensure that professionalism and a sense of calling are enculturated in the community?  This has been a challenging issue.  What is the responsibility of educators to the stakeholders and students?  What assumptions come with taking on those responsibilities?  Teachers and educators have often assumed that they should be responsible for “teaching and supporting their students”.  When students failed to perform, teachers were the first to be called upon to explain those failures.  What should be the proper intervention by educators?  What would learners expect from educators, in terms of support and guidance?

The emergence of Learning Analytics, as reported by Rebecca, sounds promising.  “These included tools which applied statistical tests in order to predict, while courses are in progress, which students are in danger of falling behind.  The aim is to produce actionable intelligence, guiding students to appropriate resources and explaining how to use them.”
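To make the idea concrete, here is a minimal sketch of the kind of early-warning tool described above. Everything in it is hypothetical: the engagement fields, weights, and threshold are invented for illustration, not taken from any real analytics product.

```python
# A hypothetical early-warning sketch: combine simple engagement
# signals into a risk score and flag students who may be falling
# behind. All field names, weights, and thresholds are invented.

def risk_score(student):
    """Combine simple engagement signals into a 0-1 risk score."""
    score = 0.0
    if student["logins_per_week"] < 2:
        score += 0.4
    if student["assignments_submitted"] < student["assignments_due"]:
        score += 0.4
    if student["forum_posts"] == 0:
        score += 0.2
    return score

def flag_at_risk(students, threshold=0.5):
    """Return the names of students whose risk score meets the threshold."""
    return [s["name"] for s in students if risk_score(s) >= threshold]

cohort = [
    {"name": "A", "logins_per_week": 5, "assignments_submitted": 3,
     "assignments_due": 3, "forum_posts": 4},
    {"name": "B", "logins_per_week": 1, "assignments_submitted": 1,
     "assignments_due": 3, "forum_posts": 0},
]
print(flag_at_risk(cohort))
```

Such flags are only as good as the assumption behind them: that login counts, submissions, and forum posts actually track learning, which is exactly the validity question raised in this post.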

Although these all seem to offer wonderful strategies and tools for educators, there are concerns relating to ethics, privacy, ownership of data, and personal security.  Have we obtained permission from our learners for the personal data collected from them?  As educators, to what extent should we judge our learners on the behavior and interaction revealed in the learning analytics?

Jenny wrote in her post:

“Analytics may in time come to be used to judge you — as a learner, an educator, or your institution. The challenge for us is to debate what it means for this new breed of performance indicators to have pedagogical and ethical integrity. What can and should we do, and what are the limits? Do they advance what we consider to be important in learning, teaching, and what it means to be a higher education institution in the 21st Century?”

I think the use of push analytics under “big data” will soon become the way all learners, educators, and institutions are judged, as evidenced in the learning analytics.  I reckon we should be cautious in interpreting the data from these tools, as they rest on the assumption that such measurements reflect students’ performance and achievement in their learning in the course.  To what extent is this valid?

In my previous post I mentioned that: “Educators are in need for non-intrusive and automatic ways to get feedback from learners’ progress in order to better follow their learning process and appraise the online course effectiveness.”

As Roy recommended in his comments on my post, more dialectics is required to reveal the learning patterns of learners.  This could be achieved through nested narratives by learners, but that also means we need to take ethics, integrity, and ownership into consideration in the narrative research and learning analytics process.

10 thoughts on “#Change11 #CCK12 #LAK12 Educational Challenges and Learning Analytics”


  3. Learning analytics are very interesting, but as you noted, we must develop and use broad measures to determine whether learning has occurred. I recently began to use Twitter in one course. The results of the first exam were very positive (as I noted here http://idajones.wordpress.com/2012/03/03/twearning-twitter-learning-first-exam-results/), but even then I don’t know how much of the improvement is due to other factors. Part of my concern as faculty is obtaining assistance to develop, interpret, evaluate, and revise coursework based on this information, and also to develop accurate measures of learning.

  4. Hi, what do you think of my image of the future? Learning Analytics needs a certain kind of testing and a certain way of harvesting students’ results. The easiest way of testing and harvesting is to collect data of a certain kind: data that can be analysed. It is a fact of life, almost an educational law, that Learning Analytics will influence the content of education and the process of teaching. It will be like learning for the test, learning for the analytics. Because analytics will be used in as yet unknown ways, we need to think it over.

  5. Thanks for sharing, Ida. Using Twitter as a way to engage students sounds good to me. What criteria would you use to evaluate Twitter as a learning tool scientifically? Should one use a controlled experiment with Twitter networking?
    Renewed thanks for your visit.
    John

  6. Assessment is the difficult part. I’m actually doing a couple of things. I’m comparing grade distributions (which is the very beginning of assessment). Ideally, assessment would include a control-group comparison (which is part of what I’m doing by looking at the previous semester’s grades), and it might also include some behavior analysis. What would you suggest?

  7. How are your grades determined? Multiple choice (MC) can be a useful tool for assessment, especially on basic concepts, and to a certain extent it can assess people’s cognitive strategies and reasoning (i.e. why some choices are more rational and logical than others). However, I have often found that language can be a barrier in MC questions. People may choose the wrong answer because they misunderstand the question or the choices, yet still be strong in critical and analytical skills. So maybe Twitter could be used to test “true or false” sorts of questions, and to see why people have chosen “true” or “false” with their reasoning. I haven’t tried it, but I do think it could (a) challenge people to think about why some answers are right or wrong, and (b) stimulate people to engage with others through further conversation and putting forward arguments with reasons. So my suggestion for assessment is: “Engage people with interesting, though challenging, questions or tasks, and let them have a choice of response.” I think some themes relating to global issues may be of interest to students: climate change, future jobs, virtual identity, etc.

  8. Thank you for your suggestions. My tests are a combination of true-false, multiple choice, and short answer questions. Students are given several short answer questions in advance, then one or two of those are selected for the exam. I’ll continue to work on assessment.

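The grade-distribution comparison discussed in the comments above (this semester’s exam results against the previous semester’s as an informal control group) can be sketched in Python. The scores below are invented for illustration; the snippet shows a mean shift and a Cohen’s d effect size, not a substitute for a proper controlled experiment.

```python
# A hypothetical sketch of comparing exam-score distributions across
# two semesters. All scores are invented for illustration.
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Effect size: difference of means over the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled = (((na - 1) * stdev(group_a) ** 2 +
               (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled

previous = [62, 70, 74, 68, 71, 65, 73, 69]   # last semester's exam scores
current  = [75, 81, 78, 72, 80, 77, 83, 74]   # this semester, with Twitter

print(f"mean shift: {mean(current) - mean(previous):.1f} points")
print(f"Cohen's d:  {cohens_d(current, previous):.2f}")
```

Even a large effect size here would not settle the question raised in the comments: without controlling for other factors, the shift cannot be attributed to any single intervention.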
