What can be learnt from the challenges faced in the use of learning analytics in tertiary institutions, when considering its application in secondary education?
Photo by Frank Dabek
I posed this question after reading several sources regarding the use of learning analytics in education. As a secondary school teacher I was interested in finding out whether there was anything to be learnt from the application of analytics in a tertiary setting, before it is embedded into secondary schooling. The NMC 2013 Horizon Report claims that within 2-3 years learning analytics will have developed beyond the 20% penetration point. After summarising the sources I found common themes in the challenges faced when using learning analytics:
- Driving forces behind analytics
- Error correction and data override
- Collection of valuable data
- Ethics, morals and privacy
I will evaluate each challenge as it applies in the secondary context, with the aim of raising the achievement of learners and informing successful teaching.
Driving forces behind analytics
The first area is a personal concern of mine: I think it is extremely important to create a system which is predominantly beneficial to the learner and teacher. Siemens and Long (2011) draw a comparison between learning analytics and academic analytics, with learning analytics being of benefit to the learner, and academic analytics benefiting the academic institution. Recent policy change in the UK (Education.gov.uk, 2013) has allowed schools to introduce performance-related pay for secondary school teachers, and academic analytics would fit neatly with assessing teachers' and students' results. Supported by investment from software companies, the emphasis would be on end results and analysis of the value added to education, rather than the real-time guidance and support that learning analytics can achieve, as Siemens and Long (2011) reinforce. In this case the analytics would be driven by an academic motive and be of less benefit to the learner. If analytics are to be used in secondary schooling to raise the achievement of learners, a policy framework must be established that supports the use of learning analytics, as opposed to academic analytics. The use must be driven by pedagogy rather than institutions.
Error correction and data override
Ferguson (2012), Pea (2013), and Siemens and Long (2011) all specify that learning analytics must appreciate the ‘softer’ side of the learning process if it is to benefit the learner. ‘Learning is messy’ (Siemens and Long, 2011, p. 8): effective teachers relate to their learners and appreciate the challenges they face, in a professional context as well as in the wider pastoral role. The algorithms used in learning analytics cannot encompass such requirements at this stage, therefore there must be some level of override to avoid errors. Pariser (2011) highlights that we are now being pushed into a “filter bubble”, where the data collected from our online activities actually restricts our horizons. As educators we cannot allow the analysis of data to narrow learning choices and limit progression. There is a danger of seriously limiting the learning journey for younger pupils, with a greater impact on lifelong learning pathways.

Pea (2013) also touches on the risks of stereotyping students with analytics. He establishes that we already have labels for students (ADHD, autism, dyslexia); with the increased amount of analysed data available, will we create yet more boxes to place students in? How do we overcome the stigma attached to becoming a certain type of learner, as determined by learning analytics? Do we allow students, parents and other teachers access to this data, or could that have a detrimental effect on their learning progress? This also relates to the common ethical theme that ran through the artefacts.
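The “override” idea above can be pictured with a minimal sketch. Everything in it is an illustrative assumption of mine (the rule, the field names, the labels), not a description of any real analytics system; the point is simply that the teacher’s judgement sits above the algorithm, not beneath it.

```python
# Hypothetical sketch: an analytics engine suggests a next step from
# scores alone, but a teacher override always takes precedence.
# The threshold and labels are illustrative assumptions.

def analytics_suggestion(avg_score):
    """A crude, purely score-driven rule of the kind that misses
    the 'softer' side of learning."""
    return "remedial" if avg_score < 50 else "extension"

def next_step(avg_score, teacher_override=None):
    """The teacher's judgement, when given, replaces the algorithm."""
    return teacher_override or analytics_suggestion(avg_score)

# The algorithm alone would route this student to remedial work, but
# the teacher knows the low scores reflect a difficult month at home.
print(next_step(45))                                # remedial
print(next_step(45, teacher_override="extension"))  # extension
```

However crude, the sketch shows where the error correction has to live: in a layer the teacher controls, not buried inside the algorithm.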
It is essential at this stage in the development of learning analytics that we proceed cautiously with data in the secondary context. Hopefully we will be able to adapt algorithms successfully from the tertiary level and ensure that the data collected is as accurate as possible. If we could attain the ‘whole person analytics’ that Siemens discusses in his interview with Watters (2011), I would be more confident in the use of learning analytics in schools.
Collection of valuable data
Brown (2011) states that most schools have had an LMS since 2005, but there is a wide range of varying systems and platforms. For us to move forward in the secondary context we must insist on data interoperability, not only between LMSs but also between any other systems (Khan Academy, Knewton) that a student may use to guide learning. As a Google Apps for Education (GAFE) user I know that there are complete districts in the US that share the same GAFE LMS; if the APIs could be made available to enable software developers to create a program to analyse all data from all students, then we would be moving closer to the LMS 3.0 that Brown proposes. As Dawson states in Clarke and Nelson's (2012) article, schools need to buy a complete system which works across all platforms; perhaps this is an area that the Network for Learning (N4L) (N4l.co.nz, 2013) will incorporate in their plan for New Zealand.

As immersive environments like the virtual laboratories mentioned in the NMC Horizon Report (Johnson et al., 2013) and augmented reality (Martin et al., 2011) are used more in education it will become easier to collect data, but it will still be difficult to capture data on the complete learner. Perhaps the introduction of galvanic skin response (GSR) bracelets and webcams to measure emotions could create such data, however this is controversial: when the Gates Foundation funded a project to use GSR to track physiological responses, there was uproar from teachers as well as parents (Kroll, 2012). This type of data may be offered voluntarily by students once wearable technology is more prolific, but it is still difficult to force learners to submit personal data.
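The interoperability argument can be made concrete with a small sketch. Assuming, hypothetically, that each system exported per-student activity as simple records, building a combined learner profile is just a merge keyed on a shared student identifier; the hard part in practice is getting vendors to agree on that shared key and format, which is exactly what open APIs would provide.

```python
# Hypothetical sketch: combining activity exports from two systems
# (say, an LMS and a practice site) into one per-student profile.
# System names, fields and values are illustrative assumptions,
# not a real API or real data.
from collections import defaultdict

lms_events = [
    {"student_id": "s01", "minutes_online": 42},
    {"student_id": "s02", "minutes_online": 15},
]
practice_events = [
    {"student_id": "s01", "exercises_done": 7},
    {"student_id": "s03", "exercises_done": 12},
]

def merge_profiles(*sources):
    """Merge record lists from several systems, keyed on student_id."""
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            sid = record["student_id"]
            for key, value in record.items():
                if key != "student_id":
                    profiles[sid][key] = value
    return dict(profiles)

profiles = merge_profiles(lms_events, practice_events)
# s01 now has data from both systems; s02 and s03 from only one.
```

The merge itself is trivial; the sketch is a reminder that without a shared identifier and export format across systems, even this much is impossible.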
A less intrusive approach, though one that still challenges freedom and privacy rights, is the use of virtual machines on a student's learning device, which would go some way towards creating a profile of the learner (Pardo & Kloos, 2011). An important consideration that intrigues me, especially with teenagers (who we know are very aware of what people think of them), was mentioned by Siemens in his interview with Watters (2011): the Hawthorne effect may also complicate the collection of valuable data.
Ethics, morals and privacy
The general consensus between Siemens and Long (2011), Ferguson (2012) and Dringus (2012) is that data collection and use should be transparent. In a secondary context it would have to be made available to everyone involved in the student's learning. The bureaucracy involved could slow the development of learning analytics; perhaps schools could look to other sectors for guidance, as health care overcame privacy issues to successfully use personal data and predictive modelling from patients to reduce illness and disease (Cortada et al., 2012).

Integration of learning analytics in a secondary context
Learning analytics is in its infancy in the secondary sector. If the predictions in the Horizon Report (Johnson et al., 2013) are correct, we will be looking at over 20% penetration within the next 2-3 years. If this is to happen I believe we need to develop the LMS 3.0 that Brown (2011) emphasises. This needs to incorporate successful data collection techniques and transparent privacy laws, and be applied by the teachers who know their students; it needs to have a metacognitive element for it to work effectively.

“the way forward is not to delve into our toolkit of existing solutions and applying them to problems taking a known shape. We must walk forward with an adaptive mindset—recognizing pattern changes and adjusting as the environment itself adjusts.” Siemens (2006)
The integration must be a collaborative effort between the institutions driving the change, the teachers applying the change, and the learners who are creating the change. We must understand the analytical process and be aware of its limitations and implications when applied to learners. We can then move forward, promoting an informed change in secondary education.
References
Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. [online] Retrieved from: http://www.wired.com/science/discoveries/magazine/16-07/pb_theory [Accessed: 19 Aug 2013].
Brown, M. (2011). Learning analytics: the coming third wave. EDUCAUSE Learning Initiative Brief, 1-4.
Clarke, J. and Nelson, K. (2012). Perspectives on Learning Analytics: Issues and challenges. Observations from Shane Dawson and Phil Long. The International Journal of the First Year in Higher Education, 4 (1), pp. 1-8. Retrieved from: https://fyhejournal.com/article/viewFile/166/173 [Accessed: 28 Jul 2013].
Cortada, J. W., Gordon, D., & Lenihan, B. (2012). The value of analytics in health care. IBM Institute for Business Value IBM, Global Business Service.
Dringus, L. P. (2012). Learning Analytics Considered Harmful. Journal of Asynchronous Learning Networks, 16(3), 87-100.
Education.gov.uk (2013). School teachers' pay and conditions 2013 - The Department for Education. [online] Retrieved from: http://www.education.gov.uk/g00227186/school-teachers'-pay-and-conditions-2013 [Accessed: 1 Sep 2013].
Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Knowledge Media Institute, Technical Report KMI-2012-01.
Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). The NMC Horizon Report: 2013 Higher Education Edition. 1-40.
Kroll, L. (2012). Gates Foundation Responds To GSR Bracelets Controversy. [online] Retrieved from: http://www.forbes.com/sites/luisakroll/2012/06/13/gates-foundation-responds-to-gsr-bracelets-controversy/ [Accessed: 1 Sep 2013].
Martin, S., Diaz, G., Sancristobal, E., Gil, R., Castro, M., & Peire, J. (2011). New technology trends in education: Seven years of forecasts and convergence. Computers & Education, 57(3), 1893-1906.
N4l.co.nz (2013). N4L | About. [online] Retrieved from: http://www.n4l.co.nz/about/ [Accessed: 1 Sep 2013].
Pardo, A., & Kloos, C. D. (2011). Stepping out of the box: Towards analytics outside the learning management system. Proceedings of the 1st International Conference on Learning Analytics and Knowledge. ACM.
Pariser, E. (2011). Beware online "filter bubbles". [video online] Available at: http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.html [Accessed: 1 Sep 2013].
Pea, R. (2013). Emerging opportunities in K-12 learning analytics. [video online] Available at: http://www.youtube.com/watch?v=27UW5DKRpOg [Accessed: 31 Aug 2013].
Siemens, G. (2006). Knowing knowledge. Lulu.com.
Siemens, G. and Long, P. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46 (5), pp. 30-32. Retrieved from: http://www.elmhurst.edu/~richs/EC/OnlineMaterials/SPS102/Teaching%20and%20Learning/Penetrating%20the%20Fog.pdf [Accessed: 24 Jul 2012].
Watters, A. (2011). How data and analytics can improve education. [online] Retrieved July 19, 2013, from http://strata.oreilly.com/2011/07/education-data-analytics-learning.html