Archive for the ‘Learning Analytics’ Category

Double Loop Learning and Learning Analytics

May 4th, 2016 by Graham Attwell

Another in this mini series on Learning Analytics. When looking at work-based learning, Double Loop Learning becomes particularly important. Double-loop learning is used when it is necessary to change the mental model on which a decision depends. Unlike single-loop learning, it involves a shift in understanding, from simple and static to broader and more dynamic, for example taking into account changes in the surroundings and the need to express changes in mental models (Mildeova, S., Vojtko, V., 2003).

Image: double loop learning

To remind readers again, in the EmployID European project we are aiming to support scalable and cost-effective facilitation of professional identity transformation in public employment services. And I would argue such identity transformation is based on reflection on learning, on Double Loop Learning. Identity transformation necessarily involves the development of new mental models and new ways of looking at work-based behaviours and practices.

So where does Learning Analytics fit into this? Learning analytics aims to understand and improve learning and the learning environment. This does not necessarily involve Double Loop Learning. For students, feedback about their present performance may be enough. But if we aim for identity transformation and wish to improve the learning environment, then we need a deeper interpretation of data. This has a number of implications in terms of designing Learning Analytics.

Firstly, we have to have a very clear focus on what the purpose of the Learning Analytics is. Is it to find out more, for example, about informal learning in organisations, or to inform L and D department staff about the learning environment? Is it to help learners understand their interactions with other staff, or to examine their own dispositions for learning – and so on? Secondly – and crucially – how is that data presented to users, be they learners or trainers? The existing paradigm for Learning Analytics presentations appears to be the dashboard. Yet in the LAK16 pre-conference workshops there was a whole series of presentations where presenters invited participants to say what the graphics meant. And often we couldn’t. If LA professionals cannot interpret data visualisations, then a learner has little hope of making their own meanings. I am a little puzzled as to why dashboards have become the norm. And one of my major concerns is that it is often difficult to understand a visualisation out of the context in which the learning exchange has happened. If Double Loop Learning is to happen, then learners need to reflect in order to make meanings. And reflection occurs best, I think, in the context in which it takes place.


Image: Ralph Klamma – http://www.slideshare.net/klamma/technical-challenges-for-realizing-learning-analytics

There are alternatives to the dashboard. For instance, with EmployID we are developing real-time discourse analysis and are also looking at providing dynamic prompts for reflection.

One final point. If we are aiming at using Learning Analytics for Double Loop Learning, we need to find out what works and what does not. That means that any measure for Learning Analytics needs to be accompanied by well-designed evaluation measures. All too often, because LA collects data, it presumes to cover evaluation. Whilst both LA and evaluation may share data, they aim at different things.
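
To make the idea of dynamic reflection prompts mentioned above a little more concrete, here is a minimal sketch in Python. It is purely illustrative, not the EmployID implementation: the event format, the thresholds and the prompt wording are all assumptions.

    from datetime import datetime, timedelta

    # Illustrative activity log: (user, timestamp, event type). In a real system
    # these events would come from the learning platform itself.
    events = [
        ("anna", datetime(2016, 5, 1, 9, 0), "post"),
        ("anna", datetime(2016, 5, 2, 14, 30), "comment"),
        ("anna", datetime(2016, 5, 3, 10, 15), "comment"),
    ]

    def reflection_prompt(user, events, now, quiet_days=7):
        """Offer a prompt if the user has been active in the last 30 days but
        has not written a post for `quiet_days` days (an assumed rule)."""
        recent = [e for e in events if e[0] == user and now - e[1] < timedelta(days=30)]
        last_post = max((e[1] for e in recent if e[2] == "post"), default=None)
        if recent and (last_post is None or now - last_post > timedelta(days=quiet_days)):
            return ("You have been commenting on colleagues' contributions recently. "
                    "What did you take away from those exchanges for your own practice?")
        return None

    print(reflection_prompt("anna", events, now=datetime(2016, 5, 12)))

In a real platform any such rule would of course be tuned to the context of the learning exchange, which is exactly the point made above about reflection working best in context.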

Lack of proxies a problem for Workplace Learning Analytics

May 3rd, 2016 by Graham Attwell
I’ve been spending a lot of time thinking about Learning Analytics lately and this is the first of four or five short posts on the subject. It’s all been kicked off by attending the Society for Learning Analytics Research pre-conference workshops last week – LAK16 – in Edinburgh. Sadly I couldn’t afford the time and money to go to both the workshops and the full conference, but many of the presentations and papers from the conference are already available online.
My interest in Learning Analytics stems from the EmployID project which is aiming to support scalable and cost-effective facilitation of professional identity transformation in public employment services. And in our project application under the EU Research Framework (Horizon 2020) we said we would research and develop Learning Analytics services for staff in Public Employment Services. Easier said than done! An early literature review revealed that despite present high levels of interest (hype?) in Learning Analytics in formal education there has been very little research and development in Workplace Learning Analytics: hence my excitement at a workshop on this subject at LAK16. But sadly, despite the conference selling out with 400 attendees, we only had four papers submitted for the workshop and just 11 attendees. What this did allow was a lot of in-depth discussion, which has left me plenty of issues to think about. And of course one of the issues we discussed was why there is apparently so little interest in Workplace Learning Analytics. It was pointed out that there have been a number of work-oriented presentations at previous LAK conferences but these had remained isolated with no real follow-up and with no overall community emerging.
There was also a general feeling that the Learning Analytics community was weak in terms of learning theory and pedagogy, both of which were considered central to Workplace Learning Analytics. But perhaps most importantly, Learning Analytics approaches in schools and Higher Education lean heavily on proxies for learning, for instance examination results and grades. With the lack of such proxies for learning in the workplace, Learning Analytics has to focus on real learning – usually in the absence of a Learning Management System. And that is simply very hard to design and develop. Yet having said that, most if not all of us in the workshop were convinced that the real future of Learning Analytics is in the workplace, with a focus on understanding learning, including informal learning, and improving both learning and the environment in which it occurs.
We agreed on some modest next steps and will be launching a LinkedIn Group in the near future. In the meantime the papers and presentation from the workshop can be found at http://learning-layers.eu/laforwork/.

Workplace Learning Analytics for Facilitation in European Public Employment Services

April 29th, 2016 by Graham Attwell

This week I have been at the pre-conference workshops for the Learning Analytics conference in Edinburgh. This is my presentation at the workshop on Workplace Learning Analytics. And below is the abstract of my paper together with a link to download the full paper, should you wish. In the next few days, I will write up a reflection on the workshops, plus some new ideas that emerged from talking with participants.
Abstract

The paper is based on early research and practices in developing workplace Learning Analytics for the EU-funded EmployID project, focused on identity transformation and continuing professional development in Public Employment Services (PES) in Europe. Workplace learning is mostly informal, with little agreement on proxies for learning; it is driven by the demands of work tasks or the intrinsic interests of the learner, and by self-directed exploration and social exchange that is tightly connected to the processes and places of work. Rather than focusing on formal learning, LA in PES needs to be based on individual and collective social practices and on informal learning and facilitation processes rather than formal education. Furthermore, there are considerable concerns and restraints over the use of data in PES, including data privacy, power relations and hierarchies.

Following a consultation process about what innovations PES would like to pilot and what best meets their needs, PES defined priorities for competence advancement around the ‘resourceful learner’, self-reflection and self-efficacy as core competences for their professional identity transformation. The paper describes an approach based on Social Learning Analytics linked to the activities of the EmployID project in developing social learning including advanced coaching, reflection, networking and learning support services. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. The final section of the paper reports on work in progress to build a series of tools to embed SLA within communities and practices in PES organisations.

Download the paper (PDF)

Workplace Learning Analytics for Facilitation in European Public Employment Services

February 10th, 2016 by Graham Attwell

Along with colleagues from the EmployID project, I’ve submitted a paper to the workshop on Learning Analytics for Workplace and Professional Learning (LA for Work) at the Learning Analytics and Knowledge Conference (LAK 2016) in April. Below is the text of the paper. (NB If you are interested, the organisers are still accepting submissions for the workshop.)

ABSTRACT

In this paper, we examine issues in introducing Learning Analytics (LA) in the workplace. We describe the aims of the European EmployID research project, which aims to use technology to facilitate identity transformation and continuing professional development in European Public Employment Services. We describe the pedagogic approach adopted by the project based on social learning in practice, and relate this to the concept of Social Learning Analytics. We outline a series of research questions the project is seeking to explore and explain how these research questions are driving the development of tools for collecting social LA data. At the same time as providing research data, these tools have been developed to provide feedback to participants on their workplace learning.

1. LEARNING ANALYTICS AND WORK BASED LEARNING

Learning Analytics (LA) has been defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” [1]. It can assist in informing decisions in education systems, promote personalized learning and enable adaptive pedagogies and practices [2].

However, whilst there has been considerable research and development in LA in the formal school and higher education sectors, much less attention has been paid to the potential of LA for understanding and improving learning in the workplace. There are a number of possible reasons for this.

Universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. The use of VLEs in the workplace is limited, and “collecting traces that learners leave behind” [3] may fail to take cognizance of the multiple modes of formal and informal learning in the workplace and the importance of key indicators such as collaboration. Once more, key areas such as collaboration tend to be omitted and, in focusing on VLEs, many of the different modes of learning fail to be included. Ferguson [4] says that in LA implementation in formal education “LA is aligned with clear aims and there are agreed proxies for learning.” Critically, much workplace learning is informal, with little agreement on proxies for learning. While Learning Analytics in educational settings very often follow a particular pedagogical design, workplace learning is much more driven by demands of work tasks or intrinsic interests of the learner, by self-directed exploration and social exchange that is tightly connected to processes and the places of work [5]. Learning interactions at the workplace are to a large extent informal and practice based and not embedded into a specific and measurable pedagogical scenario.

Pardo and Siemens [6] point out that “LA is a moral practice and needs to focus on understanding instead of measuring.” In this understanding “learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.” This poses particular issues within the workplace with complex social and work structures, hierarchies and power relations.

Despite these difficulties, workplace learners can potentially benefit from being exposed to their own and others’ learning processes and outcomes, as this potentially allows for better awareness and tracing of learning, sharing experiences, and scaling informal learning practices [5]. LA can, for instance, allow trainers and L & D professionals to assess the usefulness of learning materials, increase their understanding of the workplace learning environment in order to improve it, and intervene to advise and assist learners. Perhaps more importantly, it can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes and help them in reflecting on their learning.

There have been a number of early attempts to utilise LA in the workplace. Maarten de Laat [7] has developed a system based on Social Network Analysis to show patterns of learning and the impact of informal learning in Communities of Practice for Continuing Professional Development for teachers.

There is a growing interest in the use of MOOCs for professional development and workplace learning. Most (if not all) of the major MOOC platforms have some form of Learning Analytics built in providing both feedback to MOOC designers and to learners about their progress. Given that MOOCs are relatively new and are still rapidly evolving, MOOC developers are keen to use LA as a means of improving MOOC programmes.  Research and development approaches into linking Learning Design with Learning Analytics for developing MOOCs undertaken by Conole [8] and Ferguson [9] amongst others have drawn attention to the importance of pedagogy for LA.

Similarly, there are a number of research and development projects around recommender systems and adaptive learning environments. LA is seen as having strong relations to recommender systems [10], adaptive learning environments and intelligent tutoring systems [11], and the goals of these research areas. Apart from the idea of using LA for automated customisation and adaptation, feeding back LA results to learners and teachers to foster reflection on learning can support self-regulated learning [12]. In the workplace sphere, LA could be used to support the reflective practice of both trainers and learners: “taking into account aspects like sentiment, affect, or motivation in LA, for example by exploiting novel multimodal approaches may provide a deeper understanding of learning experiences and the possibility to provide educational interventions in emotionally supportive ways.” [13].

One potential barrier to the use of LA in the workplace is limited data. However, although obviously smaller data sets place limitations on statistical processes, MacNeill [14] stresses the importance of fast data, actionable data, relevant data and smart data, rather than big data. LA, she says, should start from research questions that arise from teaching practice, as opposed to the traditional approach of starting analytics based on already collected and available data. Gasevic, Dawson and Siemens [15]  also draw attention to the importance of information seeking being framed within “robust theoretical models of human behavior” [16]. In the context of workplace learning this implies a focus on individual and collective social practices and to informal learning and facilitation processes rather than formal education. The next section of this paper looks at social learning in Public Employment Services and how this can be linked to an approach to workplace LA.

2. EMPLOYID: ASSISTING IDENTITY TRANSFORMATION THROUGH SOCIAL LEARNING IN EUROPEAN EMPLOYMENT SERVICES

The European EmployID research project aims to support and facilitate the learning process of Public Employment Services (PES) practitioners in their professional identity transformation process. The aims of the project are born out of a recognition that, to perform successfully in their jobs, practitioners need to acquire a set of new and transversal skills, develop additional competencies, and embed a professional culture of continuous improvement. However, it is unlikely that training programmes will be able to provide sufficient opportunities for all staff in public employment services, particularly in a period of rapid change in the nature and delivery of such services and of intense pressure on public expenditure. Therefore, the EmployID project aims to promote, develop and support the efficient use of technologies to provide advanced coaching, reflection and networking services through social learning. The idea of social learning is that people learn through observing others’ behaviour, attitudes and the outcomes of those behaviours: “Most human behaviour is learned observationally through modelling: from observing others, one forms an idea of how new behaviours are performed, and on later occasions this coded information serves as a guide for action” [17]. Facilitation is seen as playing a key role in structuring learning and identity transformation activities and in supporting networking in personal networks, teams and organisational networks, as well as cross-organisational dialogue.

Social Learning initiatives developed jointly between the EmployID project and PES organisations include the use of MOOCs, access to Labour Market information, the development of a platform to support the emergence of communities of practice and tools to support reflection in practice.

Alongside such a pedagogic approach to social learning based on practice, the project is developing a strategy and tools based on Social Learning Analytics. Ferguson and Buckingham Shum [18] say that Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. “As groups engage in joint activities, their success is related to a combination of individual knowledge and skills, environment, use of tools, and ability to work together. Understanding learning in these settings requires us to pay attention to group processes of knowledge construction – how sets of people learn together using tools in different settings. The focus must be not only on learners, but also on their tools and contexts.”

Viewing learning analytics from a social perspective highlights types of analytics that can be employed to make sense of learner activity in a social setting. They go on to introduce five categories of analytic whose foci are driven by the implications of the ways in which we are using social technology for learning [18]. These include social network analytics, focusing on interpersonal relations in social platforms; discourse analytics, predicated on the use of language as a tool for knowledge negotiation and construction; content analytics, particularly looking at user-generated content; disposition analytics, which holds that intrinsic motivation to learn is a defining feature of online social media and lies at the heart of engaged learning and innovation; and context analytics, concerned with the contexts in which learning takes place.

The approach to Social Learning Analytics links to the core aims of the EmployID project to support and facilitate the learning process of PES practitioners in their professional identity development by the efficient use of technologies to provide social learning including advanced coaching, reflection, networking and learning support services. The project focuses on technological developments that make facilitation services for professional identity transformation cost-effective and sustainable by empowering individuals and organisations to engage in transformative practices, using a variety of learning and facilitation processes.

3. LEARNING ANALYTICS AND EMPLOYID – WHAT ARE WE TRYING TO FIND OUT?

Clearly there are close links between the development of Learning Analytics and our approach to evaluation within EmployID. In order to design evaluation activities, the project has developed a number of overarching research questions around professional development and identity transformation within Public Employment Services. One of these research questions is focused on LA: which forms of workplace learning analytics can we apply in PES and how do they impact the learner? How can learning analytics contribute to evaluating learning interventions? Others focus on the learning environment and the use of tools for reflection, coaching and creativity, as well as the role of the wider environment in facilitating professional identity transformation. A third focus is how practitioners can better manage their own learning and gain the necessary skills (e.g. self-directed learning skills, career adaptability skills, transversal skills etc.) to support identity transformation processes, as well as facilitating the learning of others, linking individual, community and organizational learning.

These research questions also provide a high-level framework for the development of Learning Analytics, embedded within the project activities and tools. And indeed much of the data collected for evaluation purposes can also inform Learning Analytics, and vice versa. However, whilst the main aim of the evaluation work is to measure the impact of the EmployID project and to provide useful formative feedback for the development of the project’s tools and overarching blended learning approach, the focus of Learning Analytics is on understanding and optimizing learning and the environments in which it occurs.

4. FROM A THEORETICAL APPROACH TO DEVELOPING TOOLS FOR LA IN PUBLIC EMPLOYMENT SERVICES

Whilst the more practical work is in an initial phase, linked to the roll-out of tools and platforms to support learning, a number of tools are under development and will be tested in 2016. Since this work is placed in the particular working environment of public administration, the initial contextual exploration led to a series of design considerations for the suggested LA approaches presented below. Access to fast, actionable, relevant and smart data is, most importantly, regulated by strict data protection and privacy requirements, which are crucial and clearly play a critical role in any workplace LA. As mentioned above, power relations and hierarchies come into play, and the full transparency to which LA aspires might either be hindered by existing structures or raise expectations that are not met by existing organisational structures and processes. If efficient learning at the workplace becomes transparent and visible through intelligent LA, what does this mean with regard to career development and promotion? Who has access to the data, how is it linked to existing appraisal systems, or is it perceived as sufficient to use the analytics for individual reflection only? For the following LA tools a trade-off needs to be negotiated, and their practicality in a workplace setting can only be assessed once they are fully implemented. Clear rules about who has access to the insights gained from LA have to be defined. The current approach in EmployID is thus to focus on the individual learner as the main recipient of LA.
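
As an illustration of that last design decision, the sketch below (Python, illustrative only) shows analytics results being scoped so that a learner sees only her own record, while another role sees at most anonymised aggregates. The role names and the data shape are assumptions, not the EmployID access model.

    # Minimal sketch of scoping analytics results so that the individual learner
    # is the main recipient. Role names and the shape of the results are assumptions.
    analytics_results = {
        "anna": {"posts": 12, "comments": 30, "reflection_score": 0.7},
        "ben":  {"posts": 3,  "comments": 5,  "reflection_score": 0.4},
    }

    def visible_results(requesting_user, role, results):
        """Learners see only their own record; an (assumed) evaluator role sees
        anonymised aggregates only; anything else is refused by default."""
        if role == "learner":
            return {requesting_user: results.get(requesting_user, {})}
        if role == "evaluator":
            n = len(results)
            return {"group_average_posts": sum(r["posts"] for r in results.values()) / n}
        raise PermissionError("No access rule defined for role: %s" % role)

    print(visible_results("anna", "learner", analytics_results))
    print(visible_results("erika", "evaluator", analytics_results))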

4.1 Self-assessment questionnaire

The project has developed a self-assessment questionnaire as an instrument to collect data from EmployID interventions in different PES organisations to support reflection on personal development. It contains a core set of questions for cross-case evaluation and LA on a project level as well as intervention-specific questions that can be selected to fit the context. The self-assessment approach should provide evidence for the impact of EmployID interventions whilst addressing the EmployID research questions, e.g. the effectiveness of a learning environment in the specific workplace context. Questions are related to professional identity transformation, including individual, emotional, relational and practical development. For the individual learner, the questionnaire aims to foster their self-reflection process. It supports them in observing their ‘distance travelled’ in key aspects of their professional identity development. Whilst using EmployID platforms and tools, participants will be invited to fill in the questionnaire upon registration and then at periodic intervals. The questions and the way they are presented are adapted to the respective tool or platform, such as social learning programmes, the reflective community, or peer coaching.

The individual results and distance travelled over the different time points will be visualised and presented to individual participants in the form of development curves based on summary categories to stimulate self-reflection on learning. These development curves show the individual learners’ changes in their attitudes and behaviour related to learning and adaptation in the job, the facilitation of colleagues and clients, as well as the personal development related to reflexivity, stress management and emotional awareness.
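
A minimal sketch of how such development curves could be produced from repeated questionnaire responses is shown below (Python with matplotlib; the summary categories, the 1-5 scale and the three time points are illustrative assumptions, not the EmployID instrument).

    import matplotlib.pyplot as plt  # any charting library would do

    # Illustrative responses on an assumed 1-5 scale, grouped into assumed summary
    # categories, collected at registration (t0) and two later time points.
    responses = {
        "self-efficacy":       [2.8, 3.2, 3.6],
        "reflexivity":         [3.0, 3.1, 3.5],
        "emotional awareness": [2.5, 2.9, 3.0],
    }
    time_points = ["t0 (registration)", "t1", "t2"]

    # One development curve per summary category, showing 'distance travelled'.
    for category, scores in responses.items():
        plt.plot(time_points, scores, marker="o", label=category)

    plt.ylabel("mean score (1-5)")
    plt.title("Distance travelled per summary category (illustrative)")
    plt.legend()
    plt.show()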

4.2 Learning Analytics and Reflective Communities

The EmployID project is launching a platform to support the development of a Reflective Community in the Slovenian PES in February 2016. The platform is based on the WordPress content management system, and the project has developed a number of plugins to support social learning analytics and reflection analytics. The data from these plugins can serve as the basis for a dashboard for learners, providing visualisations of different metrics.

4.2.1 Network Maps

This plugin visualizes user interactions in social networks including individual contacts, activities, and topics. The data is visualised through a series of maps and is localised through different offices within the PES. The interface shows how interaction with other users has changed during the last 30 days. This way users can visually “see” how often they interact with others and possibly find other users with whom they wish to interact.

The view can be filtered by different job roles and is designed to help users find topics they may be interested in.
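
The sketch below illustrates the kind of calculation that could sit behind such a map: interactions from the last 30 days are turned into a weighted graph, optionally filtered, here using the networkx library in Python. The plugin itself is built on WordPress, so this is an analogy of the logic rather than the actual implementation, and the event format and office filter are assumptions.

    from datetime import datetime, timedelta
    import networkx as nx  # a widely used Python graph library

    # Illustrative interaction records: who replied to whom, when, and in which office.
    interactions = [
        {"from": "anna",  "to": "ben",  "when": datetime(2016, 2, 10), "office": "Ljubljana"},
        {"from": "ben",   "to": "anna", "when": datetime(2016, 2, 12), "office": "Ljubljana"},
        {"from": "ciril", "to": "anna", "when": datetime(2016, 1, 5),  "office": "Maribor"},
    ]

    def interaction_graph(interactions, now, days=30, office=None):
        """Build a weighted graph of interactions in the last `days` days,
        optionally filtered by office (mirroring the role filter mentioned above)."""
        g = nx.Graph()
        cutoff = now - timedelta(days=days)
        for i in interactions:
            if i["when"] < cutoff or (office and i["office"] != office):
                continue
            u, v = i["from"], i["to"]
            if g.has_edge(u, v):
                g[u][v]["weight"] += 1
            else:
                g.add_edge(u, v, weight=1)
        return g

    g = interaction_graph(interactions, now=datetime(2016, 2, 20))
    print(list(g.edges(data=True)))  # e.g. [('anna', 'ben', {'weight': 2})]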

4.2.2 Karma Points

The Karma Points plugin allows users to give each other ‘karma points’ and ‘reputation points’. It is based on rankings both of posts and of authors. Karma points are temporary: the points available to give expire after a week and are refreshed each week, so users can only donate karma points to a few selected posts in any week. The user who receives a karma point has the point added to her permanent reputation points.
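
A minimal sketch of this scoring logic, as described above, might look like the following (Python; the size of the weekly allowance is an assumption).

    from collections import defaultdict

    class KarmaLedger:
        """Karma to give away is limited per week and does not carry over;
        karma received becomes permanent reputation."""

        WEEKLY_ALLOWANCE = 3  # assumed limit: "a few selected posts in each week"

        def __init__(self):
            self.reputation = defaultdict(int)       # permanent, per user
            self.given_this_week = defaultdict(int)  # temporary, reset weekly

        def give_karma(self, giver, post_author):
            if self.given_this_week[giver] >= self.WEEKLY_ALLOWANCE:
                raise ValueError("Weekly karma allowance used up")
            self.given_this_week[giver] += 1
            self.reputation[post_author] += 1        # the point becomes permanent

        def start_new_week(self):
            self.given_this_week.clear()             # allowance refreshed

    ledger = KarmaLedger()
    ledger.give_karma("anna", "ben")
    ledger.start_new_week()
    print(ledger.reputation["ben"])  # 1 - reputation persists across weeks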

4.2.3 Reflection Analytics

The Reflection Analytics plugin collects platform usage data and shows it in an actionable way to users. The purpose of this is to show people information in order to let them reflect on their behaviour on the platform and then, possibly, to give them enough information to show them how they could learn more effectively. The plugin will use a number of different charts, each wrapped in a widget in order to retain customizability.

One chart being considered would visualise the user’s interaction in the current month in terms of how many posts she wrote, how many topics she commented on and how many topics she read, compared to the average of the group. This way, users can easily identify whether they are writing a similar number of posts and topics to their colleagues. It shows change over time and provides suggestions for new activities. However, we also recognise that comparisons with group averages can be demotivating for some people.
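
The underlying comparison is straightforward; a sketch of the calculation is given below (Python, with an assumed data shape derived from platform usage data).

    # Illustrative monthly activity counts per user; in the plugin these would be
    # derived from platform usage data.
    activity = {
        "anna":  {"posts": 4, "comments": 10, "reads": 40},
        "ben":   {"posts": 1, "comments": 3,  "reads": 15},
        "ciril": {"posts": 2, "comments": 6,  "reads": 25},
    }

    def compare_to_group(user, activity):
        """Return the user's counts alongside the group average for each metric."""
        metrics = activity[user].keys()
        group_avg = {m: sum(a[m] for a in activity.values()) / len(activity) for m in metrics}
        return {m: {"you": activity[user][m], "group average": round(group_avg[m], 1)}
                for m in metrics}

    print(compare_to_group("ben", activity))
    # {'posts': {'you': 1, 'group average': 2.3}, ...}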

4.3 Content Coding and Analysis

The analysis of comments and content shared within the EmployID tools can provide data addressing a number of the research questions outlined above.

A first trial of content coding, used to analyse contributions to a pilot MOOC held in early 2015 on the FutureLearn platform, resulted in rich insights about aspects of identity transformation and learning from and with others. The codes for this analysis were created inductively based on [19] and then analysed according to success factors for identity transformation. Given that identity transformation in PES organisations is a new field of research, we expect new categories to evolve over time.

In addition to the inductive coding, the EmployID project will apply deductive analysis to investigate reflection in the content of the Reflective Community platform, following a fixed coding scheme for reflection [20].

Similar to the coding approach applied for reflective actions, we are currently working on a new coding scheme for learning facilitation in EmployID. Based on existing models of facilitation (e.g. [21]) and facilitation requirements identified within the PES organisations, a fixed coding scheme will be developed and applied for the first time to the analysis of content shared in the Reflective Community platform.

An important future aspect of content coding is to go one step further and explore innovative methodological approaches, trialling a machine learning approach based on (semi-)automatic detection of reflection and facilitation in text. This semi-automatic content analysis is a prerequisite for reflecting analysis back to learners as part of learning analytics, as it allows the analysis of large amounts of shared content, in different languages, and not only ex post but continually in real time.
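
As an indication of what such (semi-)automatic detection could look like, the sketch below trains a simple supervised text classifier with scikit-learn. The handful of labelled examples are invented for illustration; a real model would be trained on content coded with the reflection scheme [20] and would need evaluation for each language and context.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Tiny invented training set: texts hand-coded as reflective (1) or not (0).
    texts = [
        "Looking back at the session, I realise I should ask clients more open questions.",
        "Next time I will prepare differently because the interview did not go as planned.",
        "The office opens at 8am on Mondays.",
        "Please upload the monthly report to the shared folder.",
    ]
    labels = [1, 1, 0, 0]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    new_post = "On reflection, I handled that difficult conversation better than last month."
    print(model.predict([new_post]))        # predicted code (reflective or not)
    print(model.predict_proba([new_post]))  # confidence, useful for semi-automatic review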

4.4 Dynamic Social Network Analysis

Conceptual work currently being undertaken aims to bring together Social Network Analysis and Content Analysis in an evolving environment in order to analyse the changing nature of, and discontinuities in, knowledge development and usage over time. Such a perspective would not only enable a greater understanding of knowledge development and maturing within communities of practice and other collaborative learning teams, but would also allow further development and improvement of the (online) working and learning environment.

The methodology is based on various Machine Learning approaches including content analysis, classification and clustering [22], and statistical modelling of graphs and networks with a main focus on sequential and temporal non-stationary environments [23].

The aim is to illustrate changes and discontinuities at the level of social network connectivity and the content of communications in a knowledge maturing process, which is “based on the assumption that learning is an inherently social and collaborative activity in which individual learning processes are interdependent and dynamically interlinked with each other: the output of one learning process is input to the next. If we have a look at this phenomenon from a distance, we can observe a knowledge flow across different interlinked individual learning processes. Knowledge becomes less contextualized, more explicitly linked, easier to communicate, in short: it matures.” [24]
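
By way of illustration, the sketch below slices timestamped interactions into monthly network snapshots and compares a simple structural measure (density) across windows. This is a deliberately crude stand-in for the statistical change-detection and content-analysis methods cited above [22, 23], and the event data is invented.

    from datetime import datetime
    import networkx as nx

    # Illustrative timestamped interactions (who exchanged with whom, and when).
    events = [
        ("anna", "ben",   datetime(2016, 1, 5)),
        ("anna", "ciril", datetime(2016, 1, 20)),
        ("ben",  "ciril", datetime(2016, 2, 2)),
        ("anna", "ben",   datetime(2016, 2, 10)),
        ("dana", "ben",   datetime(2016, 2, 15)),
    ]

    def monthly_snapshots(events):
        """Group interactions into one graph per calendar month."""
        graphs = {}
        for u, v, t in events:
            graphs.setdefault((t.year, t.month), nx.Graph()).add_edge(u, v)
        return graphs

    def density_per_window(graphs):
        """Network density per window; a sharp change between consecutive
        windows is a crude signal of discontinuity."""
        return {window: round(nx.density(g), 2) for window, g in sorted(graphs.items())}

    print(density_per_window(monthly_snapshots(events)))
    # e.g. {(2016, 1): 0.67, (2016, 2): 0.5}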

5. NEXT STEPS

In this paper we have examined current approaches to Learning Analytics and have considered some of the issues in developing approaches to LA for workplace learning, notably that learning interactions at the workplace are to a large extent informal and practice based and not embedded into a specific and measurable pedagogical scenario. Despite that, we foresee considerable benefits through developing Workplace Learning Analytics in terms of better awareness and tracing of learning, sharing experiences, and scaling informal learning practices.

We have outlined a pedagogic approach to learning in European Public Employment Services based on social learning and have outlined a parallel approach to LA based on Social Learning Analytics. We have described a number of different tools for workplace Learning Analytics aiming at providing data to assist answering a series of research questions developed through the EmployID project. At the same time as providing research data, these tools have been developed to provide feedback to participants on their workplace learning.

The tools are at various stages of development. In the next phase of development, during 2016, we will implement and evaluate the use of these tools, whilst continuing to develop our conceptual approach to Workplace Learning Analytics.

One essential part of this conceptual approach is that supporting the learning of individuals with learning analytics is not just a question, for designers of learning solutions, of how to present dashboards, visualizations and other forms of data representation. The biggest challenge of workplace learning analytics (but also of learning analytics in general) is to support learners in making sense of the data analysis:

  1. What does an indicator or a visualization tell about how to improve learning?
  2. What are the limitations of such indicators?
  3. How can we move more towards evidence-based interventions?

And this is not just an individual task; it requires collaborative reflection and learning processes. The knowledge of how to use learning analytics results for improving learning also needs to evolve through a knowledge maturing process. This corresponds to Argyris and Schön’s double loop learning [25]. Otherwise, if learning analytics is perceived as a top-down approach pushed towards the learner, it will suffer from the same problems as performance management. Such pre-defined indicators (through their selection, computation, and visualization) implement a certain preconception which is not evaluated on a continuous basis by those involved in the process. Misinterpretations and a misled confidence in numbers can disempower learners and lead to an overall rejection of analytics-driven approaches.

ACKNOWLEDGEMENTS

EmployID – “Scalable & cost-effective facilitation of professional identity transformation in public employment services” – is a research project supported by the European Commission under the 7th Framework Program (project no. 619619).

REFERENCES

[1] SoLAR (2011). Open Learning Analytics: An Integrated & Modularized Platform. White Paper. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf

[2] Johnson, L. Adams Becker, S., Estrada, V., Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, Texas: The New Media Consortium

[3] Duval E. (2012) Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012,  https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/

[4] Ferguson, R. (2012) Learning analytics: drivers, developments and challenges. In: International Journal of Technology Enhanced Learning, 4(5/6), 2012, pp. 304-317.

[5] Ley T. Lindstaedt S., Klamma R. and Wild S. (2015) Learning Analytics for Workplace and Professional Learning, http://learning-layers.eu/laforwork/

[6] Pardo A. and Siemens G. (2014) Ethical and privacy principles for learning analytics in British Journal of Educational Technology Volume 45, Issue 3, pages 438–450, May 2014

[7] de Laat, M. & Schreurs, B. (2013) Visualizing Informal Professional Development Networks: Building a Case for Learning Analytics in the Workplace. In American Behavioral Scientist http://abs.sagepub.com/content/early/2013/03/11/0002764213479364.abstract

[8] Conole, G. (2014) The implications of open practice, presentation, Slideshare, http://www.slideshare.net/GrainneConole/conole-hea-seminar

[9] Ferguson, R. (2015) Learning Design and Learning Analytics, presentation, Slideshare http://www.slideshare.net/R3beccaF/learning-design-and-learning-analytics-50180031

[10] Adomavicius, G. and Tuzhilin, A. (2005) Toward the Next Generation of Recommender Systems: A Survey of the State-of-the-Art and Possible Extensions. IEEE Transactions on Knowledge and Data Engineering, 17, 734-749. http://dx.doi.org/10.1109/TKDE.2005.99

[11] Brusilovsky, P. and Peylo, C. (2003) Adaptive and intelligent Web-based educational systems. In P. Brusilovsky and C. Peylo (eds.), International Journal of Artificial Intelligence in Education 13 (2-4), Special Issue on Adaptive and Intelligent Web-based Educational Systems, 159-172.

[12] Zimmerman B. J, (2002) Becoming a self-regulated learner: An overview, in Theory into Practice, Volume: 41 Issue: 2 Pages: 64-70

[13] Bahreini, K., Nadolski, R. & Westera, W. (2014) Towards multimodal emotion recognition in e-learning environments, Interactive Learning Environments, Routledge, http://www.tandfonline.com/doi/abs/10.1080/10494820.2014.908927

[14] MacNeill, S. (2015) The sound of learning analytics, presentation, Slideshare, http://www.slideshare.net/sheilamac/the-sound-of-learning-analytics

[15] Gašević, D., Dawson, S., Siemens, G. (2015) Let’s not forget: Learning Analytics are about learning. TechTrends

[16] Wilson, T. D. (1999). Models in information behaviour research. Journal of Documentation, 55 (3), pp 249-70

[17] Bandura, A. (1977). Social Learning Theory. Englewood Cliffs, NJ: Prentice Hall.

[18] Buckingham Shum, S., & Ferguson, R. (2012). Social Learning Analytics. Educational Technology & Society, 15 (3), 3–26

[19] Mayring, P. (2000). Qualitative Content Analysis. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 1(2). Retrieved from http://nbn-resolving.de/urn:nbn:de:0114-fqs0002204

[20] Prilla M, Nolte A, Blunk O, et al (2015) Analyzing Collaborative Reflection Support: A Content Analysis Approach. In: Proceedings of the European Conference on Computer Supported Cooperative Work (ECSCW 2015).   

[21] Hyland, N., Grant, J. M., Craig, A. C., Hudon, M., & Nethery, C. (2012). Exploring Facilitation Stages and Facilitator Actions in an Online/Blended Community of Practice of Elementary Teachers: Reflections on Practice (ROP) Anne Rodrigue Elementary Teachers Federation of Ontario. Copyright© 2012 Shirley Van Nuland and Jim Greenlaw, 71.   

[22] Yeung, K. Y. and Ruzzo, W. L. (2000). An empirical study on principal component analysis for clustering gene expression data. Technical report, Department of Computer Science and Engineering, University of Washington. http://bio.cs.washington.edu/supplements/kayee/pca.pdf

[23] McCulloh, I. and Carley, K. M. (2008). Social Network Change Detection. Institute for Software Research, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213. CMU-CS-08-116.

http://www.casos.cs.cmu.edu/publications/papers/CMU-CS-08-116.pdf

[24] R. Maier, A. Schmidt. Characterizing Knowledge Maturing: A Conceptual Process Model for Integrating E-Learning and Knowledge Management In: Gronau, Norbert (eds.): 4th Conference Professional Knowledge Management – Experiences and Visions (WM ’07), Potsdam, GITO, 2007, pp. 325-334.

http://knowledge-maturing.com/concept/knowledge-maturing-phase-model

[25] Argyris, C. and Schön, D. (1978). Organizational Learning: A Theory of Action Perspective. Reading, MA: Addison-Wesley.

The challenge for Learning Analytics: Sense making

January 28th, 2016 by Graham Attwell

https://sylviamoessinger.files.wordpress.com/2012/06/learninganalytics_chalkboard.jpg

Image: Educause

It’s true, Twitter can be a distraction. But it is an unparalleled resource for new ideas and for learning about things you didn’t know you wanted to learn about. This morning my attention was drawn by a tweet linking to an interview in Times Higher Education with Todd Rose entitled “taking on the ‘averagarians’.” Todd Rose believes that “more sophisticated examples of “averagarian” fallacies – making decisions about individuals on the basis of what an idealised average person would do – are causing havoc all round.” The article suggests that this applies to higher education, giving the example that “Universities assume that an average student should learn a certain amount of information in a certain amount of time. Those who are much quicker than average on 95 per cent of their modules and slower than average on 5 per cent may struggle to get a degree.”

It seems to me that this is one of the problems with Data Analytics. It may or may not matter that an individual is doing better or worse than the average in a class, or that they spend more or less time reading or, even worse, logged on to the campus VLE. It’s not that this data isn’t potentially useful, but the question is what sense to make of it. I’m currently editing a paper for submission to the workshop on Learning Analytics for Workplace and Professional Learning (LA for Work) at the Learning Analytics and Knowledge Conference (LAK 2016) in April (I will post a copy of the paper here on Sunday). And my colleague Andreas Schmidt has contributed what I think is an important paragraph:

Supporting the learning of individuals with learning analytics is not just a question, for designers of learning solutions, of how to present dashboards, visualizations and other forms of data representation. The biggest challenge of workplace learning analytics (but also of learning analytics in general) is to support learners in making sense of the data analysis:

  • What does an indicator or a visualization tell about how to improve learning?
  • What are the limitations of such indicators?
  • How can we move more towards evidence-based interventions?

And this is not just an individual task; it requires collaborative reflection and learning processes. The knowledge of how to use learning analytics results for improving learning also needs to evolve through a knowledge maturing process. This corresponds to Argyris and Schön’s double loop learning. Otherwise, if learning analytics is perceived as a top-down approach pushed towards the learner, it will suffer from the same problems as performance management. Such pre-defined indicators (through their selection, computation, and visualization) implement a certain preconception which is not evaluated on a continuous basis by those involved in the process. Misinterpretations and a misled confidence in numbers can disempower learners and lead to an overall rejection of analytics-driven approaches.

Learning Analytics for Workplace and Professional Learning

December 10th, 2015 by Graham Attwell

Whilst there has been much interest from the Technology Enhanced Learning research and development community in Learning Analytics (LA), most of the focus has been on formal learning in educational institutions. There has been relatively little written about work-based learning and continuing professional development, let alone informal learning. And what evidence there is suggests that Learning Analytics applied to informal and work-based learning may require a significantly different approach to the emerging mainstream LA for schools and universities. Amongst other issues, data sets may be significantly smaller, learners are less concerned about completing learning in a set time against well-defined assessment metrics, and they may have significant concerns about data privacy. Perhaps most significantly, the questions we are trying to answer about learning in the workplace may not be the same as within an educational institution.

For this reason it is very good to see the organisation of a Learning Analytics for Workplace and Professional Learning (LA for Work) workshop, co-located with the Learning Analytics and Knowledge Conference (LAK 2016) at the University of Edinburgh, Edinburgh, UK, being held in April 2016. Below is the Call for Papers.

Learning Analytics have been striving in the past years for all types of educational settings. However, analytics for workplace learning has been much less in the focus of the learning analytics community. While Learning Analytics in educational settings very often follow a particular pedagogical design, workplace learning is much more driven by demands of work tasks or intrinsic interests of the learner, by self-directed exploration and social exchange that is tightly connected to processes and the places of work. Hence, learning interactions at the workplace are to a large extent informal and not embedded into a pedagogical scenario. At the same time, workplace learners can benefit from being exposed to their own and others’ learning processes and outcomes as this potentially allows for better awareness and tracing of learning, sharing experiences, and scaling informal learning practices.

Recently, several different approaches to Learning Analytics in the workplace have been suggested. Some of these have been coming from the tradition of adaptive learning systems or self-directed learning environments for workplace learning or lifelong learning, some from learning in professional communities. Recently, the topic of performance analytics or analytics in smart industries has extended the focus to more traditional work settings. New research challenges also abound in workplace scenarios, such as the introduction of new technologies (augmented interfaces, large scale collaboration platforms), or the new challenges that derive from the need to make informal learning processes better traceable and recognizable.

We consider that workplace learning scenarios can benefit from existing research in education-based Learning Analytics approaches and technologies. At the same time, we are convinced that the community would benefit from a closer exchange around the specificities of workplace learning, such as the unconstrained and less plannable learning processes, the challenge to integrate learning systems in work practices, or a methodological focus on design-oriented research approaches with smaller samples in real life settings. At the same time, we think that researchers in the educational domain can benefit from this workshop at LAK, as the clear boundaries between formal and informal learning are increasingly vanishing, and a focus on lifelong learning is increasingly being established. For this reason, the LA for Work workshop aims at providing a forum for researchers and practitioners who are making innovative use of analytics at the workplace, and for those who have an interest in exploring analytics in more informal learning settings.

Objectives and Topics

The objective of this workshop is to provide a forum for researchers in the area of learning analytics who specifically address learning at the workplace or in professional settings in different forms and flavors. We will welcome high-quality papers about actual trends related to workplace Learning Analytics. We will seek application oriented, as well as more theoretical papers and position papers in the following, non-exhaustive list of topics:

Learning Analytics for informal learning
Learning Analytics for the integration of formal and informal learning
Learning Analytics for workplace performance
Learning Analytics for lifelong learning
Community-Based Learning Analytics
Learning Analytics for Organisational Learning
Learning Analytics in professional communities
Workplace learning awareness, measurement and certification
Analytics for different educational processes at or near the workplace, such as problem-based learning, on-the-job training, self-directed informal learning, collaborative learning
Knowledge maturing in communities of practice, organisations or networks
Data-driven interventions to improve learning processes at the workplace
Recommendations of learning artifacts or learning activities at the workplace

DATES & SUBMISSION

We welcome the following types of contributions:

Short research papers and position papers (up to 4 pages)

Full research papers (up to 6 pages)

All submissions must be written in English and must be formatted according to the ACM format. Please, submit your contributions electronically in PDF format via
https://easychair.org/conferences/?conf=laforwork2016

Workplace Learning Analytics

June 16th, 2015 by Graham Attwell

EmployID is an EU-funded, four-year project which aims to support Public Employment Services staff to develop competences that address the need for integration and activation of job seekers in fast changing labour markets. According to the official flyer: “It builds upon career adaptability and resilience in practice, including quality and evidence-based frameworks for enhanced individual and organisational learning. It also supports the learning process of PES practitioners and managers in their professional identity development by supporting the efficient use of technologies to provide advanced coaching, reflection, networking and learning support services as well as MOOCs.”

One of the aims for research and development is to introduce the use of Learning Analytics within Public Employment Services. Although there is great interest in Learning Analytics from L and D staff, there are few examples of how Learning Analytics might be implemented in the workplace. Indeed, looking at research reported by the Society for Learning Analytics Research reveals a paucity of attention to the workplace as a learning venue.

In this video, Graham Attwell proposes an approach to Workplace Learning Analytics based on the Social Learning Platform model (see diagram) adopted by the EmployID project. He argues that rather than merely gathering together possible data and then trying to work out what to do with it, data needs to be sought which can answer well-designed research questions aiming to improve the quality of learning and the learning environment.

 

In the case of EmployID these questions could be linked to the different foci of the Social Learning Platform, namely:

  • Support for facilitation roles
  • Structuring identity transformation activities
  • Supporting networking in personal networks
  • Supporting organisational networks
  • Supporting cross organisational dialogue
  • Providing social networking facilitation
  • Supporting networking in teams

For some of these activities we have already collected some “digital traces”, for instance data on facilitation roles within a pilot MOOC. In other cases we will have to think how best to develop tools and approaches to data gathering, both qualitative and quantitative.

The video has been produced to coincide with the launch of The Learning Analytics Summer Institute, a strategic event, co-organized by SoLAR and host institutions and by a global network of LASI-Locals who are running their own institutes.

Workplace Learning and Learning Analytics

April 15th, 2015 by Graham Attwell

I have been looking hard at Learning Analytics in the last month. In particular, as part of the European EmployID project application, as a bit of a not-really-thought-through objective, we said we would experiment with the use of Learning Analytics in European Public Employment Services. This raises a series of issues which I will come back to in future posts. It seems to me that whilst there is much talk around the potential of Learning Analytics in the workplace, there is very limited research and there are few actual applications.

One of the reasons for this is that so much learning in the workplace is informal. As Boud and Hager (2012) say:

learning is a normal part of working, and indeed most other social activities. It occurs through practice in work settings from addressing the challenges and problems that arise. Most learning takes place not through formalized activities, but through the exigencies of practice with peers and others, drawing on expertise that is accessed in response to need. Problem-solving in which participants tackle challenges which progressively extend their existing capabilities and learn with and from each other appears to be common and frequent form of naturalistic development.

I would also add that much workplace learning is driven by personal interest – a fact that is largely ignored and which has considerable economic implications in terms of workplace competence development. Although we can dream of a world where water cooler conversations are recorded by smart devices and sensors and added to other traces of digital activity, I am not sure this is a desirable outcome. So we have a challenge. Most (university and formal education based) learning analytics focus on analysing digital interactions in, for example, a VLE. How can we sensibly and ethically extend data capture and analysis to informal workplace learning?

Learning Analytics

March 23rd, 2015 by Graham Attwell

Presentation: Intro to Learning Analytics, Universities Scotland, December 2014, from Sheila MacNeill

I am getting increasingly interested in Learning Analytics. But the more I think about it, the more questions I have. I am impressed with the Jisc project on Learning Analytics on which this presentation is based.
