Archive for the ‘Learning Analytics’ Category

More thoughts on Workplace Learning Analytics

April 18th, 2017 by Graham Attwell

I have been looking at the potential of Learning Analytics (LA) for the professional development of employees in European Public Employment Services as part of the European-funded EmployID project. Despite interest, particularly from Learning and Development personnel within the employment services, Learning Analytics has made only a limited impact, reflecting the slow take-up of LA in the workplace as a whole.

The reasons for this are myriad. Universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. The use of VLEs in the workplace is limited, and “collecting traces that learners leave behind” (Duval, 2012) may fail to take cognizance of the multiple modes of formal and informal learning in the workplace and the importance of key indicators such as collaboration. Ferguson (2012) notes that in LA implementations in formal education “LA is aligned with clear aims and there are agreed proxies for learning.” The most commonly agreed proxy of learning achievement is the achievement of outcomes in examinations and assignments. Yet in the workplace, assignment-driven learning plays only a limited role, mostly in formal courses and in initial vocational education and training.
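The kind of VLE-based intervention described above can be sketched in a few lines: flag learners whose recent activity falls below a threshold. This is purely illustrative – the data, names and threshold are invented, and real predictive models are far more sophisticated.

```python
# Illustrative only: flag learners whose VLE activity has dropped,
# as a stand-in for the predictive interventions described above.
weekly_logins = {
    "learner_a": [5, 4, 6, 5],
    "learner_b": [3, 1, 0, 0],
    "learner_c": [2, 2, 1, 1],
}

def at_risk(logins, recent_weeks=2, threshold=1):
    """A learner is flagged when average activity over the most
    recent weeks drops below the threshold."""
    recent = logins[-recent_weeks:]
    return sum(recent) / len(recent) < threshold

flagged = sorted(name for name, logins in weekly_logins.items()
                 if at_risk(logins))
print(flagged)  # learner_b has gone quiet in the last two weeks
```

Even a toy rule like this makes the point in the paragraph above: the analytics only works because VLE logins are taken as a proxy for engagement, a proxy that is largely missing in the workplace.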

Workplace learning is driven by the demands of work tasks or the intrinsic interests of the learner, and by self-directed exploration and social exchange that is tightly connected to the processes and places of work (Ley et al., 2015). Learning interactions at the workplace are to a large extent informal and practice-based, and are not embedded in a specific and measurable pedagogical scenario.

In present Learning Analytics developments there appears to be a tension between measuring and understanding. Pardo and Siemens (2014) say that “learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.” This poses particular issues within the workplace, with its complex social and work structures, hierarchies and power relations.

Despite these difficulties, we remain convinced of the potential value of Learning Analytics in the workplace and in Public Employment Service organisations. If used creatively, Learning Analytics can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes, and help them to reflect on their learning. Furthermore, LA offers a potential approach to providing rapid feedback to trainers and learning designers, and the data can be a tool for researchers in gaining a better understanding of learning processes and learning environments.

There is some limited emerging research into Workplace Learning Analytics and Social Learning Analytics which offers at least pointers towards developing such potential. Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings, taking into account both formal and informal learning environments, including networks and communities. Buckingham Shum and Ferguson (2012) suggest that social network analytics (focusing on interpersonal relations in social platforms), discourse analytics (predicated on the use of language as a tool for knowledge negotiation and construction), content analytics (particularly looking at user-generated content) and disposition analytics can be developed to make sense of learning in a social setting.
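The social network analytics strand can be illustrated with a minimal sketch: given an edge list of interactions on a social platform, count each learner's distinct interaction partners (their degree, the simplest network measure). All names and data here are invented for illustration.

```python
# A minimal sketch of social network analytics: who interacts with
# whom, and who is best connected? The edge list is invented.
from collections import defaultdict

interactions = [
    ("anna", "ben"), ("anna", "carla"),
    ("ben", "carla"), ("carla", "dev"),
]

def degree(edges):
    """Degree per person: number of distinct interaction partners."""
    partners = defaultdict(set)
    for a, b in edges:
        partners[a].add(b)
        partners[b].add(a)
    return {person: len(p) for person, p in partners.items()}

print(degree(interactions))  # carla is the best-connected learner
```

Real social network analysis goes well beyond degree counts (centrality, clustering, community detection), but even this toy measure shows how interpersonal relations in a platform can be made visible.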

Such an approach to Social Learning Analytics links to the core aims of the EmployID project: to support and facilitate the learning processes of PES practitioners in their professional identity development through the efficient use of technologies providing social learning, including advanced coaching, reflection, networking and learning support services. The project focuses on technological developments that make facilitation services for professional identity transformation cost-effective and sustainable by empowering individuals and organisations to engage in transformative practices, using a variety of learning and facilitation processes.

It should also be noted that although Learning Analytics has been linked to the collection and analysis of ‘big data’, MacNeill (2016) stresses the importance of fast data, actionable data, relevant data and smart data. LA, she says, should start from research questions that arise from teaching practice, as opposed to the more common approach of starting analytics based on already collected and available data.

Learning Analytics has been the subject of ongoing discussion in the EmployID project, particularly with the PES organisations. Although a number of PES organisations are interested in the possibility of adopting LA, it is not a major priority for them at present and they are aware of the constraints outlined above. Our initial experiences with sentiment analysis confirm this general interest as well as its limitations within public organisations. It has also become apparent that there are major overlaps between the Social Learning Analytics approach and the tools and approaches we have been developing for evaluation. Our work in evaluation encompasses looking at interpersonal relations in social platforms, discourse analytics based on the EmployID MOOCs, as well as learners’ own mapping of their progress through the self-assessment questionnaire.
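As a purely illustrative stand-in for the sentiment analysis piloted in the project, a minimal lexicon-based scorer might look like the sketch below. The word lists are invented for the example; production sentiment tools use much richer lexicons and models.

```python
# Illustrative lexicon-based sentiment scoring: positive minus
# negative word counts, normalised. The lexicons are invented.
POSITIVE = {"helpful", "clear", "useful", "good"}
NEGATIVE = {"confusing", "unclear", "frustrating", "bad"}

def sentiment(text):
    """Return a score in [-1, 1]; 0.0 when no sentiment words occur."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment("The course was clear and useful."))  # 1.0
print(sentiment("The platform is confusing."))        # -1.0
```

The limitation the post mentions is visible even here: a bare score says nothing about why a learner is frustrated, which is exactly the kind of context public organisations need before acting on it.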

We recognise that this data can be valuable for PES employees in supporting reflection on learning. But rather than seeking to develop a separate dashboard for reporting on data, we are attempting to embed representations of learning within the context in which the learning takes place. Thus, the social platform allows users to visualise their different interactions through the platform. Other work, such as the facilitation coding scheme, does not yet allow real-time analytics. But if it proves successful as a research approach to understanding and supporting learning, it could potentially be automated or semi-automated to provide such real-time feedback.

Machines against humans?

March 17th, 2017 by Graham Attwell

I expect to see more of this debate in the future. Richard Palmer (Tribal) and Sheila MacNeill (Glasgow Caledonian University) debated at Jisc Digifest17 whether learning analytics interventions should always be mediated by a human being. Richard (for machines) and Sheila (for humans) talk about their thoughts on the topic in the Digifest studio with Robert and Louisa.

Learning Analytics and the Peak of Inflated Expectations

January 15th, 2017 by Graham Attwell

Has Learning Analytics dropped off the peak of inflated expectations in Gartner’s hype cycle? According to Educause, ‘Understanding the power of data’ is still there as a major trend in higher education, and Ed Tech reports a KPMG survey which found that 41 per cent of universities were using data for forecasting and predictive analytics.

But whilst many universities are exploring how data can be used to improve retention and prevent drop-outs, there seems little pretence any more that Learning Analytics has much to do with learning. The power of data has somehow got muddled up with Management Analytics, Performance Analytics and all kinds of other analytics – but the learning seems to have been lost. Data mining is great, but it needs a perspective on just what we are trying to find out.

I don’t think Learning Analytics will go into the trough of disillusionment. But I think that there are very real problems in working out how best we can use data – and particularly how we can use data to support learning. Learning Analytics needs to be more solidly grounded in what is already known about teaching and learning. Stakeholders, including teachers, learners and the wider community, need to be involved in the development and implementation of learning analytics tools. Overall, more evidence is needed to show which approaches work in practice and which do not.

Finally, we already know a great deal about formal learning in institutions, or at least by now we should do. Of course we need to work at making it better. But we know far less about informal learning and the learning which takes place in everyday living and working environments. And that is where I ultimately see Learning Analytics making a big difference. Learning Analytics could potentially help us all to become self-directed learners and to achieve the learning goals that we set ourselves. But that is a long way off. Perhaps, if Learning Analytics is falling off the peak of inflated expectations, that will provide the space for longer-term, more clearly focused research and development.

Learning Analytics for Workplace and Professional Learning

September 19th, 2016 by Graham Attwell

There is a small but growing community emerging in Europe around the potential and use of Learning analytics in the workplace. And there are still places available for a free workshop entitled ‘Learning Analytics for Workplace and Professional Learning’ on Friday 23 September in Leeds in the UK.

The workshop is being organised around two main activities. During the morning, all the participants will briefly present their research work or the research questions they are working on, so that we can find common problems, synergies and potential collaborations for future projects. During the afternoon, we will work in smaller groups to discuss community-building mechanisms: follow-up workshops (maybe at LAK’17), potential grant applications, the creation of an online community, and collaboration in current or future projects and with other researchers or research communities.

More information can be found in this PDF document. And to register, please send an email to: Tamsin Treasure-Jones (t [dot] treasure-jones [at] leeds [dot] ac [dot] uk)

Workplace Learning Analytics workshop

September 7th, 2016 by Graham Attwell

It is not easy developing a community around Workplace Learning Analytics but there are some signs of emerging interest.

On 23 September 2016, in Leeds, there is an open workshop on Workplace Learning Analytics, sponsored by the Learning Layers project. The invitation to the workshop runs as follows.

Up to now, the Learning Analytics community has not paid much attention to workplace and professional scenarios, where, in contrast to formal education, learning is driven by the demands of work tasks and the intrinsic interest of the learner. Hence, learning processes at the workplace are typically informal and not related to a pedagogical scenario. But workplace and professional learners can benefit from analytics of their own and others’ learning, which can potentially increase learning awareness, knowledge sharing and the scaling-up of learning practices.

Some approaches to Learning Analytics at the workplace have been suggested. On the one hand, the field of analytics in smart industries has extended its focus to some learning scenarios, such as how professionals adopt innovations or changes at the workplace; on the other hand, the Learning Analytics community is paying increasing attention to informal scenarios, professional training courses and teaching analytics. For this reason, we consider it a great moment to gather this interest and build up a community around workplace and professional Learning Analytics.

We invite you to take part in the workshop, which will have two main activities:

  • During the morning all the participants will shortly present their research work or the research questions they are working on. Thus, we can find common problems, synergies and potential collaborations in future projects.
  • During the afternoon we can work in smaller groups to discuss some community-building mechanisms: follow-up workshops (maybe at LAK’17), potential grant applications, creation of an on-line community, collaboration in current or future projects, and collaboration with other researchers or research communities.

To register, please send an email to: Tamsin Treasure-Jones (t [dot] treasure-jones [at] leeds [dot] ac [dot] uk)

Intersections or Contradictions?

September 7th, 2016 by Graham Attwell

I like this presentation by Paul Prinsloo in that it draws out the differing motivations and pressures for developing Learning Analytics. But I wonder whether Learning Analytics sits at the intersection of these different pressures – or rather whether it exposes the contradictions facing the future of education today?

Learning Analytics for Vocational Education and Training

July 25th, 2016 by Graham Attwell

I have just spent an hour or so on a periodic search for research and development about Workplace Learning Analytics and the use of Learning Analytics in the public sector. As usual the results are pretty thin – although I think the slowly growing interest in the links between learning design and Learning Analytics may come in handy in the future.

The one thing which did interest me was a report of a workshop organised by Jisc with Greater Manchester Chamber of Commerce “to explore significant opportunities for improving access to data, and analytical capacity, in the face of the significant changes that are taking place across further education and training, skills commissioning and apprenticeship provision and funding.”

Of course, some of this is very UK-specific in terms of the commissioning and funding models, and is focused particularly on Further Education institutions. But some of the approaches would appear more transferable to work-based learning in general.

As part of the Further Education initiative, Jisc have been developing a series of user stories. The breadth and depth of the stories were extended at the workshop. Paul Bailey from the Jisc Learning Analytics project explains: “The user stories that were prioritised were around using analytics to

  • help learners to improve retention, achievement of grades and make informed decisions regarding their next destinations
  • improve the quality of learning and teaching, including looking at the curriculum design and use of rich content in online learning
  • improve college support processes to improve retention and provide effective careers service to support progression
  • understand the employer demand to better plan curriculum and recruitment
  • track finance and quality to remain competitive.”

In general these priorities apply across the initial vocational education and training sector (particularly for apprenticeships). However, they don’t really work for public sector organisations, which are largely focused on continuing training and professional development and which have different institutional and organisational aims and purposes than vocational colleges. But I like the storytelling approach, which could be a good way of exploring the potential of Learning Analytics in these organisations.

New book: Empowering Change in European Public Employment Services

July 18th, 2016 by Graham Attwell

Readers familiar with European research projects will know how they work. The project negotiates with the European Commission a DOW – Description of Work – which details the work to be undertaken in each year of the project. It is divided into discrete work packages. Every year each work package produces a (usually over-lengthy) report on the research and development undertaken, which is then presented to a team of expert reviewers who can ‘pass’ the report, recommend changes or ‘fail’ it. Obviously, large-scale multinational research projects need structures and plans. But all too often the work package structure separates research and development activities which should not be separated, and the DOW becomes a restrictive ‘bible’ rather than a guide for action. And despite the large amount of work which goes into preparing the work package reports, they are seldom widely read (if indeed read at all), except by the reviewers.

In the EmployID project, which is working on identity transformation in European Public Employment Services (PES), we are doing things differently. The work is structured through cross-work-package teams, which follow an adapted SCRUM process. The teams are reviewed at face-to-face meetings and recomposed if necessary. And this year, instead of producing a series of work package reports, the project partners have jointly contributed to a book – Empowering Change in Public Employment Services: The EmployID Approach – which has just been published and can be downloaded for free.

The introduction to the 244 page PDF book explains the background to the work:

European Public Employment Services (PES) and their employees are facing fundamental challenges to the delivery of efficient and effective services, and need to change their strategies to combat high unemployment and demographic change in increasingly uncertain and dynamic labour markets. This not only requires developing new professional skills related to new tasks; it poses more profound developmental challenges for staff members.

Three of these changes relate to understanding the changing world of work; a ‘turn’ towards coaching; and the increased importance of relations with employers. The staff need to learn new ways of working, with a major challenge being to enhance the power of collaborative (peer) learning in order to support staff in accomplishing their goals.

All these changes are linked to transforming professional identity, which requires learning on a deeper level that is often neglected by continuing professional development strategies. EmployID makes its contribution here: PES practitioners’ learning related to professional identity transformation needs to be facilitated through social learning approaches and the nurturing of social learning networks, which include the following:

  • Reflection as a way of turning one’s own and others’ experiences into general insights on multiple levels, both from an individual and a collective perspective

  • Peer coaching as a way of helping learners to change their behaviour through a structured process

  • Social learning programmes as a way of engaging learners with new topics, other perspectives, and conversations around them.

Tensions in Learning Analytics

May 27th, 2016 by Graham Attwell

The debate around Learning Analytics seems to be opening up. And although there is little sign of agreement over future directions, the terms of discussion seem both broader and more nuanced than previously. I think some of this is in response to the disillusionment of early researchers and adopters.

In yesterday’s OLDaily, Stephen Downes pointed to an excellent article by Bodong Chen. Bodong points to the surge of interest in Learning Analytics but cautions that “The surge of this nascent field rests on a promise – and also a premise – that digital traces of learning could be turned into actionable knowledge to promote learning and teaching.”

He suggests that: “One approach to understanding learning analytics is to recognize what are not learning analytics” including academic analytics and educational data mining. Instead, he says “learning analytics is more directly concerned with teachers and learners by attending to micro-patterns of learning.”

Bodong draws attention to a tension between learning and analytics “as two pivotal concepts of the field”. He points out that “learning analytics deals with educational phenomena at multiple levels”. As an example he says: “collaborative knowledge building as a group phenomenon depends on contributions from individuals, but cannot be reliably inferred from individual learning.”

Understanding and accepting that “the meaning of learning analytics as a term is plural and multifaceted” is an important basis for future research. Within the still-emerging field of workplace Learning Analytics, there is not only the issue of individual versus collaborative learning and knowledge development but also issues around proxies for learning. Whilst performance in practice might be seen as a possible proxy, performance may also involve a wider range of factors, including the working environment, the division of work and opportunities for practice. And the already established field of Performance Analytics seems to be in considerable tension with learning.

Double Loop Learning and Learning Analytics

May 4th, 2016 by Graham Attwell

Another in this mini-series on Learning Analytics. When looking at work-based learning, Double Loop Learning becomes particularly important. Double-loop learning is used when it is necessary to change the mental model on which a decision depends. Unlike single-loop learning, it involves a shift in understanding, from simple and static to broader and more dynamic, for instance taking into account changes in the surroundings and the need to express those changes in mental models (Mildeova & Vojtko, 2003).

Image: Double loop learning

To remind readers again: in the EmployID European project we are aiming to support scalable and cost-effective facilitation of professional identity transformation in public employment services. And I would argue that such identity transformation is based on reflection on learning – on Double Loop Learning. Identity transformation necessarily involves the development of new mental models and new ways of looking at work-based behaviours and practices.
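The single/double loop distinction can be sketched in code using the classic thermostat analogy (an illustration of the concept only, not anything from the EmployID toolset): the single loop acts to close the gap to a fixed setpoint, while the double loop may first revise the setpoint – the ‘mental model’ – when the context changes.

```python
# Illustrative sketch of single- vs double-loop learning via the
# thermostat analogy. All values are invented for the example.

def single_loop(temperature, setpoint):
    """Single loop: act to close the gap, never question the goal."""
    return "heat" if temperature < setpoint else "idle"

def double_loop(temperature, setpoint, room_occupied):
    """Double loop: reconsider the goal itself first, then act."""
    if not room_occupied:
        setpoint = 15  # revise the model: an empty room needs less heat
    return single_loop(temperature, setpoint), setpoint

action, new_setpoint = double_loop(18, setpoint=21, room_occupied=False)
print(action, new_setpoint)  # the goal, not just the action, has changed
```

The point carried over to Learning Analytics: feedback that only drives the `single_loop` step supports performance correction, while supporting identity transformation means helping learners revise the setpoint itself.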

So where does Learning Analytics fit into this? Learning Analytics aims to understand and improve learning and the learning environment. This does not necessarily involve Double Loop Learning. For students, feedback about their present performance may be enough. But if we aim for identity transformation and wish to improve the learning environment, then we need a deeper interpretation of data. This has a number of implications for designing Learning Analytics.

Firstly, we have to have a very clear focus on the purpose of the Learning Analytics. Is it to find out more, for example, about informal learning in organisations, or to inform L&D department staff about the learning environment? Is it to help learners understand their interactions with other staff, or to examine their own dispositions for learning – and so on? Secondly – and crucially – how is the data presented to users, be they learners or trainers? The existing paradigm for Learning Analytics presentation appears to be the dashboard. Yet in the LAK16 pre-conference workshops there was a whole series of presentations where presenters invited participants to say what the graphics meant. And often we couldn’t. If LA professionals cannot interpret data visualisations, then a learner has little hope of making their own meanings. I am a little puzzled as to why dashboards have become the norm. And one of my major concerns is that it is often difficult to understand a visualisation outside the context in which the learning exchange has happened. If Double Loop Learning is to happen, then learners need to reflect in order to make meanings. And reflection occurs best, I think, in the context in which the learning takes place.

Image: Ralph Klamma – http://www.slideshare.net/klamma/technical-challenges-for-realizing-learning-analytics

There are alternatives to the dashboard. For instance, within EmployID we are developing real-time discourse analysis and are also looking at providing dynamic prompts for reflection. One final point: if we are aiming to use Learning Analytics for Double Loop Learning, we need to find out what works and what does not. That means that any Learning Analytics measure needs to be accompanied by well-designed evaluation measures. All too often, because LA collects data, it is presumed to cover evaluation. Whilst LA and evaluation may share data, they aim at different things.
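The dynamic prompts for reflection mentioned above could, in the very simplest case, be sketched as a lookup from discourse markers to reflection questions. The markers and prompts below are invented; the project’s real-time discourse analysis is of course far richer than this.

```python
# Illustrative only: map simple discourse markers in a learner's
# message to prompts inviting deeper reflection.
PROMPTS = {
    "i don't understand": "What part is unclear? What did you expect instead?",
    "it worked": "What did you do differently this time?",
    "i give up": "What have you tried so far, and what would you try next?",
}

def reflection_prompt(message):
    """Return a reflection prompt if a marker appears, else None."""
    lowered = message.lower()
    for marker, prompt in PROMPTS.items():
        if marker in lowered:
            return prompt
    return None

print(reflection_prompt("It worked at last!"))
```

Because the prompt is triggered inside the conversation itself, the feedback stays in the context in which the learning exchange happened – the point argued against dashboards above.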
