Archive for the ‘Learning Analytics’ Category

Black Box Learning Analytics? Beyond Algorithmic Transparency

June 14th, 2017 by Graham Attwell

I guess we are all a bit bored with PowerPoint presentations these days. But when done well, presentations can be brilliant for questioning what we are doing and what we should be doing. What are algorithms? asks Simon Buckingham Shum, Professor of Learning Informatics and Director of the Connected Intelligence Centre at UTS. According to Paul Dourish in The Social Lives of Algorithms, they are abstract rules for transforming data which, to exert influence, require programming as executable code, operating on data structures, running on a platform, in an increasingly distributed architecture. Simon goes on to question the intentional secrecy, technical illiteracy and complexity of infrastructure that make algorithms opaque, and looks at their growing impact in education.

 

More thoughts on Workplace Learning Analytics

April 18th, 2017 by Graham Attwell

I have been looking at the potential of Learning Analytics (LA) for the professional development of employees in European Public Employment Services as part of the European-funded EmployID project. Despite interest, particularly from Learning and Development personnel within the employment services, Learning Analytics has made only limited impact, which reflects the slow take-up of LA in the workplace as a whole.

The reasons for this are myriad. Universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. The use of VLEs in the workplace is limited, and “collecting traces that learners leave behind” (Duval, 2012) may fail to take cognizance of the multiple modes of formal and informal learning in the workplace and the importance of key indicators such as collaboration. Ferguson (2012) says that in LA implementations in formal education “LA is aligned with clear aims and there are agreed proxies for learning.” The most commonly agreed proxy of learning achievement is the achievement of outcomes in terms of examinations and assignments. Yet in the workplace, assignment-driven learning plays only a limited role, mostly in formal courses and initial vocational education and training.
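As a rough illustration of that institutional approach, the sketch below shows how per-student VLE activity traces might feed a simple drop-out prediction model. It is a minimal sketch only: the file name, column names and features are hypothetical, and real implementations depend on the institution’s own data model and far richer feature engineering.

    # Hypothetical sketch: predicting drop-out risk from harvested VLE
    # activity traces. File name, column names and features are invented.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    traces = pd.read_csv("vle_activity.csv")  # hypothetical per-student export

    # Simple activity features harvested from the VLE.
    features = traces[["logins_per_week", "forum_posts",
                       "resources_viewed", "assignments_submitted"]]
    labels = traces["dropped_out"]  # 1 = withdrew, 0 = completed

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=42)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Rank students by predicted risk so that tutors can target interventions.
    risk = model.predict_proba(X_test)[:, 1]
    print("AUC:", roc_auc_score(y_test, risk))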

Workplace learning is driven by the demands of work tasks or the intrinsic interests of the learner, by self-directed exploration and by social exchange that is tightly connected to the processes and places of work (Ley et al., 2015). Learning interactions at the workplace are to a large extent informal and practice-based, and are not embedded in a specific and measurable pedagogical scenario.

In present Learning Analytics developments, there appears to be a tension between measuring and understanding. Pardo and Siemens (2014) say that “learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.” This poses particular issues within the workplace, with its complex social and work structures, hierarchies and power relations.

Despite these difficulties we remain convinced of the potential value of Learning Analytics in the workplace and in Public Employment Service organisations. If used creatively, Learning Analytics can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes, and help them in reflecting on their learning. Furthermore, LA offers a potential approach to providing rapid feedback to trainers and learning designers, and data can be a tool for researchers in gaining a better understanding of learning processes and learning environments.

There is some limited emerging research into Workplace Learning Analytics and Social Learning Analytics which offers at least pointers towards realising such potential. Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings, taking into account both formal and informal learning environments, including networks and communities. Buckingham Shum and Ferguson (2012) suggest that social network analysis focusing on interpersonal relations in social platforms, discourse analytics predicated on the use of language as a tool for knowledge negotiation and construction, content analytics looking particularly at user-generated content, and disposition analytics can be developed to make sense of learning in a social setting.
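To give a flavour of the social network analysis strand of SLA, a minimal sketch might build a directed graph of who responded to whom on a social platform and use betweenness centrality as a rough indicator of who brokers exchange in the community. The interactions and names below are invented for illustration.

    # Minimal sketch of social network analysis on platform interactions.
    # (actor, target) pairs are hypothetical: who replied to or commented on whom.
    import networkx as nx

    interactions = [
        ("anna", "ben"), ("ben", "anna"), ("anna", "carl"),
        ("carl", "ben"), ("dora", "anna"), ("dora", "carl"),
    ]

    graph = nx.DiGraph()
    graph.add_edges_from(interactions)

    # Betweenness centrality as a rough proxy for brokerage in the community.
    centrality = nx.betweenness_centrality(graph)
    for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(f"{person}: {score:.2f}")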

Such an approach to Social Learning Analytics links to the core aims of the EmployID project to support and facilitate the learning process of PES practitioners in their professional identity development by the efficient use of technologies to provide social learning including advanced coaching, reflection, networking and learning support services. The project focuses on technological developments that make facilitation services for professional identity transformation cost-effective and sustainable by empowering individuals and organisations to engage in transformative practices, using a variety of learning and facilitation processes.

It should also be noted that although Learning Analytics has been linked to the collection and analysis of ‘big data’, MacNeill (2016) stresses the importance of fast data, actionable data, relevant data and smart data. LA, she says, should start from research questions that arise from teaching practice, as opposed to the more common approach of starting analytics based on already collected and available data.

Learning Analytics has been the subject of ongoing discussion in the EmployID project, and particularly with the PES organisations. Although a number of PES organisations are interested in the possibility of adopting LA, it is not a major priority for them at present and they are aware of the constraints outlined above. Our initial experiences with sentiment analysis confirm this general interest as well as its limitations within public organisations. It has also become apparent that there are major overlaps between the Social Learning Analytics approach and the tools and approaches we have been developing for evaluation. Our work in evaluation encompasses looking at interpersonal relations in social platforms, discourse analytics based on the EmployID MOOCs, as well as learners’ own mapping of their progress through the self-assessment questionnaire.
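For readers unfamiliar with the technique, a hypothetical sketch of this kind of sentiment analysis is shown below, scoring invented discussion posts with NLTK’s off-the-shelf VADER lexicon. This is an illustration under stated assumptions, not the pipeline actually used in EmployID; a production pipeline would read posts from the platform itself.

    # Hypothetical sketch: lexicon-based sentiment scoring of discussion posts.
    # Posts are invented; the VADER lexicon choice is an assumption.
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)
    analyser = SentimentIntensityAnalyzer()

    posts = [
        "The peer coaching session really helped me rethink my approach.",
        "I found the new reporting tool confusing and frustrating.",
    ]

    for post in posts:
        # Compound score runs from -1 (negative) to +1 (positive).
        scores = analyser.polarity_scores(post)
        print(f"{scores['compound']:+.2f}  {post}")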

We recognise that this data can be valuable for PES employees in supporting reflection on learning. But rather than seeking to develop a separate dashboard for reporting on data, we are attempting to embed representations of learning within the context in which the learning takes place. Thus, the social platform allows users to visualise their different interactions through the platform. Other work, such as the facilitation coding scheme, does not yet allow real-time analytics. But if it proves successful as a research approach to understanding and supporting learning, then it could potentially be automated or semi-automated to provide such real-time feedback.
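As a small illustration of embedding representations of learning in context rather than in a separate dashboard, the sketch below aggregates one user’s platform interactions by type, the kind of summary a platform widget could render in place. The event records and names are invented.

    # Hypothetical sketch: summarising one user's platform interactions by type,
    # for an in-context visualisation rather than a separate analytics dashboard.
    from collections import Counter

    events = [
        {"user": "maria", "type": "post"},
        {"user": "maria", "type": "comment"},
        {"user": "maria", "type": "comment"},
        {"user": "jan", "type": "post"},
        {"user": "maria", "type": "reflection_note"},
    ]

    def interaction_summary(user, records):
        """Count a user's interactions by type."""
        return Counter(r["type"] for r in records if r["user"] == user)

    print(interaction_summary("maria", events))
    # Counter({'comment': 2, 'post': 1, 'reflection_note': 1})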

Machines against humans?

March 17th, 2017 by Graham Attwell

I expect to see more of this debate in the future. Richard Palmer (Tribal) and Sheila MacNeill (Glasgow Caledonian University) held a debate at Jisc Digifest17 about whether learning analytics interventions should always be mediated by a human being. Richard (for machines) and Sheila (for humans) speak about their thoughts on the topic in the Digifest studio with Robert and Louisa.

Learning Analytics and the Peak of Inflated Expectations

January 15th, 2017 by Graham Attwell

Has Learning Analytics dropped off the peak of inflated expectations in Gartner’s hype cycle? According to Educause, ‘Understanding the power of data’ is still there as a major trend in higher education, and Ed Tech reports a KPMG survey which found that 41 percent of universities were using data for forecasting and predictive analytics.

But whilst many universities are exploring how data can be used to improve retention and prevent drop outs, there seems little pretence any more that Learning Analytics has much to do with learning. The power of data has somehow got muddled up with Management Analytics, Performance Analytics and all kinds of other analytics – but the learning seems to have been lost. Data mining is great but it needs a perspective on just what we are trying to find out.

I don’t think Learning Analytics will go into the trough of disillusionment. But I think that there are very real problems in working out how best we can use data, and particularly how we can use data to support learning. Learning analytics need to be more solidly grounded in what is already known about teaching and learning. Stakeholders, including teachers, learners and the wider community, need to be involved in the development and implementation of learning analytics tools. Overall, more evidence is needed to show which approaches work in practice and which do not.

Finally, we already know a great deal about formal learning in institutions, or at least by now we should do. Of course we need to work at making it better. But we know far less about informal learning and the learning which takes place in everyday living and working environments. And that is where I ultimately see Learning Analytics making a big difference. Learning Analytics could potentially help us all to become self-directed learners and to achieve the learning goals that we set ourselves. But that is a long way off. Perhaps if Learning Analytics is falling off the peak of expectations, that will provide the space for longer-term, more clearly focused research and development.

 

Learning Analytics for Workplace and Professional Learning

September 19th, 2016 by Graham Attwell

There is a small but growing community emerging in Europe around the potential and use of Learning analytics in the workplace. And there are still places available for a free workshop entitled ‘Learning Analytics for Workplace and Professional Learning’ on Friday 23 September in Leeds in the UK.

The workshop is being organised around two main activities. During the morning, all the participants will briefly present their research work or the research questions they are working on, so that we can find common problems, synergies and potential collaborations for future projects. During the afternoon we can work in smaller groups to discuss some community-building mechanisms: follow-up workshops (maybe at LAK’17), potential grant applications, creation of an online community, collaboration in current or future projects, and collaboration with other researchers or research communities.

More information can be found in this PDF document. And to register, please send an email to: Tamsin Treasure-Jones (t [dot] treasure-jones [at] leeds [dot] ac [dot] uk)

Workplace Learning Analytics workshop

September 7th, 2016 by Graham Attwell

It is not easy developing a community around Workplace Learning Analytics but there are some signs of emerging interest.

On 23 September 2016, in Leeds, there is an open workshop on Workplace Learning Analytics, sponsored by the Learning Layers project. The invitation to the workshop runs as follows.

Up to now, the Learning Analytics community has not paid much attention to workplace and professional scenarios where, in contrast to formal education, learning is driven by the demands of work tasks and the intrinsic interest of the learner. Hence, learning processes at the workplace are typically informal and not related to a pedagogical scenario. But workplace and professional learners can benefit from insights into their own and others’ learning outcomes, as these will potentially increase learning awareness, knowledge sharing and the scaling-up of learning practices.

Some approaches to Learning Analytics at the workplace have been suggested. On the one hand, the topic of analytics in smart industries has extended its focus to some learning scenarios, such as how professionals adopt innovations or changes at the workplace; on the other hand, the Learning Analytics community is paying increasing attention to informal scenarios, professional training courses and teaching analytics. For this reason, we consider it a great moment to gather all this interest and build up a community around workplace and professional Learning Analytics.

We invite you to take part in the workshop, which will have two main activities:

  • During the morning all the participants will briefly present their research work or the research questions they are working on. Thus, we can find common problems, synergies and potential collaborations for future projects.
  • During the afternoon we can work in smaller groups to discuss some community-building mechanisms: follow-up workshops (maybe at LAK’17), potential grant applications, creation of an online community, collaboration in current or future projects, collaboration with other researchers or research communities, etc.

To register, please send an email to: Tamsin Treasure-Jones (t [dot] treasure-jones [at] leeds [dot] ac [dot] uk)

Intersections or Contradictions?

September 7th, 2016 by Graham Attwell

I like this presentation by Paul Prinsloo in that it draws out the differing motivations and pressures behind developing Learning Analytics. But I wonder whether Learning Analytics sits at the intersections of these different pressures, or rather whether it exposes the contradictions facing the future of education today?

Learning Analytics for Vocational Education and Training

July 25th, 2016 by Graham Attwell

I have just spent an hour or so on a periodic search for research and development about Workplace Learning Analytics and the use of Learning Analytics in the public sector. As usual the results are pretty thin – although I think the slowly growing interest in the links between learning design and Learning Analytics may come in handy in the future.

The one thing which did interest me was a report of a workshop organised by Jisc with Greater Manchester Chamber of Commerce “to explore significant opportunities for improving access to data, and analytical capacity, in the face of the significant changes that are taking place across further education and training, skills commissioning and apprenticeship provision and funding.”

Of course, some of this is very UK-specific in terms of the commissioning and funding models, and is focused particularly on Further Education institutions. But some of the approaches would appear more transferable to work-based learning in general.

As part of the Further Education initiative, Jisc have been developing a series of user stories. The breadth and depth of the stories were extended at the workshop. Paul Bailey from the Jisc Learning Analytics project explains: “The user stories that were prioritised were around using analytics to

  • help learners to improve retention, achievement of grades and make informed decisions regarding their next destinations
  • improve the quality of learning and teaching, including looking at the curriculum design and use of rich content in online learning
  • improve college support processes to improve retention and provide effective careers service to support progression
  • understand the employer demand to better plan curriculum and recruitment
  • track finance and quality to remain competitive.”

In general these priorities apply across the initial vocational education and training sector (particularly for apprenticeships). However, they don’t really work for public sector organisations, which are largely focused on continuing training and professional development and which have different institutional and organisational aims and purposes than vocational colleges. But I like the storytelling approach, which could be a good way of exploring the potential of Learning Analytics in these organisations.

New book: Empowering Change in European Public Employment Services

July 18th, 2016 by Graham Attwell

Readers familiar with European research projects will know how they work. The project negotiates with the European Commission a DOW (Description of Work) which details the work to be undertaken in each year of the project, divided into discrete work packages. Every year each work package provides a (usually overly lengthy) report on the research and development undertaken, which is then presented to a team of expert reviewers who can ‘pass’ the report, recommend changes or ‘fail’ it. Obviously, large-scale multinational research projects need structures and plans. But all too often the work package structure separates research and development activities which should not be separated, and the DOW becomes a restrictive ‘bible’ rather than a guide for action. And despite the large amount of work which goes into preparing the work package reports, they are seldom widely read (if indeed read at all), except by the reviewers.

In the EmployID project, which is working on identity transformation in European Public Employment Services (PES), we are doing things differently. The work is structured through cross-work-package teams, which follow an adapted SCRUM structure. The teams are reviewed at face-to-face meetings and recomposed if necessary. And this year, instead of producing a series of work package reports, the project partners have jointly contributed to a book, Empowering Change in Public Employment Services: The EmployID Approach, which has just been published and can be downloaded for free.

The introduction to the 244-page PDF book explains the background to the work:

European Public Employment Services (PES) and their employees are facing fundamental challenges to the delivery of efficient and effective services, and the need to change their strategies to combat high unemployment and demographic change in increasingly uncertain and dynamic labour markets. This does not only require developing new professional skills related to new tasks, but poses far more profound developmental challenges for staff members.

Three of these changes relate to understanding the changing world of work; a ‘turn’ towards coaching; and the increased importance of relations with employers. The staff need to learn new ways of working, with a major challenge being to enhance the power of collaborative (peer) learning in order to support staff in accomplishing their goals.

All these changes are linked to transforming professional identity, which requires learning on a deeper level that is often neglected by continuing professional development strategies. EmployID makes its contribution here: PES practitioners’ learning related to professional identity transformation needs to be facilitated through social learning approaches and the nurturing of social learning networks, which include the following:

  • Reflection as a way of turning one’s own and others’ experiences into general insights on multiple levels, both from an individual and a collective perspective

  • Peer coaching as a way of helping learners in changing their behavior through a structured process

  • Social learning programmes as a way of engaging learners with new topics, other perspectives, and the conversations around them.

Tensions in Learning Analytics

May 27th, 2016 by Graham Attwell

The debate around Learning Analytics seems to be opening up. And although there is little sign of agreement over future directions, the terms of discussion seem both broader and more nuanced than previously. I think some of this is in response to the disillusionment of early researchers and adopters.

In yesterday’s OLDaily, Stephen Downes pointed to an excellent article by Bodong Chen. Bodong points to the surge of interest in Learning Analytics but cautions that: “The surge of this nascent field rests on a promise – and also a premise – that digital traces of learning could be turned into actionable knowledge to promote learning and teaching.”

He suggests that: “One approach to understanding learning analytics is to recognize what are not learning analytics” including academic analytics and educational data mining. Instead, he says “learning analytics is more directly concerned with teachers and learners by attending to micro-patterns of learning.”

Bodong draws attention to a tension between learning and analytics “as two pivotal concepts of the field”. He points out that “learning analytics deals with educational phenomena at multiple levels”. As an example he says: “collaborative knowledge building as a group phenomenon depends on contributions from individuals, but cannot be reliably inferred from individual learning.”

Understanding and accepting that “the meaning of learning analytics as a term is plural and multifaceted” is an important basis for future research. Within the only just emerging field of workplace Learning Analytics, there is not only the issue of individual versus collaborative learning and knowledge development, but also issues around proxies for learning. Whilst performance in practice might be seen as a possible proxy, performance may also be seen to involve a wider range of factors, including the working environment, the division of work and opportunities for practice. And the already established field of Performance Analytics seems to be in considerable tension with learning.
