Archive for the ‘Wales Wide Web’ Category

More thoughts on Workplace Learning Analytics

April 18th, 2017 by Graham Attwell

I have been looking at the potential of Learning Analytics (LA) for the professional development of employees in European Public Employment Services as part of the European funded EmployID project. Despite interest, particularly from Learning and Development personnel within the employment services, Learning Analytics has made only limited impact, reflecting the slow take-up of LA in the workplace as a whole.

The reasons for this are myriad. Universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. The use of VLEs in the workplace is limited, and “collecting traces that learners leave behind” (Duval, 2012) may fail to take cognizance of the multiple modes of formal and informal learning in the workplace and of the importance of key indicators such as collaboration. Ferguson (2012) says that in LA implementations in formal education “LA is aligned with clear aims and there are agreed proxies for learning.” The most commonly agreed proxy of learning achievement is the achievement of outcomes in examinations and assignments. Yet in the workplace, assignment driven learning plays only a limited role, mostly in formal courses and initial vocational education and training.
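
To make that institutional approach a little more concrete, here is a minimal sketch of how VLE activity traces might feed a simple drop-out prediction model. The features, data and threshold are entirely hypothetical and the model is a generic logistic regression, not any particular university's (or EmployID's) system.

```python
# Minimal sketch of the VLE-based approach described above: predicting
# drop-out risk from activity traces. Features and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical features per student: logins per week, forum posts,
# proportion of assignments submitted on time.
X = rng.random((500, 3)) * [20, 10, 1]
# Synthetic labels: students with low activity are more likely to drop out.
y = (X[:, 0] + 5 * X[:, 2] + rng.normal(0, 2, 500) < 8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Flag students whose predicted drop-out probability exceeds a threshold,
# so that tutors could intervene early.
at_risk = model.predict_proba(X_test)[:, 1] > 0.7
print(f"Test accuracy: {model.score(X_test, y_test):.2f}, "
      f"flagged for intervention: {at_risk.sum()}")
```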

Workplace learning is driven by the demands of work tasks or the intrinsic interests of the learner, by self-directed exploration and by social exchange that is tightly connected to the processes and places of work (Ley et al., 2015). Learning interactions at the workplace are to a large extent informal and practice based, and are not embedded in a specific and measurable pedagogical scenario.

In present Learning Analytics developments, there appears to be a tension between measuring and understanding. Pardo and Siemens (2014) say “learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.” This poses particular issues within the workplace with complex social and work structures, hierarchies and power relations.

Despite these difficulties we remain convinced of the potential value of Learning Analytics in the workplace and in Public Employment Service organisations. If used creatively, Learning Analytics can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes, and help them in reflecting on their learning. Furthermore, LA offers a potential approach to providing rapid feedback to trainers and learning designers, and the data can be a tool for researchers in gaining a better understanding of learning processes and learning environments.

There is some limited emerging research into Workplace Learning Analytics and Social Learning Analytics which offers at least pointers towards developing such potential. Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings, taking into account both formal and informal learning environments, including networks and communities. Buckingham Shum and Ferguson (2012) suggest that social network analytics focusing on interpersonal relations in social platforms, discourse analytics predicated on the use of language as a tool for knowledge negotiation and construction, content analytics looking particularly at user-generated content, and disposition analytics can all be developed to make sense of learning in a social setting.
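
To illustrate the social network analysis strand, the sketch below builds an interaction graph from discussion-forum reply data and uses a centrality measure to see who is most connected. The learner names and reply records are invented for illustration; this is a generic example rather than the analytics of any EmployID platform.

```python
# Minimal sketch: social network analysis over forum interactions.
import networkx as nx

# Hypothetical interaction records: (author, replied_to) pairs.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("carl", "ana"),
    ("dora", "ben"), ("ana", "dora"), ("carl", "ben"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# Degree centrality as a rough proxy for how connected each learner is.
centrality = nx.degree_centrality(G)
for learner, score in sorted(centrality.items(), key=lambda x: -x[1]):
    print(f"{learner}: {score:.2f}")
```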

Such an approach to Social Learning Analytics links to the core aims of the EmployID project to support and facilitate the learning process of PES practitioners in their professional identity development by the efficient use of technologies to provide social learning including advanced coaching, reflection, networking and learning support services. The project focuses on technological developments that make facilitation services for professional identity transformation cost-effective and sustainable by empowering individuals and organisations to engage in transformative practices, using a variety of learning and facilitation processes.

It should also be noted that although Learning Analytics has been linked to the collection and analysis of ‘big data’, MacNeill (2016) stresses the importance of fast data, actionable data, relevant data and smart data. LA, she says, should start from research questions that arise from teaching practice, as opposed to the more common approach of starting analytics based on already collected and available data.

Learning Analytics has been the subject of ongoing discussion in the EmployID project, and particularly with the PES organisations. Although a number of PES organisations are interested in the possibility of adopting LA, it is not a major priority for them at present and they are aware of the constraints outlined above. Our initial experiences with sentiment analysis confirm this general interest as well as its limitations within public organisations. It has also become apparent that there are major overlaps between the Social Learning Analytics approach and the tools and approaches we have been developing for evaluation. Our work in evaluation encompasses looking at interpersonal relations in social platforms, discourse analytics based on the EmployID MOOCs, as well as learners’ own mapping of their progress through the self-assessment questionnaire.
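
For readers unfamiliar with sentiment analysis, a minimal lexicon-based sketch of the general idea follows. The example posts are invented and the VADER lexicon is just one off-the-shelf option; the sentiment analysis actually trialled in the project may well have used a different approach.

```python
# Minimal sketch of lexicon-based sentiment analysis over discussion posts.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

# Invented example posts, standing in for real forum contributions.
posts = [
    "The peer coaching session really helped me rethink my approach.",
    "I found the platform confusing and could not locate the materials.",
]

analyzer = SentimentIntensityAnalyzer()
for post in posts:
    scores = analyzer.polarity_scores(post)
    # The compound score ranges from -1 (negative) to +1 (positive).
    print(f"{scores['compound']:+.2f}  {post}")
```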

We recognise that this data can be valuable for PES employees in supporting reflection on learning. But rather than seeking to develop a separate dashboard for reporting on data, we are attempting to embed representations of learning within the context in which the learning takes place. Thus, the social platform allows users to visualise their different interactions through the platform. Other work, such as the facilitation coding scheme, does not yet allow real time analytics. But if it proves successful as a research approach to understanding and supporting learning, then it could potentially be automated or semi-automated to provide such real time feedback.

The open in MOOC must include the ability to create courses

March 14th, 2017 by Graham Attwell

However you view MOOCs, they have been a success in moving towards open education and in allowing thousands of people not enrolled in formal education programmes to take part in courses.

But in all the talk about open and MOOCs one issue worries me: access to platforms. Yes, the best MOOCs and the better platforms encourage conversation between learners and even promote the idea of learners being facilitators. Yet the ability to create a MOOC is largely confined to those in a commercial company or, mainly, in Higher Education establishments. Increasingly, MOOC platforms are only accessible to those who are part of one or another of the consortia which have emerged between different education institutions, or to those with the money to pay a private MOOC provider. OK, it is possible to hack a MOOC platform together with WordPress or to install Open edX. But it isn’t simple. The Emma project and platform have opened up possibilities to host MOOCs in Europe, but I am not sure that this will continue to be supported after their EU funding runs out.

If we want truly open education, then we need to open up opportunities for creating and facilitating learning as well as for participating in a programme. I still like Ivan Illich’s 1971 dream in Deschooling Society of a big computer which could send postcards to match those wanting to learn something with those willing to support them. And I see an open MOOC infrastructure as the way we might achieve this. Of course there are concerns over quality. But surely we can find ways of peer reviewing proposed courses and supporting course creators to achieve not only high quality but truly imaginative pedagogic approaches to learning through a MOOC. Quality is not just predicated on the cost of the video production.

I wonder if, rather than the formation of big consortia, a more democratic federation could be the way to go. It is disappointing to see that FutureLearn has announced that those students who fail to pay a fee (or, as they put it, an ‘upgrade’) will no longer be able to access content following the end of a course. This is just one more reason why we need an open MOOC infrastructure or ecology if MOOCs are to be truly open.

More about competency based education

March 13th, 2017 by Graham Attwell

Just a quick addendum to my earlier article on competency based education. Things go round, and much of my experience comes from providing professional development to teachers and trainers in the UK around the introduction of the competency based National Vocational Qualifications (NVQs) in the early 1990s. At the same time the National Council for Vocational Qualifications, charged with the development of the NVQs, established a development group of which I was a member, and which held lengthy discussions over issues arising in the trialling and evaluation of the new qualifications. I talked about some of these issues in the last article, including the particularly contentious discussions around knowledge.

Another issue which caused much debate was that of level. In the past, longer courses tended to carry a higher level, but since competence based qualifications were supposed to replace the outdated time serving involved in learning, this was a non-starter. Some argued that it was illogical to even try to ascribe a level to a competence – either someone was competent or they were not. But political pressures meant there must be levels, so the pressure then was to find any consistent way of doing it. It is some time since I last looked at UK vocational qualifications and I do not know how the levels are presently being determined. But back in the 1990s the level was essentially determined by the degree of autonomy or responsibility that someone held in a job. If they mostly followed instructions then their competence was at a low level; if they were responsible for managing others, their level of competence was much higher. Of course, this led to all kinds of strange anomalies which were, in the best traditions of UK pragmatism, juggled with until the level of the qualification felt about right.

I will try and find some of the longer papers I wrote at the time which may still have some relevance for current debates over competency based education.


What’s the problem with competency based education?

March 8th, 2017 by Graham Attwell

[Diagram: competency based learning]

I got this diagram from a report by Katherine Hockey on the Accelerating Digital Transformation in Higher Education conference. I’ll write more about some of the topics Katherine raises, but for the moment I just want to focus on Competency Based Learning.

Katherine says “Higher education learning may be open to change regarding teaching methods. Competency-based learning is being implemented at UEL, whereby the structure of a course is not linear in the traditional sense: the learner chooses modules at an order and pace that suits them. This aims at increasing employability, and was met initially with reservations but soon became popular with academics.”

Let’s take the first sentence first. It seems to me that there is no doubt that Higher Education in general is open to new teaching methods. In the past there may have been resistance to using technology in education – partly based on a lack of competence and confidence in using technology as part of teaching and learning – but there have always been islands of exciting experimentation and innovation. The question has been how to move out from the islands.

But it is a big jump to equate openness to change with competency based education. Competency based education itself is hardly new – in the early 1990s the UK reformed its Vocational Education and Training provision to move to competency based qualifications under the National Council for Vocational Qualifications (NCVQ). It was not an unqualified success. And there are disturbing parallels between what NCVQ said at the time and the University of East London’s diagram.

First is the myth that employers always know best. Just why a qualification developed with employers should be valid, and one developed without employers not, is beyond me. The problem with employers is that they tend to look to the present or the short term future in defining skills requirements. And there is a difference between the skills that individual employers may require – or even groups of employers – and the wider knowledge and skills required to be flexible and forward looking in employment today. But even then would this be con

Another problem that beset NVQs was the relationship between ‘competence’ and knowledge, and how to define performance to meet such competence. The NVQ system evolved, starting out with bald functional competence statements (yes, developed with employers), but later including ‘performance criteria’ and ‘underpinning knowledge’. But even then, was achievement of these standards to be considered ‘mastery’? Some argued that it would be necessary to define the context in which the skills should be evidenced, others that there should be frequent (although how many and how often was never agreed) demonstrations of performance. And then of course there was the question of who is qualified to recognise the University of East London student’s mastery. How is their competence to act as assessors to be defined and assessed?

One of the big arguments for National Vocational Qualifications was the need to move away from time serving and to have personalised and flexible routes whereby individuals could choose what they wanted to learn. In fact, some at NCVQ went further, arguing that learning as an activity should be separated from qualifications. Once more, few went down this route. Courses continued to be the way to qualifications, although there were a number of (quite expensive) experiments with recognising prior competences.

I would be deeply suspicious of just what they mean by “tuition model is subscription based”. This seems like just another attempt to package up education for sale in nice chunks: a step forward in the privatisation of education. But if past experiences of competency evangelism are anything to go by, this one will fail.

Constructing learning

March 7th, 2017 by Graham Attwell


Interesting report in the Jisc email. They say:

“Blended learning (the merging of technology and face-to-face) involves learners in the construction of their own learning. But a recent survey by Sheffield Hallam University showed that there’s inconsistency in learners’ experiences of this – a concern likely shared across the country.

Students also said that they expected the majority of their learning to be supported by an online platform. As a result, Sheffield Hallam University has created a set of “minimum expectations” for their teaching staff to encourage them to publish learning resources online, give online assessment feedback and use social media for student-staff collaboration.”

Without having read the full report from Sheffield, I wonder how much learners on blended learning programmes really are involved in the construction of their own learning, and how they are supported in that process. It is also interesting to see the university turning to social media for student-staff collaboration. I guess I need to read the report!


Interpreting and presenting data

March 1st, 2017 by Graham Attwell

I have been working on the contents for week 4 of the free #EmployID MOOC on The Changing World of Work, taking place on the European EMMA platform and starting in late March. Week 4 is all about Labour Market Information – or, as I prefer to call it, Labour Market Intelligence – and how labour market data can be used both by job seekers and young people choosing careers and by advisers and other professionals working in the careers and labour market domain.

One of the major challenges is how to represent data. This presentation, Data is beautiful: Techniques, tools and apps for sharing your results by Laura Ennis, provides some good practical advice on how to present data. It comes from a talk she gave at Leap Into Research 2017.
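
By way of a small example of turning labour market data into something readable, the sketch below draws a simple horizontal bar chart. The occupation names and vacancy counts are invented for illustration; they are not drawn from any real LMI source.

```python
# Minimal sketch: presenting (hypothetical) labour market data as a chart.
import matplotlib.pyplot as plt

occupations = ["Care workers", "Software developers", "Nurses", "Chefs"]
vacancies = [12500, 9800, 8700, 6400]   # hypothetical monthly vacancy counts

fig, ax = plt.subplots(figsize=(7, 4))
ax.barh(occupations, vacancies, color="#4c72b0")
ax.set_xlabel("Advertised vacancies (per month)")
ax.set_title("Hypothetical vacancy counts by occupation")
ax.invert_yaxis()           # largest category at the top
fig.tight_layout()
fig.savefig("vacancies.png", dpi=150)
```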

Making Multimedia for MOOCs

February 22nd, 2017 by Graham Attwell


I’ve been bogged down for the past two weeks writing reports and trying to catch up on a dreadful backlog of work. But that’s another story.

Amongst other things, this week I am producing content for the European EmployID MOOC on the ‘Changing World of Work’. As the blurb says:

Do you want to be prepared for the challenges of the changing labour market?

Do you want to better understand and apply skills related to emotional awareness, active listening, reflection, coaching skills, peer coaching and powerful questioning?

Do you want to explore tools for handling Labour Market Information (LMI) and the digital agenda?

This course has been devised as part of the European EmployID project, for Public Employment Services (PES) practitioners and careers professionals. Our 5 lessons will run over a period of 6 weeks with an estimated workload of 3.5 hours per week; the total workload is expected to be 17.5 hours.

I am producing the content for week 5, on Labour Market Information. It’s not by any means the first content I have written for online courses, but I still feel I am learning.

I find it quite hard to gauge how much content to produce and how long it will take to work through it. I also find it hard switching from writing academic stuff and reports to writing course material and getting the language register right.

One thing I am trying to do is add more multimedia content. The big issue here is workflow and production. I am pretty happy with the video above. OK, it only lasts one and a half minutes, but I managed to make it from scratch in about two hours.

I made it using the Apple Keynote presentation software. All the images come from the brilliant Pixabay website and are in the public domain. And then it was just a question of adding the audio, which can now be done inside Keynote, exporting to video and uploading to YouTube. I am planning to make two or three more videos as part of the course. It is much faster than editing video and still produces a reasonable result, I think.

Technology is only useful if it involves no extra effort!

February 7th, 2017 by Graham Attwell

Nancy Dixon has published an interesting review of a study entitled “To Share Or Not To Share: An Exploratory Review Of Knowledge Management Systems And Knowledge Sharing in Multinational Corporations” (for the full reference see below).

The authors define knowledge sharing as “the movement of knowledge between different individuals, departments, divisions, units or branches in Multinational Corporations through Knowledge Management Systems (KMSs)”. The study was based on semi-structured interviews with 42 participants across 32 organizations in 12 countries.

Nancy Dixon says one of the main findings of the study was that the acceptance of technology for knowledge sharing is directly related to how employees view the usefulness of the technology in supporting their job performance, without extra effort. Interviewees said they are more likely to use their KMS if it is similar to the tools they already use at home, e.g. Facebook, Twitter, YouTube, Wikipedia.

Part of the work we have been doing in the EU Learning Layers project has been developing and evaluating tools for informal learning in Small and Medium Enterprises. Our findings are similar, in that tools should require no extra effort. One reason may simply be the speed-up and pressure in the work process, particularly in the National Health Service in the UK. Another may be a lack of familiarity and confidence in the use of technology based tools, especially tools for collaboration. Although most jobs today require some form of collaboration, much of that still happens through face to face contact or by email. The move to collaborative tools for knowledge sharing is non-trivial.

The findings of the study and of our own work pose particular problems for research, design and development. I remain wedded to the idea that co-design processes are critical to designing and developing tools to support informal learning and knowledge sharing in the workplace. Yet at the same time, iterative design processes will be problematic if employees are unwilling or unable to rethink work processes.

Another finding from the Knowledge Management study – that interviewees are more likely to use their Knowledge Management System if it is similar to the tools they already use at home – requires, I think, a little interpretation. Instead of similar, I suspect that people are referring to ease of use and to design motifs. Of course software changes. The interface to Slack is very different to that of the collaborative software platforms that came before, and Facebook has undergone numerous redesigns. But one of the big problems for relatively modestly funded research and development projects in learning and in knowledge management is that we tend not to worry too much about interface design. That is always something that can be done later. But users do worry about the interface, and about appearance and ease of use.

I increasingly suspect that the acceptance, adoption and use of new (innovative) tools for learning and knowledge management rests with processes of digital transformation in organisations. Only when the tools themselves are linked to changing practices (individual and collective) will there be substantial uptake.

Abdelrahman, M., Papamichail, K. N., & Wood-Harper, T. (2016). To Share Or Not To Share: An Exploratory Review Of Knowledge Management Systems And Knowledge Sharing in Multinational Corporations. In: UK Academy for Information Systems (UKAIS) 21st Annual Conference, 11-13 April 2016, Oxford.

 

The unwritten rules of engagement

January 16th, 2017 by Graham Attwell

Fascinating research from April Yee, program officer for the James Irvine Foundation in the USA. In a report entitled “The Unwritten Rules of Engagement: Social Class Differences in Undergraduates’ Academic Strategies”, reported in Times Higher Education, Yee says that even when students from socio-economically disadvantaged backgrounds are able to access higher education, they face further challenges that their more privileged counterparts do not. This, she believes, is due to different learning strategies. Whilst the learning strategies of middle class students are recognised by the institutions, the strategies of first generation working class students are not.

“First-generation students believe that they are responsible for earning good grades on their own,” she writes.

“First-generation students employ engagement strategies that emphasise independence while middle-class students…emphasise interaction, in addition to independence. Thus middle-class students are more likely to achieve not because they exert more absolute effort, but because they employ a wider range of strategies.”

She adds that the research, published in the Journal of Higher Education, “points to the role of institutions in defining the implicit rules of engagement, such that middle-class strategies of interaction are recognised and rewarded while first-generation strategies of independence are largely ignored”.

Of course all this leaves more questions than it answers (and is why people should read full reports, rather than rely on the Times Higher digest). I am interested in just what an engagement strategy that emphasises interaction looks like. To what degree can the design of student assignments, for instance with groupwork, support interaction – if indeed such a learning strategy needs to be supported? And if this research holds true for universities, what might it mean for the schools sector?

Learning Analytics and the Peak of Inflated Expectations

January 15th, 2017 by Graham Attwell

Has Learning Analytics dropped off the peak of inflated expectations in Gartner’s hype cycle? According to Educause, ‘Understanding the power of data’ is still there as a major trend in higher education, and Ed Tech reports a KPMG survey which found that 41 percent of universities were using data for forecasting and predictive analytics.

But whilst many universities are exploring how data can be used to improve retention and prevent drop outs, there seems little pretence any more that Learning Analytics has much to do with learning. The power of data has somehow got muddled up with Management Analytics, Performance Analytics and all kinds of other analytics – but the learning seems to have been lost. Data mining is great but it needs a perspective on just what we are trying to find out.

I don’t think Learning Analytics will go into the trough of despair. But I think that there are very real problems in working out how best we can use data – and particularly how we can use data to support learning. Learning Analytics needs to be more solidly grounded in what is already known about teaching and learning. Stakeholders, including teachers, learners and the wider community, need to be involved in the development and implementation of learning analytics tools. Overall, more evidence is needed to show which approaches work in practice and which do not.

Finally, we already know a great deal about formal learning in institutions, or at least by now we should do. Of course we need to work at making it better. But we know far less about informal learning and the learning which takes place in everyday living and working environments. And that is where I ultimately see Learning Analytics making a big difference. Learning Analytics could potentially help us all to become self directed learners and to achieve the learning goals that we set ourselves. But that is a long way off. Perhaps if Learning Analytics is falling off the peak of expectations, that will provide the space for longer term, more clearly focused research and development.

 

    News Bites

    MOOC providers in 2016

    According to Class Central, a quarter of the new MOOC users in 2016 came from regional MOOC providers such as XuetangX (China) and Miríada X (Latin America).

    They list the top five MOOC providers by registered users:

    1. Coursera – 23 million
    2. edX – 10 million
    3. XuetangX – 6 million
    4. FutureLearn – 5.3 million
    5. Udacity – 4 million

    XuetangX burst onto this list, making it the only non-English MOOC platform in the top five.

    In 2016, more than 2,600 new courses (vs. 1,800 the year before) were announced, taking the total number of courses to 6,850 from over 700 universities.


    Jobs in cyber security

    In a new fact sheet the Tech Partnership reveals that the UK cyber security workforce has grown by 160% in the five years to 2016. 58,000 people now work in cyber security, up from 22,000 in 2011, and they command an average salary of over £57,000 a year – 15% higher than tech specialists as a whole, and up 7% on last year. Just under half of the cyber workforce is employed in the digital industries, while banking accounts for one in five, and the public sector for 12%.


    Number of students from outside the EU falls in UK

    Times Higher Education reports the number of first-year students from outside the European Union enrolling at UK universities fell by 1 per cent from 2014-15 to 2015-16, according to data released by the Higher Education Statistics Agency.

    Data from the past five years show which countries are sending fewer students to study in the UK.

    Despite a large increase in the number of students enrolling from China, a cohort that has grown by 12,500 since 2011-12, enrolments by students from India fell by 13,150 over the same period.

    Other notable changes include an increase in students from Hong Kong, Singapore and Malaysia and a fall in students from Saudi Arabia and Nigeria.


    Peer Review

    According to the Guardian, research conducted with more than 6,300 authors of journal articles, peer reviewers and journal editors revealed that over two-thirds of researchers who have never peer reviewed a paper would like to. Of that group (drawn from the full range of subject areas), more than 60% said they would like the option to attend a workshop or formal training on peer reviewing. At the same time, over two-thirds of journal editors told the researchers that it is difficult to find reviewers.

