Introduction

    Welcome to the Wales Wide Web

    October 25th, 2007 by Dirk Stieglitz

    Wales Wide Web is Graham Attwell’s main blog. Graham Attwell is Director of the Wales-based research organisation Pontydysgu. The blog covers issues such as open source, open content, open standards, e-learning and the Werder Bremen football team.

    You can reach Graham by email at graham10 [at] mac [dot] com


    More thoughts on Workplace Learning Analytics

    April 18th, 2017 by Graham Attwell

    I have been looking at the potential of Learning Analytics (LA) for the professional development of employees in European Public Employment Services as part of the European funded EmployID project. Despite interest, particularly from Learning and Development personnel within the employment services, Learning Analytics has made only limited impact, and this reflects the slow take-up of LA in the workplace as a whole.

    The reasons for this are myriad. Universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. The use of VLEs in the workplace is limited, and “collecting traces that learners leave behind” (Duval, 2012) may fail to take cognizance of the multiple modes of formal and informal learning in the workplace and of the importance of key indicators such as collaboration. Ferguson (2012) notes that in LA implementations in formal education “LA is aligned with clear aims and there are agreed proxies for learning.” The most commonly agreed proxy of learning achievement is the achievement of outcomes in terms of examinations and assignments. Yet in the workplace, assignment driven learning plays only a limited role, mostly in formal courses and initial vocational education and training.
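    To make that university-style approach concrete, here is a deliberately toy sketch: a simple classifier fitted to invented VLE activity traces (logins, forum posts, assignments submitted) to flag learners at risk of dropping out. The features, figures and model are hypothetical assumptions and are not drawn from any real institution or from the EmployID project.

        # Toy sketch only: estimating drop-out risk from invented VLE activity traces.
        # Real institutional analytics would use actual logs and far richer features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical "traces learners leave behind": weekly logins, forum posts,
        # assignments submitted, plus a drop-out label (1 = dropped out).
        X = np.array([
            [12, 5, 4], [3, 0, 1], [9, 2, 3], [1, 0, 0],
            [15, 8, 4], [2, 1, 1], [11, 4, 3], [0, 0, 0],
        ])
        y = np.array([0, 1, 0, 1, 0, 1, 0, 1])

        model = LogisticRegression().fit(X, y)

        # Flag a learner whose activity has tailed off as a candidate for intervention.
        new_learner = np.array([[2, 0, 1]])
        risk = model.predict_proba(new_learner)[0, 1]
        print(f"Estimated drop-out risk: {risk:.2f}")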

    Workplace learning is driven by the demands of work tasks or the intrinsic interests of the learner, by self-directed exploration and by social exchange that is tightly connected to the processes and places of work (Ley et al., 2015). Learning interactions at the workplace are to a large extent informal and practice-based, and are not embedded in a specific and measurable pedagogical scenario.

    In present Learning Analytics developments, there appears to be a tension between measuring and understanding. Pardo and Siemens (2014) say “learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.” This poses particular issues within the workplace with complex social and work structures, hierarchies and power relations.

    Despite these difficulties we remain convinced of the potential value of Learning Analytics in the workplace and in Public Employment Service (PES) organisations. If used creatively, Learning Analytics can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes, and can help them reflect on their learning. Furthermore, LA offers a potential means of providing rapid feedback to trainers and learning designers, and the data can be a tool for researchers in gaining a better understanding of learning processes and learning environments.

    There is some limited emerging research into Workplace Learning Analytics and Social Learning Analytics which offers at least pointers towards developing such potential. Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings, taking into account both formal and informal learning environments, including networks and communities. Buckingham Shum and Ferguson (2012) suggest that social network analysis focusing on interpersonal relations in social platforms, discourse analytics predicated on the use of language as a tool for knowledge negotiation and construction, content analytics looking particularly at user-generated content, and disposition analytics can all be developed to make sense of learning in a social setting.
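    As a rough illustration of the first of these strands, the sketch below builds a who-replies-to-whom graph from some invented platform interactions and computes a simple centrality measure. The data and the choice of measure are illustrative assumptions; real social network analysis would work with actual interaction logs and richer metrics.

        # Minimal sketch of the social network analysis strand of SLA: build a
        # who-replies-to-whom graph and look at degree centrality.
        # The interaction data is invented for illustration.
        import networkx as nx

        replies = [
            ("ana", "ben"), ("ben", "ana"), ("cai", "ana"),
            ("dee", "ana"), ("ben", "cai"), ("dee", "ben"),
        ]

        G = nx.DiGraph()
        G.add_edges_from(replies)

        # Degree centrality as a crude indicator of who is most connected in the
        # discussion; richer measures (betweenness, clustering) could follow.
        for person, score in sorted(nx.degree_centrality(G).items(),
                                    key=lambda kv: kv[1], reverse=True):
            print(f"{person}: {score:.2f}")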

    Such an approach to Social Learning Analytics links to the core aims of the EmployID project: to support and facilitate the learning process of PES practitioners in their professional identity development through the efficient use of technologies to provide social learning, including advanced coaching, reflection, networking and learning support services. The project focuses on technological developments that make facilitation services for professional identity transformation cost-effective and sustainable by empowering individuals and organisations to engage in transformative practices, using a variety of learning and facilitation processes.

    It should also be noted that although Learning Analytics has been linked to the collection and analysis of ‘big data’, MacNeill (2016) stresses the importance of fast data, actionable data, relevant data and smart data. LA, she says, should start from research questions that arise from teaching practice, rather than from the more common approach of building analytics on data that has already been collected and is readily available.

    Learning Analytics has been the subject of ongoing discussion in the EmployID project, and particularly with the PES organisations. Although a number of PES organisations are interested in the possibility of adopting LA, it is not a major priority for them at present and they are aware of the constraints outlined above. Our initial experiences with sentiment analysis confirm this general interest as well as its limitations within public organisations. It has also become apparent that there are major overlaps between the Social Learning Analytics approach and the tools and approaches we have been developing for evaluation. Our work in evaluation encompasses looking at interpersonal relations in social platforms, discourse analytics based on the EmployID MOOCs, as well as learners’ own mapping of their progress through the self-assessment questionnaire.
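    By way of illustration of the sort of thing sentiment analysis does, a crude lexicon-based scoring of discussion posts might look like the sketch below. This is not the sentiment analysis used in EmployID; the word lists and posts are invented.

        # Toy lexicon-based sentiment scoring over discussion posts, for illustration
        # only; production sentiment analysis would use far more sophisticated models.
        POSITIVE = {"helpful", "useful", "clear", "enjoyed", "great"}
        NEGATIVE = {"confusing", "difficult", "frustrating", "unclear", "boring"}

        def sentiment(text: str) -> int:
            """Return a crude score: positive words minus negative words."""
            words = text.lower().split()
            return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

        posts = [
            "Really enjoyed this week, the examples were clear and useful",
            "Found the peer review task confusing and a bit frustrating",
        ]
        for post in posts:
            print(sentiment(post), post)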

    We recognise that this data can be valuable for PES employees in supporting reflection on learning. But rather than seeking to develop a separate dashboard for reporting on data, we are attempting to embed representations of learning within the context in which the learning takes place. Thus, the social platform allows users to visualise their different interactions through the platform. Other work, such as the facilitation coding scheme, does not yet allow real-time analytics. But if it proves successful as a research approach to understanding and supporting learning, then it could potentially be automated or semi-automated to provide such real-time feedback.



    The open in MOOC must include the ability to create courses

    March 14th, 2017 by Graham Attwell

    However you view MOOCs, they have been a success in moving towards open education and in allowing thousands of people not enrolled in formal education programmes to take part in courses.

    But in all the talk about open and MOOCs one issue worries me: access to platforms. Yes, the best MOOCs and the better platforms encourage conversation between learners and even promote the idea of learners acting as facilitators. Yet the ability to create a MOOC is largely confined to those working in a commercial company or, mainly, in Higher Education establishments. Increasingly, MOOC platforms are only accessible to those who are part of one or another of the consortia which have emerged between different education institutions, or to those with money to pay a private MOOC provider. OK, it is possible to hack a MOOC platform together with WordPress or to install Open edX. But it isn’t simple. The Emma project and platform have opened up possibilities for hosting MOOCs in Europe, but I am not sure that this will continue to be supported after their EU funding runs out.

    If we want truly open education, then we need to open up opportunities for creating and facilitating learning as well as for participating in a programme. I still like Ivan Illich’s 1971 dream in Deschooling Society of a big computer which could send postcards to match those wanting to learn something with those willing to support them. And I see an open MOOC infrastructure as the way we might achieve this. Of course there are concerns over quality, but surely we can find ways of peer reviewing proposed courses and supporting course creators to achieve not only high quality but truly imaginative pedagogical approaches to learning through a MOOC. Quality is not just predicated on the cost of the video production.

    I wonder if, rather than the formation of big consortia, more democratic federation could be the way to go. It is disappointing to see that FutureLearn has announced that those students who fail to pay a fee (or, as they put it, an ‘upgrade’) will no longer be able to access content following the end of a course. This is just one more reason why we need an open MOOC infrastructure or ecology if MOOCs are to be truly open.



    More about competency based education

    March 13th, 2017 by Graham Attwell

    Just a quick addendum to my earlier article on competency based education. Things go round, and much of my experience comes from providing professional development to teachers and trainers in the UK around the introduction of the competency based National Vocational Qualifications (NVQs) in the early 1990s. At the same time the National Council for Vocational Qualifications, charged with the development of the NVQs, established a development group of which I was a member and which held lengthy discussions over issues which arose in trialling and evaluating the new qualifications. Some of those issues I covered in the last article, including the particularly contentious discussions around knowledge.

    Another issue which caused much debate was that of level. In the past, longer courses tended to have a higher level, but since competence based qualifications were supposed to replace the outdated time-serving involved in learning, this was a non-starter. Some argued that it was illogical to even try to prescribe a level to a competence – either someone was competent or they were not. But political pressures meant there must be levels, so the pressure then was to find any consistent way of doing it. It is some time since I last looked at UK vocational qualifications and I do not know how the levels are presently determined. But back in the 1990s it was essentially determined by the degree of autonomy or responsibility that someone held in a job. If they mostly followed instructions then their competence was at a low level; if they were responsible for managing others their level of competence was much higher. Of course, this led to all kinds of strange anomalies which were, in the best traditions of UK pragmatism, juggled with until the level of the qualification felt about right.

    I will try and find some of the longer papers I wrote at the time which may still have some relevance for current debates over competency based education.



    What’s the problem with competency based education?

    March 8th, 2017 by Graham Attwell

    [Diagram: competency based learning]

    I got this diagram from a report by Katherine Hockey on the Accelerating Digital Transformation in Higher Education conference. I’ll write more about some of the topics Katherine raises, but for the moment I just want to focus on Competency Based Learning.

    Katherine says “Higher education learning may be open to change regarding teaching methods. Competency-based learning is being implemented at UEL, whereby the structure of a course is not linear in the traditional sense: the learner chooses modules at an order and pace that suits them. This aims at increasing employability, and was met initially with reservations but soon became popular with academics.”

    Let’s take the first sentence first. It seems to me that there is no doubt that Higher Education in general is open to new teaching methods. There may have been resistance in the past to using technology in education – partly based on a lack of competence and confidence in using technology as part of teaching and learning – but there have always been islands of exciting experimentation and innovation. The question has been how to move out from the islands.

    But it is a big jump to equate openness to change with competency based education. Competency based education itself is hardly new – in the early 1990s the UK reformed its Vocational Education and Training provision to move to competency based qualifications under the National Council for Vocational Qualifications (NCVQ). It was not an unqualified success. And there are disturbing parallels between what NCVQ said at the time and the University of East London’s diagram.

    First is the myth that employers always know best. Just why a qualification developed with employers should be valid, and one developed without employers not, is beyond me. The problem with employers is that they tend to look to the present or the short-term future in defining skills requirements. And there is a difference between the skills that individual employers – or even groups of employers – may require and the wider knowledge and skills required to be flexible and forward looking in employment today. But even then would this be con

    Another problem that beset NVQs was the relationship between ‘competence’ and knowledge, and how to define performance to meet such competence. The NVQ system evolved, starting out with bald functional competence statements (yes, developed with employers), but later including ‘performance’ criteria and ‘underpinning knowledge.’ But even then, was achievement of these standards to be considered ‘mastery’? Some argued that it would be necessary to define the context in which the skills should be evidenced, others that there should be frequent (although how many and how often was never agreed) demonstrations of performance. And then of course there is the question of who is qualified to recognise the University of East London student’s mastery. How is their competence to act as assessors to be defined and assessed?

    One of the big arguments for National Vocational Qualifications was the need to move away from time serving and have personalised and flexible routes whereby individuals could choose what they wanted to learn. In fact, some at NCVQ went further, arguing that learning as an activity should be separated from qualifications. Once more, few went down this route. Courses continued to be the way to qualifications, although there were a number of (quite expensive) experiments with recognising prior competences.

    I would be deeply suspicious of just what they mean by “tuition model is subscription based”. This seems like just another attempt to package up education for sale in nice chunks: a step forward in the privatisation of education. But if past experiences of competency evangelism are anything to go by, this one will fail.



    Constructing learning

    March 7th, 2017 by Graham Attwell

    Interesting report in the Jisc email. They say:

    “Blended learning (the merging of technology and face-to-face) involves learners in the construction of their own learning. But a recent survey by Sheffield Hallam University showed that there’s inconsistency in learners’ experiences of this – a concern likely shared across the country.

    Students also said that they expected the majority of their learning to be supported by an online platform. As a result, Sheffield Hallam University has created a set of “minimum expectations” for their teaching staff to encourage them to publish learning resources online, give online assessment feedback and use social media for student-staff collaboration.”

    Without having read the full report from Sheffield, I wonder how much learners on blended learning programmes really are involved in the construction of their own learning and how they are supported in that process. It is also interesting to see the university turning to social media for student-staff collaboration. Guess I need to read the report!



    Interpreting and presenting data

    March 1st, 2017 by Graham Attwell

    I have been working on the contents for week 4 of the free #EmployID MOOC on The Changing World of Work, taking place on the European EMMA platform and starting in late March. Week 4 is all about Labour Market Information – or, as I prefer to call it, Labour Market Intelligence – and how labour market data can be used both by job seekers and young people choosing careers and by advisers and other professionals working in the careers and labour market domain.

    One of the major challenges is how to represent data. This presentation, Data is beautiful: Techniques, tools and apps for sharing your results by Laura Ennis, provides some good practical advice on how to present data. It comes from a talk she gave at Leap Into Research 2017.


