Archive for the ‘Pedagogy’ Category

Workplace Learning Analytics for Facilitation in European Public Employment Services

April 29th, 2016 by Graham Attwell

This week I have been at the pre-conference workshops for the Learning Analytics conference in Edinburgh. This is my presentation at the workshop on Workplace Learning Analytics. Below is the abstract of my paper, together with a link to download the full paper if you wish. In the next few days I will write up a reflection on the workshops, plus some new ideas that emerged from talking with participants.
Abstract

The paper is based on early research and practices in developing workplace Learning Analytics for the EU funded EmployID project, focused on identity transformation and continuing professional development in Public Employment Services (PES) in Europe. Workplace learning is mostly informal, with little agreement on proxies for learning; it is driven by the demands of work tasks or the intrinsic interests of the learner, and by self-directed exploration and social exchange that is tightly connected to the processes and places of work. Rather than focusing on formal learning, LA in PES needs to be based on individual and collective social practices and on informal learning and facilitation processes rather than formal education. Furthermore, there are considerable concerns and constraints over the use of data in PES, including data privacy and issues such as power relations and hierarchies.

Following a consultation process about what innovations PES would like to pilot and what best meets their needs, PES defined priorities for competence advancement around the ‘resourceful learner’, self-reflection and self-efficacy as core competences for their professional identity transformation. The paper describes an approach based on Social Learning Analytics linked to the activities of the EmployID project in developing social learning including advanced coaching, reflection, networking and learning support services. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. The final section of the paper reports on work in progress to build a series of tools to embed SLA within communities and practices in PES organisations.

Download the paper (PDF)

Mobile Learning – the Dream goes on

February 29th, 2016 by Graham Attwell

“What killed the mobile learning dream?” asks John Traxler in an article for Jisc’s Digifest. John goes on to say:

Mobile learning was e-learning’s dream come true. It offered the potential for completely personalised learning to be truly any time, anywhere.

Instead, we’ve ended up with mobile access to virtual learning environments that are being used as repositories. So, in practice, students reading their notes on the bus.

He’s right, but I am not sure his reasons are sufficient. The main problem John sees is that early mobile learning projects were based on supplying participants with digital devices. This was expensive and limited the scale and sustainability of such projects. Now new initiatives are emerging based on BYOD (Bring Your Own Device). This is more sustainable but raises its own questions.

Bring your own device, enabling students to use their own equipment, introduces more questions: is there a specific range of technologies they can bring, what’s the nature of the support offered, and have we got a network infrastructure that won’t fall over when 20,000 students turn up with gadgets? What kind of staff development is needed to handle the fact that not only will the students turn up with many different devices but tomorrow they’ll have changed to even more different devices?

All this is true. And as we prepare to roll out the trial of our Learning Layers project funded Learning Toolbox (LTB) application, we are only too aware that, as well as looking at the technical and pedagogic application of Learning Toolbox, we will have to evaluate the infrastructure support. The use of Learning Toolbox has been predicated on BYOD, and it has been developed in Android, iOS and Microsoft versions. The training centre where the pilot will take place with some 70 apprentices, BauABC, covers a large site in a rural area. Telecoms network coverage is flaky, broadband is not fast, and the wireless network installed to support the pilots is a new venture. So there are many issues for us to look at. In terms of staff development, however, I am more confident: there is an ongoing programme for the trainers and, perhaps more importantly, I think construction industry trainers have a more open attitude to the use of different technologies than, say, university lecturers.

The bigger issue for me, though, is pedagogy. John hints at this when he talks about mobiles being used to access virtual learning environments that are being used as repositories. The real limitation here is not in the technology or infrastructure but a lack of vision of the potential of mobiles for learning in different contexts. Indeed, I suspect the primary school sector is more advanced in its thinking here than the universities. Mobile devices have the potential to take learning into the world outside the classroom and to link practical with more theoretical learning. And rather than merely pushing learning (to be read on the bus, although I have never quite understood why mobile learning vendors think everyone travels home by bus), the potential is to create a new ecosystem in which learners themselves can contribute to the learning of others, by direct interaction and by sharing learning and objects. Dare I say it: Learning Toolbox is a mobile Personal Learning Environment (at least I hope so). We are certainly not looking to replace existing curricula, nor existing learning technologies. Rather, we see Learning Toolbox as enhancing learning experiences and allowing users to reflect on learning in practice. In this respect we are aware of the constraints of a small screen, and of the lack of appeal of writing long texts for many vocational learners. This can be an advantage: mobile devices support all kinds of gesturing (think Tinder) and are naturally used for multimedia, including video and photographs.

So what killed the mobile learning dream? A lack of understanding of its true potential, a lack of vision, and a concentration of funding and pilot activities on the wrong user groups. That is not to say that mobile learning cannot be used in higher education. But it needs a rethinking of curriculum and of the interface between curriculum, pedagogy and the uses of technology. So the dream is not dead. It just needs more working on!

If you would like to know more about Learning Toolbox, or are interested in a demonstration or a pilot, please contact me: graham10 [at] mac [dot] com

The future of learning at work. How technology is influencing working and learning in healthcare: Preparing our students and ourselves for this future

February 16th, 2016 by Graham Attwell

Over the last few weeks I seem to have been bombarded with calls for submissions for conferences. Most I have ignored on the grounds that they are just too expensive. And if I can’t afford them, working as a relatively senior researcher with project funding, what hope do emerging researchers have of persuading their universities or companies to pay? But to be honest, I am bored with most of the conferences: formal papers, formally presented, with perhaps ten or twenty people in a session and very limited time for discussion. We know there are better ways of learning!

One conference I have submitted an abstract to is AMEE, the International Association for Medical Education. Apart from short communications, research papers and PhD presentations, AMEE invites posters, Pecha Kucha sessions, workshops and points of view, and organises a fringe to the conference. Sounds good to me, and as you might guess I have submitted a point of view. Here goes (in 300 words precisely) ……

The future of work is increasingly uncertain and that goes just as much for healthcare as other occupations. An ageing population is resulting in increasing demand for healthcare workers and advances in technology and science are resulting in new healthcare applications. At the same time technology promises a revolution in self-diagnosis, whilst Artificial Intelligence and robots may render many traditional jobs obsolete.

So what can we say about healthcare skills for the future, and what does it mean for healthcare education? Whilst machines may take over more unskilled work, there is likely to be increasing demand for highly skilled specialist healthcare workers, as well as for those caring for the elderly. These staff need to be confident and competent in using existing technologies and in adapting to technologies of the future.

They will need to be self-motivated lifelong learners, resilient and capable of coping with changing occupational identities. They will need to collaborate in multidisciplinary teams leading to a high premium on communication skills.

Present processes of education and training, based predominantly on face-to-face courses, cannot cope with the needs of lifelong learning. Learning needs to be embedded in everyday work processes. Technology is critical here; ubiquitous connectivity and mobile devices allow context-based learning. The same technologies can promote informal and social learning, learning from peers, and the sharing of experience and knowledge in personal learning networks. Already there are many MOOCs dedicated to medical education, and healthcare professionals are using social media to build informal learning networks. But these are the exceptions, not the norm. In the future, machine learning algorithms can support individuals wishing to deepen their knowledge, and VR can help them share experiences. Yet although the potential is rich, medical educators have to steer the process. We need to know what works and what doesn’t, to evaluate, and to share. That needs to start now!

Workplace Learning Analytics for Facilitation in European Public Employment Services

February 10th, 2016 by Graham Attwell

Along with colleagues from the EmployID project, I’ve submitted a paper to the workshop on Learning Analytics for Workplace and Professional Learning (LA for Work) at the Learning Analytics and Knowledge Conference (LAK 2016) in April. Below is the text of the paper. (NB If you are interested, the organisers are still accepting submissions for the workshop.)

ABSTRACT

In this paper, we examine issues in introducing Learning Analytics (LA) in the workplace. We describe the aims of the European EmployID research project, which aims to use technology to facilitate identity transformation and continuing professional development in European Public Employment Services. We describe the pedagogic approach adopted by the project, based on social learning in practice, and relate this to the concept of Social Learning Analytics. We outline a series of research questions the project is seeking to explore and explain how these research questions are driving the development of tools for collecting social LA data. As well as providing research data, these tools have been developed to provide feedback to participants on their workplace learning.

1. LEARNING ANALYTICS AND WORK BASED LEARNING

Learning Analytics (LA) has been defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” [1]. It can assist in informing decisions in education systems, promote personalized learning and enable adaptive pedagogies and practices [2].

However, whilst there has been considerable research and development in LA in the formal school and higher education sectors, much less attention has been paid to the potential of LA for understanding and improving learning in the workplace. There are a number of possible reasons for this.

Universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. The use of VLEs in the workplace is limited, and “collecting traces that learners leave behind” [3] may fail to take cognizance of the multiple modes of formal and informal learning in the workplace; key indicators such as collaboration tend to be omitted. Ferguson [4] says that in LA implementations in formal education “LA is aligned with clear aims and there are agreed proxies for learning.” Critically, much workplace learning is informal, with little agreement on proxies for learning. While Learning Analytics in educational settings very often follows a particular pedagogical design, workplace learning is much more driven by the demands of work tasks or the intrinsic interests of the learner, by self-directed exploration and by social exchange that is tightly connected to the processes and places of work [5]. Learning interactions at the workplace are to a large extent informal and practice based, and not embedded in a specific and measurable pedagogical scenario.

Pardo and Siemens [6] point out that “LA is a moral practice and needs to focus on understanding instead of measuring.” In this understanding “learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.” This poses particular issues within the workplace with complex social and work structures, hierarchies and power relations.

Despite these difficulties, workplace learners can potentially benefit from being exposed to their own and others’ learning processes and outcomes, as this potentially allows for better awareness and tracing of learning, sharing experiences, and scaling informal learning practices [5]. LA can, for instance, allow trainers and L&D professionals to assess the usefulness of learning materials, increase their understanding of the workplace learning environment in order to improve it, and intervene to advise and assist learners. Perhaps more importantly, it can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes, and help them in reflecting on their learning.

There have been a number of early attempts to utilise LA in the workplace. Maarten de Laat [7] has developed a system based on Social Network Analysis to show patterns of learning and the impact of informal learning in Communities of Practice for Continuing Professional Development for teachers.

There is a growing interest in the use of MOOCs for professional development and workplace learning. Most (if not all) of the major MOOC platforms have some form of Learning Analytics built in providing both feedback to MOOC designers and to learners about their progress. Given that MOOCs are relatively new and are still rapidly evolving, MOOC developers are keen to use LA as a means of improving MOOC programmes.  Research and development approaches into linking Learning Design with Learning Analytics for developing MOOCs undertaken by Conole [8] and Ferguson [9] amongst others have drawn attention to the importance of pedagogy for LA.

Similarly, there are a number of research and development projects around recommender systems and adaptive learning environments. LA is seen as having strong relations to recommender systems [10], adaptive learning environments and intelligent tutoring systems [11]), and the goals of these research areas. Apart from the idea of using LA for automated customisation and adaptation, feeding back LA results to learners and teachers to foster reflection on learning can support self-regulated learning [12]. In the workplace sphere LA could be used to support the reflective practice of both trainers and learners “taking into account aspects like sentiment, affect, or motivation in LA, for example by exploiting novel multimodal approaches may provide a deeper understanding of learning experiences and the possibility to provide educational interventions in emotionally supportive ways.” [13].

One potential barrier to the use of LA in the workplace is limited data. However, although obviously smaller data sets place limitations on statistical processes, MacNeill [14] stresses the importance of fast data, actionable data, relevant data and smart data, rather than big data. LA, she says, should start from research questions that arise from teaching practice, as opposed to the traditional approach of starting analytics based on already collected and available data. Gasevic, Dawson and Siemens [15]  also draw attention to the importance of information seeking being framed within “robust theoretical models of human behavior” [16]. In the context of workplace learning this implies a focus on individual and collective social practices and to informal learning and facilitation processes rather than formal education. The next section of this paper looks at social learning in Public Employment Services and how this can be linked to an approach to workplace LA.

2. EMPLOYID: ASSISTING IDENTITY TRANSFORMATION THROUGH SOCIAL LEARNING IN EUROPEAN EMPLOYMENT SERVICES

The European EmployID research project aims to support and facilitate the learning process of Public Employment Services (PES) practitioners in their professional identity transformation process. The aims of the project are born of a recognition that, to perform successfully in their jobs, practitioners need to acquire a set of new and transversal skills, develop additional competencies, and embed a professional culture of continuous improvement. However, it is unlikely that training programmes will be able to provide sufficient opportunities for all staff in public employment services, particularly in a period of rapid change in the nature and delivery of such services and of intense pressure on public expenditure. Therefore, the EmployID project aims to promote, develop and support the efficient use of technologies to provide advanced coaching, reflection and networking services through social learning. The idea of social learning is that people learn through observing others’ behaviour, attitudes and the outcomes of those behaviours: “Most human behaviour is learned observationally through modelling: from observing others, one forms an idea of how new behaviours are performed, and on later occasions this coded information serves as a guide for action” [17]. Facilitation is seen as playing a key role in structuring learning and identity transformation activities and in supporting networking in personal networks, teams and organisational networks, as well as cross-organisational dialogue.

Social Learning initiatives developed jointly between the EmployID project and PES organisations include the use of MOOCs, access to Labour Market information, the development of a platform to support the emergence of communities of practice and tools to support reflection in practice.

Alongside such a pedagogic approach to social learning based on practice, the project is developing a strategy and tools based on Social Learning Analytics. Ferguson and Buckingham Shum [18] say that Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. “As groups engage in joint activities, their success is related to a combination of individual knowledge and skills, environment, use of tools, and ability to work together. Understanding learning in these settings requires us to pay attention to group processes of knowledge construction – how sets of people learn together using tools in different settings. The focus must be not only on learners, but also on their tools and contexts.”

Viewing learning analytics from a social perspective highlights the types of analytics that can be employed to make sense of learner activity in a social setting. Ferguson and Buckingham Shum go on to introduce five categories of analytics whose foci are driven by the implications of the ways in which we are using social technology for learning [18]: social network analytics, focusing on interpersonal relations in social platforms; discourse analytics, predicated on the use of language as a tool for knowledge negotiation and construction; content analytics, particularly looking at user-generated content; disposition analytics, for which intrinsic motivation to learn is a defining feature of online social media and lies at the heart of engaged learning and innovation; and context analytics, addressing the mobile and distributed settings in which learning takes place.

The approach to Social Learning Analytics links to the core aims of the EmployID project to support and facilitate the learning process of PES practitioners in their professional identity development by the efficient use of technologies to provide social learning including advanced coaching, reflection, networking and learning support services. The project focuses on technological developments that make facilitation services for professional identity transformation cost-effective and sustainable by empowering individuals and organisations to engage in transformative practices, using a variety of learning and facilitation processes.

3. LEARNING ANALYTICS AND EMPLOYID – WHAT ARE WE TRYING TO FIND OUT?

Clearly there are close links between the development of Learning Analytics and our approach to evaluation within EmployID. In order to design evaluation activities, the project has developed a number of overarching research questions around professional development and identity transformation within Public Employment Services. One of these research questions focuses on LA: which forms of workplace learning analytics can we apply in PES, and how do they impact the learner? How can learning analytics contribute to evaluating learning interventions? Other questions focus on the learning environment and the use of tools for reflection, coaching and creativity, as well as the role of the wider environment in facilitating professional identity transformation. A third focus is how practitioners can better manage their own learning and gain the necessary skills (e.g. self-directed learning skills, career adaptability skills, transversal skills) to support identity transformation processes, as well as facilitating the learning of others by linking individual, community and organizational learning.

These research questions also provide a high-level framework for the development of Learning Analytics, embedded within the project activities and tools. Indeed, much of the data collected for evaluation purposes can also inform Learning Analytics, and vice versa. However, whilst the main aim of the evaluation work is to measure the impact of the EmployID project and to provide useful formative feedback for the development of the project’s tools and overarching blended learning approach, the focus of Learning Analytics is on understanding and optimizing learning and the environments in which it occurs.

4. FROM A THEORETICAL APPROACH TO DEVELOPING TOOLS FOR LA IN PUBLIC EMPLOYMENT SERVICES

Whilst the more practical work is in an initial phase, linked to the roll out of tools and platforms to support learning, a number of tools are under development and will be tested in 2016. Since this work is placed in the particular working environment of public administration, the initial contextual exploration led to a series of design considerations for the LA approaches presented below. Access to fast, actionable, relevant and smart data is, most importantly, regulated by strict data protection and privacy requirements, which clearly play a critical role in any workplace LA. As mentioned above, power relations and hierarchies come into play, and the full transparency to which LA aspires might either be hindered by existing structures or raise expectations that are not covered by existing organisational structures and processes. If efficient learning at the workplace becomes transparent and visible through intelligent LA, what does this mean for career development and promotion? Who has access to the data? How is it linked to existing appraisal systems, or is it sufficient to use the analytics for individual reflection only? For the following LA tools a trade-off needs to be negotiated, and their practicality in workplace settings can only be assessed when they are fully implemented. Clear rules about who has access to the insights gained from LA have to be defined. The current approach in EmployID is thus to focus on the individual learner as the main recipient of LA.

4.1 Self-assessment questionnaire

The project has developed a self-assessment questionnaire as an instrument to collect data from EmployID interventions in different PES organisations and to support reflection on personal development. It contains a core set of questions for cross-case evaluation and LA at project level, as well as intervention-specific questions that can be selected to fit the context. The self-assessment approach should provide evidence of the impact of EmployID interventions whilst addressing the EmployID research questions, e.g. the effectiveness of a learning environment in the specific workplace context. Questions relate to professional identity transformation, including individual, emotional, relational and practical development. For the individual learner, the questionnaire aims to foster their self-reflection process. It supports them in observing the ‘distance travelled’ in key aspects of their professional identity development. Whilst using EmployID platforms and tools, participants will be invited to fill in the questionnaire upon registration and then at periodic intervals. Questions, and the ways in which they are presented, are adapted to the respective tool or platform, such as social learning programmes, the reflective community, or peer coaching.

The individual results and the distance travelled over the different time points will be visualised and presented to individual participants in the form of development curves, based on summary categories, to stimulate self-reflection on learning. These development curves show the individual learners’ changes in their attitudes and behaviour related to learning and adaptation in the job, the facilitation of colleagues and clients, and their personal development related to reflexivity, stress management and emotional awareness.
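The ‘distance travelled’ calculation behind such development curves can be sketched in a few lines. This is an illustrative sketch only: the summary categories, item names and the response values below are invented for the example and are not EmployID’s actual instrument.

```python
# Sketch: 'distance travelled' per summary category across questionnaire waves.
# Category/item names and the Likert-style values are assumptions for illustration.

from statistics import mean

def category_scores(responses, categories):
    """Average the questionnaire items belonging to each summary category."""
    return {cat: mean(responses[item] for item in items)
            for cat, items in categories.items()}

def distance_travelled(waves, categories):
    """Per-category change between the first and the latest questionnaire wave."""
    first = category_scores(waves[0], categories)
    latest = category_scores(waves[-1], categories)
    return {cat: round(latest[cat] - first[cat], 2) for cat in categories}

categories = {
    "reflexivity": ["q1", "q2"],
    "emotional_awareness": ["q3", "q4"],
}

waves = [
    {"q1": 2, "q2": 3, "q3": 3, "q4": 2},   # at registration
    {"q1": 4, "q2": 4, "q3": 3, "q4": 4},   # at a later interval
]

print(distance_travelled(waves, categories))
```

The positive deltas per category are what a development curve would plot over successive waves.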

4.2 Learning Analytics and Reflective Communities

The EmployID project is launching a platform to support the development of a Reflective Community in the Slovenian PES in February 2016. The platform is based on the WordPress Content Management System, and the project has developed a number of plugins to support social learning analytics and reflection analytics. The data from these plugins can serve as the basis for a dashboard for learners, providing visualisations of different metrics.

4.2.1 Network Maps

This plugin visualizes user interactions in social networks, including individual contacts, activities, and topics. The data is visualised through a series of maps and is localised to the different offices within the PES. The interface shows how interaction with other users has changed during the last 30 days. This way users can see at a glance how often they interact with others and possibly find other users with whom they wish to interact.

The view can be filtered by different job roles and is designed to help users find topics they may be interested in.
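The aggregation underlying such a map can be sketched as counting pairwise interactions within a rolling 30-day window and treating the counts as edge weights. The event format and names below are invented for illustration; the actual plugin draws on WordPress activity data.

```python
# Sketch: aggregate interaction events into weighted edges for a network map.
# Each event is (actor, target, timestamp); the data here is invented.

from datetime import datetime, timedelta
from collections import Counter

def recent_edges(events, now, window_days=30):
    """Count interactions per user pair inside the time window."""
    cutoff = now - timedelta(days=window_days)
    edges = Counter()
    for actor, target, when in events:
        if when >= cutoff and actor != target:
            # Sort the pair so a reply counts towards the same edge.
            edges[tuple(sorted((actor, target)))] += 1
    return dict(edges)

now = datetime(2016, 2, 1)
events = [
    ("ana", "bo", datetime(2016, 1, 20)),   # comment on a post
    ("bo", "ana", datetime(2016, 1, 25)),   # reply
    ("ana", "ceb", datetime(2015, 11, 1)),  # too old, outside the window
]
print(recent_edges(events, now))
```

Edge weights of this kind can then be filtered by job role or office before rendering the map.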

4.2.2 Karma Points

The Karma Points plugin allows users to give each other ‘karma points’ and ‘reputation points’. It is based on rankings both of posts and of authors. The karma points a user has available to give are temporary: they expire after a week and are refreshed each week, so users can only award karma points to a few selected posts in any week. The user who receives a karma point gets the point added to her permanent reputation points.
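The mechanics of this scheme, a weekly expiring budget of karma points feeding a permanent reputation score, can be sketched as follows. The budget of three points per week is an assumed figure for illustration, not the plugin’s actual setting.

```python
# Sketch: weekly karma budget feeding permanent reputation points.
# The WEEKLY_BUDGET value is an assumption for illustration.

from collections import defaultdict

WEEKLY_BUDGET = 3

class Karma:
    def __init__(self):
        self.reputation = defaultdict(int)  # permanent reputation per author
        self.spent = defaultdict(int)       # karma given this week, resets weekly

    def give(self, giver, author):
        """Award one karma point if the giver's weekly budget allows it."""
        if self.spent[giver] >= WEEKLY_BUDGET:
            return False                    # budget exhausted this week
        self.spent[giver] += 1
        self.reputation[author] += 1        # reputation is permanent
        return True

    def new_week(self):
        self.spent.clear()                  # karma budget refreshed, reputation kept

k = Karma()
for _ in range(4):
    print(k.give("ana", "bo"))              # fourth attempt exceeds the budget
k.new_week()
print(k.reputation["bo"])
```

After the weekly reset the giver can award points again, while the recipient’s reputation carries over.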

4.2.3 Reflection Analytics

The Reflection Analytics plugin collects platform usage data and shows it to users in an actionable way. The purpose is to show people information that lets them reflect on their behaviour on the platform, and then, possibly, to give them enough information to show how they could learn more effectively. The plugin will use a number of different charts, each wrapped in a widget in order to retain customizability.

One chart being considered would visualise the user’s interaction in the current month in terms of how many posts she wrote, how many topics she commented on and how many topics she read, compared to the average for the group. This way, users can easily identify whether they are writing a similar number of topics to their colleagues. It shows change over time and provides suggestions for new activities. However, we also recognise that comparisons with group averages can be demotivating for some people.
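The calculation behind such a comparison chart can be sketched simply: subtract the group average from the user’s own monthly counts, metric by metric. The metric names and counts below are invented for illustration.

```python
# Sketch: one user's monthly activity relative to the group average.
# Metric names and the counts are assumptions for illustration.

from statistics import mean

def relative_activity(counts, user):
    """Difference between a user's counts and the group average, per metric."""
    metrics = counts[user].keys()
    averages = {m: mean(c[m] for c in counts.values()) for m in metrics}
    return {m: round(counts[user][m] - averages[m], 1) for m in metrics}

counts = {
    "maja": {"posts": 4, "comments": 10, "reads": 40},
    "igor": {"posts": 2, "comments": 6,  "reads": 20},
    "eva":  {"posts": 0, "comments": 2,  "reads": 30},
}

# Positive values: above the group average; negative: below it.
print(relative_activity(counts, "maja"))
```

A dashboard widget would render these differences as bars around a zero line, which is also where the demotivation risk mentioned above becomes a design concern.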

4.3 Content Coding and Analysis

The analysis of comments and content shared within the EmployID tools can provide data addressing a number of the research questions outlined above.

A first trial of content coding, used to analyse inputs to a pilot MOOC held in early 2015 on the FutureLearn platform, resulted in rich insights into aspects of identity transformation and learning from and with others. The codes for this analysis were created inductively, based on [19], and then analysed according to success factors for identity transformation. Given that identity transformation in PES organisations is a new field of research, we expect new categories to evolve over time.

In addition to the inductive coding, the EmployID project will apply deductive analysis to investigate reflection in the content of the Reflective Community platform, following a fixed coding scheme for reflection [20].

Similar to the coding approach applied for reflective actions, we are currently working on a new coding scheme for learning facilitation in EmployID. Based on existing models of facilitation (e.g. [21]) and facilitation requirements identified within the PES organisations, a fixed coding scheme will be developed and applied for the first time to the analysis of content shared in the Reflective Community platform.

An important future aspect of content coding is to go one step further and explore innovative methodological approaches, trialling a machine learning approach based on (semi-)automatic detection of reflection and facilitation in text. This semi-automatic content analysis is a prerequisite for reflecting the analysis back to learners as part of learning analytics, as it allows the analysis of large amounts of shared content, in different languages, and not only ex post but continually in real time.
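As a deliberately naive stand-in for such (semi-)automatic detection, the sketch below scores a post against a hand-picked list of linguistic cues that often mark reflective writing. A real implementation would train a supervised classifier on the hand-coded reflection data; the cue list and threshold here are invented for illustration.

```python
# Toy sketch of reflection detection: score text by reflective-language cues.
# The cue list and threshold are invented; a real system would be trained
# on content coded with the fixed reflection scheme.

REFLECTION_CUES = ("i realised", "i learned", "looking back",
                   "next time", "in hindsight", "i noticed")

def reflection_score(text):
    """Fraction of reflection cues present in the text (0.0 to 1.0)."""
    lower = text.lower()
    return sum(cue in lower for cue in REFLECTION_CUES) / len(REFLECTION_CUES)

def is_reflective(text, threshold=0.15):
    """Flag a post as reflective if enough cues are present."""
    return reflection_score(text) >= threshold

print(is_reflective("Looking back, I realised I should ask clients more."))
print(is_reflective("The meeting is at 10am in room 4."))
```

Even this crude scoring illustrates why continual, real-time analysis becomes feasible once detection is automated: each new post can be scored as it is shared.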

4.4 Dynamic Social Network Analysis

Conceptual work currently being undertaken aims to bring together Social Network Analysis and Content Analysis in an evolving environment in order to analyse the changing nature of, and discontinuities in, knowledge development and usage over time. Such a perspective would not only enable a greater understanding of knowledge development and maturing within communities of practice and other collaborative learning teams, but would also allow further development and improvement of the (online) working and learning environment.

The methodology is based on various Machine Learning approaches including content analysis, classification and clustering [22], and statistical modelling of graphs and networks with a main focus on sequential and temporal non-stationary environments [23].
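One simple way to operationalise change detection in such non-stationary settings, in the spirit of the statistical approach of [23], is to compute a network statistic (here, tie density) per time window and flag windows that deviate strongly from the running baseline. The snippet below is an illustrative sketch only, not the project's actual model:

```python
from statistics import mean, stdev

def density(edges, n_nodes):
    """Share of possible undirected ties present in one time window."""
    possible = n_nodes * (n_nodes - 1) / 2
    return len({frozenset(e) for e in edges}) / possible

def flag_changes(snapshots, n_nodes, threshold=2.0, warmup=3):
    """Flag windows whose density deviates from the running mean of all
    previous windows by more than `threshold` standard deviations."""
    history, flags = [], []
    for t, edges in enumerate(snapshots):
        d = density(edges, n_nodes)
        if len(history) >= warmup:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(d - mu) / sigma > threshold:
                flags.append(t)
        history.append(d)
    return flags
```

A sudden jump in density (or, analogously, in centralisation or clustering) would then be surfaced as a candidate discontinuity in the community's knowledge development, to be inspected alongside the content analysis.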

The aim is to illustrate changes and discontinuities at the level of social network connectivity and of the content of communications in a knowledge maturing process, which is “based on the assumption that learning is an inherently social and collaborative activity in which individual learning processes are interdependent and dynamically interlinked with each other: the output of one learning process is input to the next. If we have a look at this phenomenon from a distance, we can observe a knowledge flow across different interlinked individual learning processes. Knowledge becomes less contextualized, more explicitly linked, easier to communicate, in short: it matures.” [24]

5. NEXT STEPS

In this paper we have examined current approaches to Learning Analytics and have considered some of the issues in developing approaches to LA for workplace learning, notably that learning interactions at the workplace are to a large extent informal and practice based and not embedded into a specific and measurable pedagogical scenario. Despite that, we foresee considerable benefits through developing Workplace Learning Analytics in terms of better awareness and tracing of learning, sharing experiences, and scaling informal learning practices.

We have outlined a pedagogic approach to learning in European Public Employment Services based on social learning, and a parallel approach to LA based on Social Learning Analytics. We have described a number of different tools for workplace Learning Analytics, aimed at providing data to help answer a series of research questions developed through the EmployID project. As well as providing research data, these tools have been developed to provide feedback to participants on their workplace learning.

The tools are at various stages of development. In the next phase of development, during 2016, we will implement and evaluate the use of these tools, whilst continuing to develop our conceptual approach to Workplace Learning Analytics.

One essential part of this conceptual approach is that supporting the learning of individuals with learning analytics is not just a question, for designers of learning solutions, of how to present dashboards, visualizations and other forms of data representation. The biggest challenge of workplace learning analytics (and of learning analytics in general) is to support learners in making sense of the data analysis:

  1. What does an indicator or a visualization tell about how to improve learning?
  2. What are the limitations of such indicators?
  3. How can we move further towards evidence-based interventions?

And this is not just an individual task; it requires collaborative reflection and learning processes. The knowledge of how to use learning analytics results to improve learning also needs to evolve through a knowledge maturing process. This corresponds to Argyris and Schön’s double-loop learning [25]. Otherwise, if learning analytics is perceived as a top-down approach pushed towards the learner, it will suffer from the same problems as performance management: pre-defined indicators (through their selection, computation and visualization) implement a certain preconception which is not evaluated on a continuous basis by those involved in the process. Misinterpretations and a misplaced confidence in numbers can disempower learners and lead to an overall rejection of analytics-driven approaches.

ACKNOWLEDGEMENTS

EmployID – “Scalable & cost-effective facilitation of professional identity transformation in public employment services” – is a research project supported by the European Commission under the 7th Framework Program (project no. 619619).

REFERENCES

[1] SoLAR (2011). Open Learning Analytics: An Integrated & Modularized Platform. White Paper. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf

[2] Johnson, L., Adams Becker, S., Estrada, V., Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, Texas: The New Media Consortium

[3] Duval E. (2012) Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012,  https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/

[4] Ferguson, R. (2012) Learning analytics: drivers, developments and challenges. In: International Journal of Technology Enhanced Learning, 4(5/6), 2012, pp. 304-317.

[5] Ley, T., Lindstaedt, S., Klamma, R. and Wild, S. (2015) Learning Analytics for Workplace and Professional Learning, http://learning-layers.eu/laforwork/

[6] Pardo, A. and Siemens, G. (2014) Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), pp. 438-450, May 2014

[7] de Laat, M. & Schreurs (2013) Visualizing Informal Professional Development Networks: Building a Case for Learning Analytics in the Workplace. American Behavioral Scientist. http://abs.sagepub.com/content/early/2013/03/11/0002764213479364.abstract

[8] Conole, G. (2014) The implications of open practice, presentation, Slideshare, http://www.slideshare.net/GrainneConole/conole-hea-seminar

[9] Ferguson, R. (2015) Learning Design and Learning Analytics, presentation, Slideshare, http://www.slideshare.net/R3beccaF/learning-design-and-learning-analytics-50180031

[10] Adomavicius, G. and Tuzhilin, A. (2005) Toward the Next Generation of Recommender Systems: A Survey of the State-of-the-Art and Possible Extensions. IEEE Transactions on Knowledge and Data Engineering, 17, 734-749. http://dx.doi.org/10.1109/TKDE.2005.99

[11] Brusilovsky, P. and Peylo, C. (2003) Adaptive and intelligent Web-based educational systems. In P. Brusilovsky and C. Peylo (eds.), International Journal of Artificial Intelligence in Education 13 (2-4), Special Issue on Adaptive and Intelligent Web-based Educational Systems, 159-172.

[12] Zimmerman, B. J. (2002) Becoming a self-regulated learner: An overview. Theory into Practice, 41(2), pp. 64-70

[13] Bahreini, K., Nadolski, R. & Westera, W. (2014) Towards multimodal emotion recognition in e-learning environments. Interactive Learning Environments, Routledge. http://www.tandfonline.com/doi/abs/10.1080/10494820.2014.908927

[14] MacNeill, S. (2015) The sound of learning analytics, presentation, Slideshare, http://www.slideshare.net/sheilamac/the-sound-of-learning-analytics

[15] Gašević, D., Dawson, S., Siemens, G. (2015) Let’s not forget: Learning Analytics are about learning. TechTrends

[16] Wilson, T. D. (1999). Models in information behaviour research. Journal of Documentation, 55 (3), pp 249-70

[17] Bandura, A. (1977). Social Learning Theory. Englewood Cliffs, NJ: Prentice Hall.

[18] Buckingham Shum, S., & Ferguson, R. (2012). Social Learning Analytics. Educational Technology & Society, 15 (3), 3–26

[19] Mayring, P. (2000). Qualitative Content Analysis. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 1(2). Retrieved from http://nbn-resolving.de/urn:nbn:de:0114-fqs0002204

[20] Prilla M, Nolte A, Blunk O, et al (2015) Analyzing Collaborative Reflection Support: A Content Analysis Approach. In: Proceedings of the European Conference on Computer Supported Cooperative Work (ECSCW 2015).   

[21] Hyland, N., Grant, J. M., Craig, A. C., Hudon, M., & Nethery, C. (2012). Exploring Facilitation Stages and Facilitator Actions in an Online/Blended Community of Practice of Elementary Teachers: Reflections on Practice (ROP) Anne Rodrigue Elementary Teachers Federation of Ontario. Copyright© 2012 Shirley Van Nuland and Jim Greenlaw, 71.   

[22] Yeung, K. Y. and Ruzzo, W. L. (2000). An empirical study on principal component analysis for clustering gene expression data. Technical report, Department of Computer Science and Engineering, University of Washington. http://bio.cs.washington.edu/supplements/kayee/pca.pdf

[23] McCulloh, I. and Carley, K. M. (2008). Social Network Change Detection. Institute for Software Research, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA. CMU-CS-08-116. http://www.casos.cs.cmu.edu/publications/papers/CMU-CS-08-116.pdf

[24] Maier, R., Schmidt, A. (2007). Characterizing Knowledge Maturing: A Conceptual Process Model for Integrating E-Learning and Knowledge Management. In: Gronau, N. (ed.): 4th Conference Professional Knowledge Management – Experiences and Visions (WM ’07), Potsdam, GITO, pp. 325-334. http://knowledge-maturing.com/concept/knowledge-maturing-phase-model

[25] Argyris, C., & Schön, D. (1978). Organizational Learning: A Theory of Action Perspective. Reading, MA: Addison-Wesley.

What is the political and social habit(u)s of present day universities?

January 18th, 2016 by Graham Attwell

I like Cristina Costa’s blog post, “Is technology changing learning habit(u)s?” (and not only because she cited me). Cristina describes how her study of students’ digital practices shows that students’ learning habitus (their histories/experiences with education) has not changed much in formal settings, even when they are presented with new pedagogical approaches. It is not so much an issue of their digital competence as an issue that informal uses of technology do not simply transfer into formal contexts.

Students, she says, “have a feeling for the ‘academic game’ and do their best to adjust to the field’s rules in order to succeed in it.” It seems to me there was always something of a game in academia, and especially in undergraduate education. Even in the early 1970s we had well-developed strategies for getting through exams (for instance, I undertook a rather more in-depth study of past exam questions than I did of the overall curriculum, and it worked well for me).

But there are more profound contradictions in today’s higher education system. On the one hand, universities are supposed to be about education and learning – as expressed through Humboldt’s idea of Allgemeine Bildung, or well-rounded education, to ensure that each person might seek to realise the human potentialities they possess as a unique individual – or through more modern appeals for a broad liberal education (unless such an education can be seen as improving employability). On the other hand, in the UK students are paying substantial fees for a system designed to provide them with a qualification to realise the so-called graduate wage premium in the world of work. In such a situation it is little wonder that students are reluctant to participate in the innovative pedagogies – described by Cristina as Freirean and Deweyan pedagogical approaches – designed for them to explore ideas and knowledge: quite simply, they want the knowledge and skills they need to pass the exams and thus justify the expenditure. In this situation students will readily adopt productivity apps – office tools, citation databases, revision apps etc. – and will of course use technology for social purposes and entertainment. But I am afraid that asking them to use social software for learning within the political and social habit(u)s of present-day universities may be going too far.

Yishay Mor talks about Design Patterns

January 14th, 2016 by Graham Attwell

At Online Educa Berlin 2015, I had the opportunity to interview Yishay Mor (see podcast below). I was keen to talk to him as he has been one of the people pushing the idea of Design Patterns in technology enhanced learning. In both of the EU research projects in which I am involved, EmployID and Learning Layers, we are adopting patterns as a design tool or methodology. Both projects were committed from their inception to user-centred design. But that left major questions of how to do it. It is not just a matter of getting a group of potential users together and talking with them. We need a language to structure conversations, and a language which can describe practice. We have experimented with Personas, which I suppose can be described as ideal types. However, all too often the persona ceased to correspond to any reality – or contained a mix of practices from multiple people – rendering them extremely problematic for design purposes.

Design narratives, design patterns and design scenarios seem to offer a potentially richer process for designing for learning; furthermore, they may have considerable value in describing innovations in technology. Even when applications are released as open source, they often fail to be picked up – especially for occupational learning – because their potential uses are opaque.

The following notes are taken from Yishay Mor and Steven Warburton’s paper, ‘Assessing the value of design narratives, patterns and scenarios in scaffolding co-design processes in the domain of technology enhanced learning’.

Design narratives provide an account of the history and evolution of a design over time, including the research context, the tools and activities designed, and the results of users’ interactions with these.

Design narratives offer thick descriptions of innovations, but they are often too specific to lend themselves to efficient transfer to novel challenges. Design patterns fill this gap by offering a “grounded abstraction” of design knowledge distilled from design narratives. Design patterns originate in the work of Christopher Alexander and his colleagues in the theory of architecture (Alexander, 1977).

A design pattern describes a recurring problem, or design challenge, the characteristics of the context in which it occurs, and a possible method of solution. Patterns are organized into coherent systems called pattern languages, in which patterns are related to each other. The core of a design pattern can be seen as a local functional statement: “for problem P, under circumstances C, solution S has been known to work.”

There are many different ways of describing patterns. In EmployID, reflecting its status as a research project, we have adopted the following template:

Problem: What is the learning problem that has been addressed? This encompasses a sufficiently generalized version of a learning scenario

Analysis: Interpretation of the problem from a theory perspective

Context: What are the relevant contextual factors that determine whether the proposed solution is actually (and not merely allegedly) successfully applicable?

Solution: What is the (socio-)technical solution?

Evidence: Accumulated evidence that the solution is a solution to the problem when the contextual conditions are met, e.g., examples in a specific context, but also feedback from external stakeholders that problem-solution pairs appear applicable in other contexts.
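The template lends itself to a simple record structure, which is useful if patterns are to be stored, linked into a pattern language and incrementally enriched with evidence. The class below is a purely hypothetical sketch: none of the names come from the EmployID project itself.

```python
from dataclasses import dataclass, field

@dataclass
class DesignPattern:
    """One pattern in the template above: for problem P,
    under circumstances C, solution S has been known to work."""
    name: str
    problem: str    # generalised learning problem being addressed
    analysis: str   # interpretation of the problem from a theory perspective
    context: str    # contextual factors under which the solution applies
    solution: str   # the (socio-)technical solution
    evidence: list = field(default_factory=list)  # grows as evidence accumulates

    def add_evidence(self, item: str) -> None:
        self.evidence.append(item)
```

Because evidence is an open-ended list, a pattern can start life as a design narrative with a single case and mature as feedback from other contexts is added.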

PLE Special Edition

January 13th, 2016 by Graham Attwell

OK – the ed-tech world moves on to its latest craze. But Personal Learning Environments have not gone away, as the new call for papers for the PLE Conference 2015 (Special Edition) makes clear:

Since the emergence of the term Personal Learning Environments (PLE) in the scientific discussion around the year 2004 in Oxford, PLE have become a field of research that has opened up great opportunities for reflection on almost all important aspects of education and learning with technology at all levels: from the study and development of tools, interaction processes among participants, cognitive mechanisms of individual learning, learning in groups and networked learning, lifelong learning and personal learning networks, to organisational learning environments, and so on.

In these years, the discussion has also transcended the traditional boundaries of academia and has been amplified in both the forms and the contexts in which it takes place. The communities created around the concept and practices of PLE have been responsive not only in their reflections on the realities concerning learning, but also in the way in which these reflections are made.

Therefore, in this special issue, supported by the PLE Conference community and its reflections during 2015, we would like to compile the current state of the field: papers that give readers a vision of the most topical ideas and practices around PLE today, together with a clear vision of where the analysis is headed.

Find the full details here.

First cycle of Multimedia Training in Theme Rooms – Part 3: Preparing digital learning materials for vocational learning

December 5th, 2015 by Pekka Kamarainen

With my two previous posts on the EU-funded Learning Layers (LL) project I started a series of reports on the Multimedia Training of the training centre Bau-ABC, based on the concept of “Theme Rooms”. The concept was initiated by the training staff of Bau-ABC. In my first post I reported how we (the LL teams of Bau-ABC, ITB and Pontydysgu) adjusted the concept in planning meetings. The second post focused on work with the theme ‘Social Media’. This post focuses on the theme ‘Preparing Digital Learning Materials’. The groups that started with this theme gave primary attention to the preparation of exemplary digital contents (videos, GoConqr quiz tests and mindmaps) and put the emphasis on editing processes. The groups that had started with the theme ‘Social Media’ put the main emphasis on working with blogs (and integrated the preparation of digital learning materials into their work with blogs). Therefore, the picture of these groups is more differentiated than that of the previous ones.

Working with videos and particular GoConqr tools

The two pioneering groups working with digital learning materials put the emphasis on hands-on exercises. In this respect they engaged the participating trainers in producing short videos and preparing exemplary exercises (interim tests) with GoConqr quiz tools. In this way the participants got direct insights into working with these tools. In this context the groups were confronted with the limits imposed by the ICT infrastructure, old laptops and software bugs. However, thanks to the continuing group dynamics, these difficulties did not bring the learning processes to a standstill. Moreover, the groups used the brainstorming phases to consider the usability of videos and GoConqr applications in training. When continuing to social media, these groups discussed the role of blogs as instruments for presenting such exercises to apprentices.

Working with blogs

The groups that put more emphasis on blogs had somewhat different approaches. In one group the trainers were engaged in creating completely new blogs and using them for posting and commenting on messages. In another group the main attention was given to the existing trainers’ blogs that had been created during the earlier Multimedia Training workshops provided by the LL project. As a result, trainers in four occupational domains had started blogs that were used to present comprehensive sets of training materials covering different content areas and different phases of apprentice training. When exploring the existing blogs, the workshop discussed how to engage the less-represented occupational areas in such work.

However, these explorations and hands-on exercises triggered a lively discussion on the potential benefits and limits of blogs. This discussion was taken up by creating GoConqr mindmaps that outlined the pros and cons of using blogs from different perspectives. In a next step, a further question on the optimal uses of blogs was again captured with GoConqr mindmaps. This latter step brought closer together the trainers who had created their blogs and those who had not been involved. Furthermore, the discussion brought forward the idea of integrated ‘packages’ that link different elements (text documents, photos/drawings/videos, quiz tests and links to further instructions) as building blocks of trainers’ blogs. The results of this discussion were documented in the updates (comments) added to the mindmaps.

In this way the work in the workshops not only supported individual learning but also provided a basis for discussing the organisational approach to working with web resources and digital learning materials. Such issues were also taken up in the joint concluding session, which discussed the progress made during the first cycle of workshops. This discussion will be covered in the final post of this series.

More blogs to come …

 

First cycle of Multimedia Training in Theme Rooms – Part 1: The program takes shape

December 5th, 2015 by Pekka Kamarainen

One of the highlights of the Construction pilot of the EU-funded Learning Layers (LL) project in 2015 was the implementation of the Multimedia Training based on the “Theme Room” concept in our application partner organisation Bau-ABC Rostrup. I have already reported, in my earlier blogs of June 2015 and September 2015, on how the training staff in Bau-ABC developed the ‘Theme Rooms’ concept (and how we integrated it into the LL project approach). After an intensive conference period we (the LL teams of Bau-ABC, ITB and Pontydysgu) found time for planning and preparation in October. In joint planning meetings we adjusted the initial concept and made the related working agreements.

Adjustment of the concept and selecting the themes for the first cycle

The key idea of the Theme Room concept was to arrange four parallel rooms for thematic workshops with small groups. The idea was that the participants would go through a whole cycle of these rooms during four successive Friday afternoons. In addition, the participants should have access to virtual learning spaces for the Theme Rooms. The concept envisaged peer tutors from Bau-ABC (supported by co-tutors from ITB and Pontydysgu).

In the initial concept the Bau-ABC trainers had proposed four main themes:

  1. Learning Toolbox as a flexible framework for tools/ apps/ contents/ communication,
  2. Use of Social media as support for learning,
  3. Intellectual property rights, licensing and sharing,
  4. Preparation of Digital Learning Materials.

In the planning meetings we concluded that it would be necessary to start with fewer themes and to hold two workshops in the same theme room before moving on to the next one. In this way we could ensure that all participants could follow the tempo and achieve sustainable learning results. We also considered it better to hold two workshops on the themes ‘Social Media’ and ‘Preparing Digital Learning Materials’ before introducing the theme ‘Learning Toolbox (LTB)’. Thus, the work with LTB would be introduced in the second cycle of Theme Room workshops. Concerning the theme ‘Intellectual Property Rights’, we agreed to take it up as a transversal theme and to invite the responsible tutor (Dirk Stieglitz, Pontydysgu) to visit the parallel theme rooms and give a brief input on it to all groups working in Bau-ABC.

Adjusting the mode of operation of the Theme Rooms

So we had come up with a model of four parallel groups attending workshops on four successive Fridays in the training centre Bau-ABC in Rostrup, plus a fifth group working in the affiliated training centre ABZ in Mellendorf. We agreed that two groups in Bau-ABC would start with Social Media and two with Digital Learning Materials. The Mellendorf group also started with Social Media. After two workshops the groups would switch to the other priority theme of the first cycle.

For each group we appointed a tutor from Bau-ABC and a co-tutor from ITB, whilst Dirk Stieglitz from Pontydysgu (responsible for Intellectual Property Rights) and Jaanika Hirv (TLU, placed in Bau-ABC for two months) supported all the groups. During the work we agreed that the groups would keep the same tutors across the themes instead of changing tutors when moving to the next theme.

All these measures, and the grouping of participants, aimed to help the participants reach a common overview of the themes and a capability to use social media and prepare digital learning materials (whatever their prior knowledge and skills may have been). Thus, the workshops had the goal of providing all participants with active learning opportunities and of creating a basis for the joint use of new tools and media across the organisation. For the preparation of the workshops and for the storage of group results we set up a Google Drive folder, to be updated with contents, learning tasks and links to supporting web resources.

So, in this way we prepared for the start in October. During four Friday afternoons we then worked in the Theme Rooms to make the best of the program.

More blogs to come …

Recognising competence and learning

November 16th, 2015 by Graham Attwell

As promised, some further thoughts on the DISCUSS conference, held earlier this week in Munich.

One of the themes for discussion was the recognition of (prior) learning. The theme had emerged after looking at the main work of European projects, particularly in the field of lifelong learning. The idea and attraction of recognising learning from different contexts, and particularly from informal learning, is hardly new. In the 1990s, in the UK, the National Council for Vocational Qualifications (as it was then called) devoted resources to developing systems for the Accreditation of Prior Learning. One of the ideas behind National Vocational Qualifications was the decoupling of teaching and learning from learning outcomes, expressed in terms of competences and performance criteria. Therefore, it was thought, anyone should be able to have their competences recognised (through certification) regardless of whether or not they had followed a particular formal training programme. Despite the considerable investment, it was at best a limited success. Developing observably robust processes for accrediting such learning was problematic, as were the time and cost of implementing such processes.

It is interesting to consider why there is once more an upsurge of interest in the recognition of prior learning. My feeling is that in the UK the initiative was driven by the weak links between vocational education and training and the labour market. In countries like Germany, with a strong apprenticeship training system, there was seen to be no need for such a procedure. Furthermore, learning was linked to the work process, and competence was seen as the internalised ability to perform in an occupation, rather than as an externalised series of criteria for qualification. However, the recent waves of migration, initially from Eastern Europe and now of refugees, have resulted in large numbers of people who may be well qualified (in all senses of the word) but have no easily recognisable qualification for employment.

I am unconvinced that attempts to formally assess prior competence as a basis for fast-tracking the awarding of qualifications will work. I think we probably need to look much deeper both at ideas around effective practice and at what exactly we mean by recognition, and I will write more about this in future posts. But digging around on my computer today I came up with a paper I wrote together with Jenny Hughes around some of these issues. I am not sure the title helped attract a wide readership: The role and importance of informal competences in the process of acquisition and transfer of work skills. Validation of competencies – a review of reference models in the light of youth research: United Kingdom. Below is an extract.

“NVQs and the accreditation of informal learning

As Bjørnåvold (2000) says, the system of NVQs is, in principle, open to any learning path and learning form and places a particular emphasis on experience-based learning at work. At least in theory, it does not matter how or where you have learned; what matters is what you have learned. The system is open to learning taking place outside formal education and training institutions, or to what Bjørnåvold terms non-formal learning. This learning has to be identified and judged, so it is no coincidence that questions of assessment and recognition have become crucial in the debate on the current status of the NVQ system and its future prospects.

While the NVQ system as such dates back to 1989, the actual introduction of “new” assessment methodologies can be dated to 1991. This was the year the National Council for Vocational Qualifications (NCVQ) and its Scottish equivalent, Scotvec, required that “accreditation of prior learning” should be available for all qualifications accredited by these bodies (NVQs and general national qualifications, GNVQs). The introduction of a specialised assessment approach to supplement the ordinary assessment and testing procedures used when following traditional and formal pathways, was motivated by the following factors:

1. to give formal recognition to the knowledge and skills which people already possess, as a route to new employment;
2. to increase the number of people with formal qualifications;
3. to reduce training time by avoiding repetition of what candidates already know.

The actual procedure applied can be divided into the following steps. The first step consists of providing general information about the APL process, normally by advisers who are not subject specialists, often supported by printed material or videos. The second and most crucial step includes the gathering and preparation of a portfolio. No fixed format for the portfolio has been established, but all evidence must be related to the requirements of the target qualification. The portfolio should include statements of job tasks and responsibilities from past or present employers as well as examples (proofs) of relevant “products”. Results of tests or specifically-undertaken projects should also be included. Thirdly, the actual assessment of the candidate takes place. As is stated: “The assessment process is substantially the same as that which is used for any candidate for an NVQ. The APL differs from the normal assessment process in that the candidate is providing evidence largely of past activity rather than of skills acquired during the current training course.” The result of the assessment can lead to full recognition, although only a minority of candidates have sufficient prior experience to achieve this. In most cases, the portfolio assessment leads to exemption from parts of a programme or course. Attention to specialised APL methodologies has diminished somewhat in the UK during recent years. It is argued that there is a danger of isolating APL and that, rather, it should be integrated into normal assessments as one of several sources of evidence: “The view that APL is different and separate has resulted in evidence of prior learning and achievement being used less widely than anticipated. Assessors have taken steps to avoid this source of evidence or at least become over-anxious about its inclusion in the overall evidence a candidate may have to offer.” We can thus observe a situation where responsible bodies have tried to strike a balance between evidence of prior and current learning, as well as between informal and formal learning. This has not been a straightforward task, as several findings suggest that APL is perceived as a “short cut”, less rigorously applied than traditional assessment approaches. The actual use of this kind of evidence, either through explicit APL procedures or in other, more integrated ways, is difficult to gauge. Awarding bodies are not required to list alternative learning routes, including APL, on a candidate’s certificate. This makes it almost impossible to identify where prior or informal learning has been used as evidence.

As mentioned in the discussions of the Mediterranean and Nordic experiences, the question of assessment methodologies cannot be separated from the question of qualification standards. Whatever evidence is gathered, some sort of reference point must be established. This has become the most challenging part of the NVQ exercise in general, and of the assessment exercise in particular. We will approach this question indirectly, by addressing some of the underlying assumptions of the NVQ system and its translation into practical measures. Currently the system relies heavily on the following basic assumptions. Legitimacy is to be assured through the assumed match between the national vocational standards and competences gained at work; the involvement of industry in defining and setting up standards has been a crucial part of this struggle for acceptance. Validity is supposed to be assured through the linking and location of both training and assessment to the workplace; the intention is to strengthen the authenticity of both processes, avoiding simulated training and assessment situations where validity is threatened. Reliability is assured through the detailed specification of each single qualification (and module); together with extensive training of the assessors, this is supposed to secure the consistency of assessments and eventually lead to an acceptable level of reliability.

A number of observers have argued that these assumptions are difficult to defend. When it comes to legitimacy, it is true that employers are represented in the above-mentioned leading bodies and standards councils, but several weaknesses of both a practical and a fundamental character have appeared. Firstly, there are limits to what a relatively small group of employer representatives can contribute, often on the basis of scarce resources and limited time. Secondly, the more powerful and more technically knowledgeable organisations usually represent large companies with good training records, and these wield the greatest influence; smaller, less influential organisations obtain less relevant results. Thirdly, disagreements in committees, irrespective of who is represented, are more easily resolved by inclusion than exclusion, inflating the scope of the qualifications. Generally speaking, there is a conflict of interest built into the national standards between the commitment to describe competences valid at a universal level and the commitment to create standards that are as specific and precise as possible. As to validity and reliability, the discussion turns on drawing the boundaries of the domain to be assessed and tested: high-quality assessments depend on the existence of clear competence domains, and validity and reliability depend on clear-cut definitions of domain boundaries, domain content and the ways in which that content can be expressed.

As in the Finnish case, the UK approach immediately faced a problem in this area. While early efforts concentrated on narrow task analysis, a gradual shift towards broader functional analysis has taken place. This shift reflects the need to create national standards describing transferable competences. Observers have noted that the introduction of functions was paralleled by detailed descriptions of every element in each function, prescribing performance criteria and the range of conditions for successful performance. The length and complexity of NVQs, currently a much-criticised factor, stem from this dynamic; as Wolf says, we seem to have entered a “never-ending spiral of specifications”. Researchers at the University of Sussex have drawn this conclusion about the challenges facing NVQ-based assessments: pursuing perfect reliability leads to meaningless assessment, while pursuing perfect validity leads towards assessments that cover everything relevant but take too much time, leaving too little time for learning. This statement reflects the challenges faced by all countries introducing output- or performance-based systems that rely heavily on assessment.

“Measurement of competences” is first and foremost a question of establishing reference points, and less a question of instruments and tools. This is clearly illustrated by the NVQ system, where questions of standards stand out as more important than the specific tools developed during the past decade. As stated, specific approaches such as “accreditation of prior learning” (APL) and “accreditation of prior experiential learning” (APEL) have become less visible as the NVQ system has settled. This is an understandable and fully reasonable development, since all assessment approaches in the NVQ system must in principle face the challenge of experientially based learning, i.e. learning outside the formal school context. The experiences from APL and APEL are thus being integrated into the NVQ system, albeit to an extent that is difficult to judge; in a way, this is an example of the maturing of the system. The UK system, as one of the first attempts to construct a performance-based system linking various formal and non-formal learning paths, illustrates the dilemmas of assessing and recognising non-formal learning better than most other systems, because there has been time to observe and study the problems and possibilities systematically. The future challenge facing the UK system can be summarised as follows: who should take part in the definition of standards, how should competence domains be described, and how should boundaries be set? Only when these questions are answered can high-quality assessments materialise.”
