Clippings

Clipmarks.com is a very cool service. Here you see our clippings.

Exploring my Personal Learning Environment

March 3rd, 2019 by Graham Attwell


It has been a bit busy lately. I seem to be writing one report after another. Anyway, I was asked ages ago if I would make a contribution on Personal Learning Environments for a book to be published in Portugal. I couldn’t resist, but of course didn’t get my act together until one week past the final deadline. And then, rereading the instructions, I realised I had forgotten that what I had been invited to contribute was a short concept paper. Instead I had written an activity sheet. Never mind. Here is the activity, and you can have the concept piece tomorrow.

 

Title: Exploring my Personal Learning Environment

Objective

The objective of this activity is for participants to explore their own Personal Learning Environment and to reflect on how they learn.

Participants are encouraged to consider:

  • The different contexts in which they learn
  • The people from whom they learn – their Personal Learning Network
  • The ways they use technology in their learning
  • The objects which support their learning
  • The links between what they learn and how they use this learning in their practice
  • The reasons they participate in learning activities
  • What inspires them to reflect on learning in their everyday life
  • How they record their learning, who they share that learning with, and why

Target audience

The main target audience is adults. This includes those in full or part time education, those working or those presently unemployed.

Activity time

The activity can be customised according to available time. It could be undertaken in an hour but could also be extended as part of a half-day workshop.

Required features

Flexible space for people to work together and to draw posters. Large sheets of flipchart paper. A flipchart. Felt tip pens. A smartphone camera to record the results.


Schematic sequence of steps for activity

  1. An introduction from the facilitator to the idea of Personal Learning Environments, followed by a short discussion.
  2. An introduction to the activity to be undertaken.
  3. Working individually participants draw a view of their own Personal Learning Environments, including institutions, people, social networks and objects from which they learn.
  4. Short presentations of the posters by participants and questions by colleagues.
  5. Discussion and reflections on the outcomes.

Detailed description of steps

The introduction is critical in setting the context for the activity. Many people will conflate learning with formal education: the introduction needs to make clear we are thinking about all kinds of learning and all of the contexts in which learning takes place.

There should be no prescription on how participants choose to illustrate their PLE. Some may draw elaborate pictures or diagrams, others may produce a more traditional list or tree diagram. In one workshop a participant chose to ‘play his PLE’ on a piano! A variety of different presentations enriches the activity.
While drawing the PLE is an individual activity, it is helpful if the working space encourages conversation and co-reflection during the activity.

In my experience, most participants are eager to explain their posters – however this can be time-consuming. Sometimes I have introduced voting for the best poster – with a small prize.

The final reflection and discussion is perhaps the most important part of the activity in drawing out understandings of how we learn and how we can further develop our PLEs.

 


Bringing Learning Toolbox to wider use in training for construction sector

November 5th, 2018 by Pekka Kamarainen

Last week I returned from my long sick leave, and I immediately had the opportunity to attend a working meeting at the training centre Bau-ABC Rostrup. With this training centre we had worked for many years in the EU-funded Learning Layers project to develop digital tools to support work-process-oriented learning. During the project we reached the stage at which Learning Toolbox (LTB) was ready as a viable product to support training and learning processes. The pilot testing in the final phase of the project proved that trainers and apprentices can use the toolset in their training processes. Yet there were several practical issues that slowed down a wider use of the LTB. Thus, the trainers who had been involved in the pilot testing kept on using the toolset, but a wider use was delayed.

Now, in our meeting last week, we were facing a new situation. In the meantime most of the hurdles had been overcome, and there was full confidence among all parties involved that LTB can be introduced in the apprentice training of Bau-ABC for all trades. Now the pioneering trainers, the management/administration representatives and the LTB developers were discussing how to support a full-scale implementation of the toolset. From this perspective there was a need to harmonise the use of LTB stacks across the trades and to ensure effective ICT support. Secondly, there was a need to create awareness of good practice in different trades and to share experiences across the trades. In this context the presence of us – researchers from the research institute ITB – was relevant, since we are working in TACCLE projects that support training of trainers, and we can draw upon the work in Bau-ABC.


Insights into the uses of LTB to support training in different trades

Here it is not possible to give a complete overview of all the examples that were presented by Bau-ABC trainers representing different trades. Below, I have selected exemplary cases that show how the use of LTB had been incorporated into the work-process-oriented learning projects of Bau-ABC apprentices:

  1. Pipeline-builders (Rohrleitungsbauer) were using LTB to draft joint plans on how to prepare the ground for the pipelines. Instead of just doing the spadework individually, they made their plans as teams – they divided the tasks and allocated responsibilities for checking the work.
  2. Road-builders (Strassenbauer) had prepared a comprehensive overview of the machines provided by the company W&N with nutshell versions of users’ guides (based on the original materials).
  3. Tilers (Fliesenleger) had prepared a comprehensive overview of technical tools that were used in their trade with links to the instructions provided by the manufacturers.
  4. Construction plant operators (Baugeräteführer) had prepared electronic forms as checklists for the inspection of the vehicles before starting to use them. Only after completing the form and reporting that the vehicles were in order did the operators get clearance to start working.
  5. Carpenters (Zimmerer) had been working in a joint project “WorkCamp GreenHouse” with other training centres in Germany. In the project they had developed several modules for ecological construction work (focusing on their trade and the use of materials). In this project they had used LTB as a common toolset and developed a common project plan structure to guide the creation of mother stacks and daughter stacks.
  6. In the area “Health and Safety” (Arbeitssicherheit und Gesundheitsschutz), trainers from different trades had worked together to shape a common stack structure that presents the overarching regulations and the local instructions in the training centre. Within this structure different trades had the possibility to present trade-specific content (e.g. concerning their trade-specific personal safety outfits).
  7. In all trades the apprentices (Auszubildende) were using the LTB to upload photos as progress reports on their work and learning in the projects. The trainers used specific background colours for the tiles that documented apprentices’ work.


The relevance of the recent progress for apprentice training and vocational learning

If the points that I have listed above were taken only as separate inputs with dedicated tools, they would not appear very “revolutionary”. But the essence of the recent progress is that the trainers are working with an integrative digital toolset – the LTB. They have already used LTB to give instructions and worksheets for apprentices’ projects. Now, with these newer features, the range of uses of LTB in working and learning contexts is expanding. And – as already mentioned – the trainers are themselves leading the innovation and sharing experience with each other. Moreover, for the apprentices the use of LTB is not just a matter of receiving instructions and reporting the completion of their tasks. As we have seen from the examples, the use of LTB requires of them a holistic view of their projects and a professional attitude to completing their tasks. This has been the spirit of working with the LTB in Bau-ABC.

Now, at this stage, we were happy to see that Bau-ABC is organising the wider use of the LTB independently of externally funded projects and within its own organisational frameworks – in collaboration with the LTB developers. Moreover, Bau-ABC is looking for ways to spread the use of LTB across its professional networks. As we see it, the work of the Learning Layers project is bearing fruit! We – as accompanying researchers – look forward to observing this in the future.

More blogs to come …


The problems of assessing competence

February 12th, 2018 by Graham Attwell

It was interesting to read Simon Reddy’s article in FE News, The Problem with Further Education and Apprenticeship Qualifications, lamenting the low standard of plumbing training in the UK and the problems with the assessment of National Vocational Qualifications.

Simon reported from his research saying:

There were structural pressures on tutors to meet externally-imposed targets and, judging from the majority of tutors’ responses, the credibility of the assessment process was highly questionable.

Indeed, teachers across the three college sites in my study were equally sceptical about the quality of practical plumbing assessments.

Tutors in the study were unanimous in their judgements about college-based training and assessments failing to adequately represent the reality, problems and experiences of plumbers operating in the workplace.

In order to assess the deviation from the original NVQ rules, he said, “it is important to understand the work of Gilbert Jessup, who was the Architect of UK competence-based qualifications”.

Jessup (1991: 27) emphasised ‘the need for work experience to be a valid component of most training which leads to occupational competence’. Moreover, he asserted that occupational competence ‘leads to increased demands for demonstrations of competence in the workplace in order to collect valid evidence for assessment’.

As a representative of the Welsh Joint Education Committee, I worked closely with Gilbert Jessup in the early days of NVQs. Much (probably too much) of our time was taken up with debates on the nature of competence and how assessment could be organised. I even wrote several papers about it – sadly in the pre-digital age.

But I dug out some of that debate in a paper I wrote with Jenny Hughes for the European ICOVET project, which was looking at the accreditation of informal learning. The paper had the snappy title ‘The role and importance of informal competences in the process of acquisition and transfer of work skills. Validation of competencies – a review of reference models in the light of youth research: United Kingdom.’

In the introduction we explained the background:

Firstly, in contrast to most countries in continental Europe, the UK has long had a competence based education and training system. The competence based National Vocational Qualifications were introduced in the late 1980s in an attempt to reform and rationalise the myriad of different vocational qualifications on offer. NVQs were seen as separate from delivery systems – from courses and routes to attain competence. Accreditation regulations focused on sufficiency and validity of evidence. From the very early days of the NVQ system, accreditation of prior learning and achievement has been recognised as a legitimate route towards recognition of competence, although implementation of APL programmes has been more problematic. Thus, there are few formal barriers to access to assessment and accreditation of competences. That is not to say the process is unproblematic and this paper will explore some of the issues which have arisen through the implementation of competence based qualifications.

We went on to look at the issue of assessment:

The NVQ framework was based on the notion of occupational competence. The concept of competence has been a prominent, organising principle of the reformed system, but has been much criticised (see, for example, Raggatt & Williams 1999). The competence-based approach replaced the traditional vocational training that was based on the time served on skill formation to the required standard (such as apprenticeships). However, devising a satisfactory method of assessing occupational competence proved to be a contentious and challenging task.

Adults in employment who are seeking to gain an NVQ will need a trained and appointed NVQ assessor. Assessors are appointed by an approved Assessment Centre, and can be in-house employees or external. The assessor will usually help the candidate to identify their current competences, agree on the NVQ level they are aiming for, analyse what they need to learn, and choose activities which will allow them to learn what they need. The activities may include taking a course, or changing their work in some way in order to gain the required evidence of competence. The opportunity to participate in open or distance learning while continuing to work is also an option.

Assessment is normally through on-the-job observation and questioning. Candidates must have evidence of competence in the workplace to meet the NVQ standards, which can include the Accreditation of Prior Learning (APL). Assessors will test the candidates’ underpinning knowledge, understanding and work-based performance. The system is now intended to be flexible, enabling new ways of learning to be used immediately without having to take courses.

The system is characterised by modular-based components and criterion-referenced assessment. Bjornavald also argues that the NVQ framework is output-oriented and performance-based.

We outlined criticisms of the NVQ assessment process:

The NCVQ methods of assessing competence within the workplace were criticised for being too narrow and job-specific (Raggatt & Williams 1999). The initial NVQs were also derided for applying ‘task analysis’ methods of assessment that relied on observation of specific, job-related task performance. Critics of NVQs argued that assessment should not just focus on the specific skills that employers need, but should also encompass knowledge and understanding, and be more broadly based and flexible. As Bjornavald argues, ‘the UK experiences identify some of these difficulties balancing between too general and too specific descriptions and definitions of competence’. The NVQs were also widely perceived to be inferior qualifications within the ‘triple-track’ system, particularly in relation to academic qualifications (Wolf 1995; Raffe et al 2001; Raggatt 1999).

The initial problems with the NVQ framework were exacerbated by the lack of regulatory powers the NCVQ held (Evans, 2001). The system was criticized early on for inadequate accountability and supervision in implementation (Williams 1999), as well as appearing complex and poorly structured (Raffe et al 2001).

We later looked at systems for the Accreditation of Prior Learning (APL).

Currently the system relies heavily on the following basic assumptions: legitimacy is to be assured through the assumed match between the national vocational standards and competences gained at work. The involvement of industry in defining and setting up standards has been a crucial part of this struggle for acceptance, Validity is supposed to be assured through the linking and location of both training and assessment, to the workplace. The intention is to strengthen the authenticity of both processes, avoiding simulated training and assessment situations where validity is threatened. Reliability is assured through detailed specifications of each single qualification (and module). Together with extensive training of the assessors, this is supposed to secure the consistency of assessments and eventually lead to an acceptable level of reliability.

A number of observers have argued that these assumptions are difficult to defend. When it comes to legitimacy, it is true that employers are represented in the above-mentioned leading bodies and standards councils, but several weaknesses of both a practical and fundamental character have appeared. Firstly, there are limits to what a relatively small group of employer representatives can contribute, often on the basis of scarce resources and limited time. Secondly, the more powerful and more technically knowledgeable organisations usually represent large companies with good training records and wield the greatest influence. Smaller, less influential organisations obtain less relevant results. Thirdly, disagreements in committees, irrespective of who is represented, are more easily resolved by inclusion than exclusion, inflating the scope of the qualifications. Generally speaking, there is a conflict of interest built into the national standards between the commitment to describe competences valid on a universal level and the commitment to create as specific and precise standards as possible. As to the questions of validity and reliability, our discussion touches upon drawing up the boundaries of the domain to be assessed and tested. High quality assessments depend on the existence of clear competence domains; validity and reliability depend on clear-cut definitions, domain-boundaries, domain-content and ways whereby this content can be expressed.

It’s a long time since I have looked at the evolution of National Vocational Qualifications and the issues of assessment. My guess is that the original focus on the validity of assessment was too difficult to implement in practice, especially given the number of competences. And the distinction between assessing competence and assessing underpinning knowledge was also problematic. It was easier to move to multiple-choice computerised testing, administered through colleges. If there was a need to assess practical competences, then once more it was much simpler to assess this in a ‘simulated’ workshop environment than through the original idea that competence would be assessed in the real workplace. At the same time the system was too complicated. Instead of trusting workplace trainers to know whether an apprentice was competent, assessors were themselves required to follow a (competence-based) assessors’ course. That was never going to work in the real world, and neither were visiting external assessors going to deliver the validity Gilbert Jessup dreamed of.

If anyone would like a copy of the paper this comes from, just email me (or add a request in the comments below). Meanwhile I am going to try to find another paper I wrote with Jenny Hughes, looking at some of the more theoretical issues around assessment.



Data, expenditure and the quality of Higher Education

September 12th, 2017 by Graham Attwell

In this brave new data world, we seem to get daily reports on the latest statistics about education. It is not easy making sense of it all.

Times Higher Education reports on OECD’s latest Education at a Glance report, an annual snapshot of the state of education across the developed world, published on 12 September.

It shows spending per higher education student significantly falling behind the OECD average in a number of European countries such as Spain, Italy, Slovenia and Portugal, while even countries with reputations for strong university systems, such as Germany and Finland, are failing to keep pace with the US and UK.

But what does all this mean? Germany has significantly increased university places as a response to the crisis, seemingly without spending per student keeping pace. The UK has increased spending per student. The difference, of course, is that while higher education is basically free in Germany, the UK has some of the highest university tuition fees in the world. Andreas Schleicher from the OECD said that since there were no comparable data on learning outcomes for different countries, it was difficult to pinpoint whether the large per-student spends in some nations were actually improving quality.

However, according to the THE report, he added that results from the OECD’s international school testing programme – the Programme for International Student Assessment (Pisa) – showed “that there is essentially no relationship between spending per student and school performance once you get beyond a certain threshold in spending”, a point that most OECD countries had already passed.

Of course school performance does not necessarily equate with quality of teaching and learning. But it does suggest that even with the deluge of data we still do not understand how to judge quality – still less how to improve it.


What comes after “Learning Layers”? – Part Three: Getting deeper with vocational learning, ‘health and safety’ and digital media

April 3rd, 2017 by Pekka Kamarainen

In my two previous blogs I referred to the fact that our EU-funded Learning Layers (LL) project had come to an end and that we (the ITB team involved in the construction sector pilot) are working on follow-up activities. I then described briefly how I came to start a joint initiative on digital media in the area of ‘health and safety’ (Arbeitssicherheit und Gesundheitsschutz) with trainers of the training centre Bau-ABC. In my previous post I sketched the initiative roughly. Now – after our second meeting – I can give more information, and I need to reflect on the lessons already learned at this stage.

Looking back – the achievements with the Learning Layers project

Firstly, I need to remind myself how this initiative drew upon the achievements of the LL project. During the project some of the trainers had created WordPress blogs to present their training contents (project instructions, support material and worksheets) to apprentices in their trades. Then we had piloted the integrative toolset Learning Toolbox (LTB), which had been developed during the project to support learning in the context of work. The trainers had found ways to create stacks and tiles to support the apprentices’ projects (based on working and learning tasks). However, the transversal learning area ‘health and safety’ had not yet been covered during the project. And – moreover – from the perspective of promoting the use of LTB and digital media in the construction sector, this area is important both for training centres and for construction companies. So we started working together to conquer this terrain.

Mapping learning materials for ‘health and safety’ – filling the gaps and reflecting on pedagogy

I had initially thought that we could proceed rather quickly by mapping the existing material that is being used and by analysing some options for learning software – and then start working with appropriate learning designs. But it struck me that I had not thought of a necessary interim step – pedagogic reflection on the applicability of existing materials for the learning processes of apprentices and skilled workers. When discussing the potentially applicable learning materials, the trainers informed me of several gaps to be overcome. Firstly, a lot of the reference materials are lengthy documents with detailed references to norms, standards and regulations. These, obviously, are not very easily usable in action-oriented learning (supported by digital media). Secondly, several checklists and worksheets for risk analysis (Gefährdungsbeurteilung) are designed for real work situations (involving skilled workers). However, for apprentices who are learning and working in the training centre, the trainers need to develop adjusted versions. Therefore, our initiative needed space and time – and digital tools – for such pedagogic reflection. Furthermore, the trainers saw a possibility to shape an integrative approach that proceeds from general starting points through the main areas of construction know-how (Tiefbau, Hochbau, Ausbau) and special areas (Brunnenbau, Maschinen- und Metalltechnik) to specific trades (carpentry, bricklaying etc.) and to specific work processes (welding, sawing etc.). So, instead of taking this as an easy ‘packaging content into digital media’ exercise, we are in deep discussion on vocational learning and on appropriate ways to introduce digital media and know-how on ‘health and safety’ into working and learning processes.

– – –

I think this is enough for the moment. I have learned a lot and the trainers are pleased to work in this direction. And as far as I am concerned, this kind of process confirms once again the fundamental principles that we applied in the LL project – orientation to ‘work process knowledge’ and to ‘action-oriented learning’. Now I will have a holiday break but I am looking forward to continuing my work with the Bau-ABC trainers.

More blogs to come …


Learning Analytics and the Peak of Inflated Expectations

January 15th, 2017 by Graham Attwell

Has Learning Analytics dropped off the peak of inflated expectations in Gartner’s hype cycle? According to Educause, ‘understanding the power of data’ is still there as a major trend in higher education, and Ed Tech reports a KPMG survey which found that 41 per cent of universities were using data for forecasting and predictive analytics.

But whilst many universities are exploring how data can be used to improve retention and prevent drop-outs, there seems little pretence any more that Learning Analytics has much to do with learning. The power of data has somehow got muddled up with Management Analytics, Performance Analytics and all kinds of other analytics – but the learning seems to have been lost. Data mining is great, but it needs a perspective on just what we are trying to find out.

I don’t think Learning Analytics will go into the trough of disillusionment. But I think that there are very real problems in working out how best we can use data – and particularly how we can use data to support learning. Learning Analytics needs to be more solidly grounded in what is already known about teaching and learning. Stakeholders, including teachers, learners and the wider community, need to be involved in the development and implementation of learning analytics tools. Overall, more evidence is needed to show which approaches work in practice and which do not.

Finally, we already know a great deal about formal learning in institutions, or at least by now we should do. Of course we need to work at making it better. But we know far less about informal learning and the learning which takes place in everyday living and working environments. And that is where I ultimately see Learning Analytics making a big difference. Learning Analytics could potentially help us all to become self-directed learners and to achieve the learning goals that we set ourselves. But that is a long way off. Perhaps if Learning Analytics is falling off the peak of expectations, that will provide the space for longer-term, more clearly focused research and development.

 



    News Bites

    Zero Hours Contracts

    Figures from the UK Higher Education Statistics Agency show that in total almost 11,500 people – both academics and support staff – working in universities on a standard basis were on a zero-hours contract in 2017-18, out of a total staff head count of about 430,000, reports the Times Higher Education. A zero-hours contract means the employer is not obliged to provide any minimum working hours.

    Separate figures that only look at the number of people who are employed on “atypical” academic contracts (such as people working on projects) show that 23 per cent of them, or just over 16,000, had a zero-hours contract.


    Resistance decreases over time

    Interesting research on student-centred learning and student buy-in, as picked up by an article in Inside Higher Ed. A new study published in PLOS ONE, called “Knowing Is Half the Battle: Assessments of Both Student Perception and Performance Are Necessary to Successfully Evaluate Curricular Transformation”, finds that student resistance to curriculum innovation decreases over time as it becomes the institutional norm, and that students increasingly link active learning to their learning gains over time.


    Postgrad pressure

    Research published this year by Vitae and the Institute for Employment Studies (IES) and reported by the Guardian highlights the pressure on post graduate students.

    “They might suffer anxiety about whether they deserve their place at university,” says Sally Wilson, who led IES’s contribution to the research. “Postgraduates can feel as though they are in a vacuum. They don’t know how to structure their time. Many felt they didn’t get support from their supervisor.”

    Taught students tend to fare better than researchers – they enjoy more structure and contact, says Sian Duffin, student support manager at Arden University. But she believes anxiety is on the rise. “The pressure to gain distinction grades is immense,” she says. “Fear of failure can lead to perfectionism, anxiety and depression.”


    Teenagers online in the USA

    According to Pew Internet 95% of teenagers in the USA now report they have a smartphone or access to one. These mobile connections are in turn fueling more-persistent online activities: 45% of teens now say they are online on a near-constant basis.

    Roughly half (51%) of 13 to 17 year olds say they use Facebook, notably lower than the shares who use YouTube, Instagram or Snapchat.

    The survey also finds there is no clear consensus among teens about the effect that social media has on the lives of young people today. Minorities of teens describe that effect as mostly positive (31%) or mostly negative (24%), but the largest share (45%) says that effect has been neither positive nor negative.

