Archive for the ‘uncategorized’ Category

Exploring my Personal Learning Environment

March 3rd, 2019 by Graham Attwell


It has been a bit busy lately. I seem to be writing one report after another. Anyway, I was asked ages ago if I would make a contribution on Personal Learning Environments for a book to be published in Portugal. I couldn’t resist, but of course didn’t get my act together until one week past the final deadline. And then, reading the instructions, I realised I had forgotten that what I had been invited to contribute was a short concept paper. Instead I wrote an activity sheet. Never mind. Here is the activity, and you can have the concept piece tomorrow.

 

Title: Exploring my Personal Learning Environment

Objective

The objective of this activity is for participants to explore their own Personal Learning Environment and to reflect on how they learn.

Participants are encouraged to consider:

  • The different contexts in which they learn
  • The people from whom they learn – their Personal Learning Network
  • The ways they use technology in their learning
  • The objects which support their learning
  • The links between what they learn and how they use this learning in their practice
  • The reasons they participate in learning activities
  • What inspires them to reflect on learning in their everyday life
  • How they record their learning, who they share that learning with, and why

Target audience

The main target audience is adults. This includes those in full or part time education, those working or those presently unemployed.

Activity time

The activity can be customised according to available time. It could be undertaken in an hour but could also be extended as part of a half-day workshop.

Required features

Flexible space for people to work together and to draw posters. Large sheets of flipchart paper. A flipchart. Felt tip pens. A smartphone camera to record the results.


Schematic sequence of steps for activity

  1. An introduction from the facilitator to the idea of Personal Learning Environments, followed by a short discussion.
  2. An introduction to the activity to be undertaken.
  3. Working individually participants draw a view of their own Personal Learning Environments, including institutions, people, social networks and objects from which they learn.
  4. Short presentations of the posters by participants and questions by colleagues.
  5. Discussion and reflections on the outcomes.

Detailed description of steps

The introduction is critical in setting the context for the activity. Many people will conflate learning with formal education: the introduction needs to make clear we are thinking about all kinds of learning and all of the contexts in which learning takes place.

There should be no prescription on how participants choose to illustrate their PLE. Some may draw elaborate pictures or diagrams, others may produce a more traditional list or tree diagram. In one workshop a participant chose to ‘play his PLE’ on a piano! A variety of different presentations enriches the activity.
While drawing the PLE is an individual activity, it is helpful if the working space encourages conversation and co-reflection during the activity.

In my experience, most participants are eager to explain their posters – however this can be time consuming. Sometimes I have introduced voting for the best poster – with a small prize.

The final reflection and discussion is perhaps the most important part of the activity in drawing out understandings of how we learn and how we can further develop our PLEs.

 

The problems of assessing competence

February 12th, 2018 by Graham Attwell

It was interesting to read Simon Reddy’s article in FE News, The Problem with Further Education and Apprenticeship Qualifications, lamenting the low standard of training in plumbing in the UK and the problems with the assessment of National Vocational Qualifications.

Simon reported from his research saying:

There were structural pressures on tutors to meet externally-imposed targets and, judging from the majority of tutors’ responses, the credibility of the assessment process was highly questionable.

Indeed, teachers across the three college sites in my study were equally sceptical about the quality of practical plumbing assessments.

Tutors in the study were unanimous in their judgements about college-based training and assessments failing to adequately represent the reality, problems and experiences of plumbers operating in the workplace.

In order to assess the deviation away from the original NVQ rules, he said, “it is important to understand the work of Gilbert Jessup, who was the architect of UK competence-based qualifications.”

Jessup (1991: 27) emphasised ‘the need for work experience to be a valid component of most training which leads to occupational competence’. Moreover, he asserted that occupational competence ‘leads to increased demands for demonstrations of competence in the workplace in order to collect valid evidence for assessment’.

As a representative of the Welsh Joint Education Committee, I worked closely with Gilbert Jessup in the early days of NVQs. Much (probably too much) of our time was taken with debates on the nature of competence and how assessment could be organised. I even wrote several papers about it – sadly in the pre-digital age.

But I dug out some of that debate in a paper I wrote with Jenny Hughes for the European ICOVET project, which was looking at the accreditation of informal learning. The paper had the snappy title ‘The role and importance of informal competences in the process of acquisition and transfer of work skills. Validation of competencies – a review of reference models in the light of youth research: United Kingdom.’

In the introduction we explained the background:

Firstly, in contrast to most countries in continental Europe, the UK has long had a competence based education and training system. The competence based National Vocational Qualifications were introduced in the late 1980s in an attempt to reform and rationalise the myriad of different vocational qualifications on offer. NVQs were seen as separate from delivery systems – from courses and routes to attain competence. Accreditation regulations focused on sufficiency and validity of evidence. From the very early days of the NVQ system, accreditation of prior learning and achievement has been recognised as a legitimate route towards recognition of competence, although implementation of APL programmes has been more problematic. Thus, there are few formal barriers to access to assessment and accreditation of competences. That is not to say the process is unproblematic and this paper will explore some of the issues which have arisen through the implementation of competence based qualifications.

We went on to look at the issue of assessment:

The NVQ framework was based on the notion of occupational competence. The concept of competence has been a prominent, organising principle of the reformed system, but has been much criticised (see, for example, Raggatt & Williams 1999). The competence-based approach replaced the traditional vocational training that was based on the time served on skill formation to the required standard (such as apprenticeships). However, devising a satisfactory method of assessing occupational competence proved to be a contentious and challenging task.

Adults in employment who are seeking to gain an NVQ will need a trained and appointed NVQ assessor. Assessors are appointed by an approved Assessment Centre, and can be in-house employees or external. The assessor will usually help the candidate to identify their current competences, agree on the NVQ level they are aiming for, analyse what they need to learn, and choose activities which will allow them to learn what they need. The activities may include taking a course, or changing their work in some way in order to gain the required evidence of competence. The opportunity to participate in open or distance learning while continuing to work is also an option.

Assessment is normally through on-the-job observation and questioning. Candidates must have evidence of competence in the workplace to meet the NVQ standards, which can include the Accreditation of Prior Learning (APL). Assessors will test the candidates’ underpinning knowledge, understanding and work-based performance. The system is now intended to be flexible, enabling new ways of learning to be used immediately without having to take courses.

The system is characterised by modular-based components and criterion-referenced assessment. Bjornavald also argues that the NVQ framework is output-oriented and performance-based.

We outlined criticisms of the NVQ assessment process:

The NCVQ methods of assessing competence within the workplace were criticised for being too narrow and job-specific (Raggatt & Williams 1999). The initial NVQs were also derided for applying ‘task analysis’ methods of assessment that relied on observation of specific, job-related task performance. Critics of NVQs argued that assessment should not just focus on the specific skills that employers need, but should also encompass knowledge and understanding, and be more broadly based and flexible. As Bjornavald argues, ‘the UK experiences identify some of these difficulties balancing between too general and too specific descriptions and definitions of competence’. The NVQs were also widely perceived to be inferior qualifications within the ‘triple-track’ system, particularly in relation to academic qualifications (Wolf 1995; Raffe et al 2001; Raggatt 1999).

The initial problems with the NVQ framework were exacerbated by the lack of regulatory powers the NCVQ held (Evans, 2001). The system was criticized early on for inadequate accountability and supervision in implementation (Williams 1999), as well as appearing complex and poorly structured (Raffe et al 2001).

We later looked at systems for the Accreditation of Prior Learning (APL).

Currently the system relies heavily on the following basic assumptions: legitimacy is to be assured through the assumed match between the national vocational standards and competences gained at work. The involvement of industry in defining and setting up standards has been a crucial part of this struggle for acceptance. Validity is supposed to be assured through the linking and location of both training and assessment to the workplace. The intention is to strengthen the authenticity of both processes, avoiding simulated training and assessment situations where validity is threatened. Reliability is assured through detailed specifications of each single qualification (and module). Together with extensive training of the assessors, this is supposed to secure the consistency of assessments and eventually lead to an acceptable level of reliability.

A number of observers have argued that these assumptions are difficult to defend. When it comes to legitimacy, it is true that employers are represented in the above-mentioned leading bodies and standards councils, but several weaknesses of both a practical and fundamental character have appeared. Firstly, there are limits to what a relatively small group of employer representatives can contribute, often on the basis of scarce resources and limited time. Secondly, the more powerful and more technically knowledgeable organisations usually represent large companies with good training records and wield the greatest influence. Smaller, less influential organisations obtain less relevant results. Thirdly, disagreements in committees, irrespective of who is represented, are more easily resolved by inclusion than exclusion, inflating the scope of the qualifications. Generally speaking, there is a conflict of interest built into the national standards between the commitment to describe competences valid on a universal level and the commitment to create as specific and precise standards as possible. As to the questions of validity and reliability, our discussion touches upon drawing up the boundaries of the domain to be assessed and tested. High quality assessments depend on the existence of clear competence domains; validity and reliability depend on clear-cut definitions, domain-boundaries, domain-content and ways whereby this content can be expressed.

It’s a long time since I have looked at the evolution of National Vocational Qualifications and the issues of assessment. My guess is that the original focus on the validity of assessment was too difficult to implement in practice, especially given the number of competences. And the distinction between assessing competence and assessing underpinning knowledge was also problematic. It was easier to move to multiple-choice computerised testing, administered through colleges. If there was a need to assess practical competences, then once more it was much simpler to assess this in a ‘simulated’ workshop environment than through the original idea that competence would be assessed in the real workplace. At the same time the system was too complicated. Instead of trusting workplace trainers to know whether an apprentice was competent, assessors were themselves required to follow a (competence-based) assessors’ course. That was never going to work in the real world, and neither were visiting external assessors going to deliver the validity Gilbert Jessup dreamed of.

If anyone would like a copy the paper this comes from just email me (or add a request in the comments below). Meanwhile I am going to try to find another paper I wrote with Jenny Hughes, looking at some of the more theoretical issues around assessment.


Data, expenditure and the quality of Higher Education

September 12th, 2017 by Graham Attwell

In this brave new data world, we seem to get daily reports on the latest statistics about education. It is not easy making sense of it all.

Times Higher Education reports on OECD’s latest Education at a Glance report, an annual snapshot of the state of education across the developed world, published on 12 September.

It shows spending per higher education student significantly falling behind the OECD average in a number of European countries such as Spain, Italy, Slovenia and Portugal, while even countries with reputations for strong university systems, such as Germany and Finland, are failing to keep pace with the US and UK.

But what does all this mean? Germany has significantly increased university places as a response to the crisis, seemingly without spending per student keeping pace. The UK has increased spending per student. The difference of course is that while higher education is basically free in Germany, the UK has some of the highest university tuition fees in the world. Andreas Schleicher from the OECD said that since there were no comparable data on learning outcomes for different countries, it was difficult to pinpoint whether the large per-student spends in some nations were actually improving quality.

However, according to the THE report, he added that results from the OECD’s international school testing programme – the Programme for International Student Assessment (Pisa) – showed “that there is essentially no relationship between spending per student and school performance once you get beyond a certain threshold in spending”, a point that most OECD countries had already passed.

Of course school performance does not necessarily equate with quality of teaching and learning. But it does suggest that even with the deluge of data we still do not understand how to judge quality – still less how to improve it.

Learning Analytics and the Peak of Inflated Expectations

January 15th, 2017 by Graham Attwell

Has Learning Analytics dropped off the peak of inflated expectations in Gartner’s hype cycle? According to Educause, ‘Understanding the power of data’ is still there as a major trend in higher education, and Ed Tech reports a KPMG survey which found that 41 per cent of universities were using data for forecasting and predictive analytics.

But whilst many universities are exploring how data can be used to improve retention and prevent drop outs, there seems little pretence any more that Learning Analytics has much to do with learning. The power of data has somehow got muddled up with Management Analytics, Performance Analytics and all kinds of other analytics – but the learning seems to have been lost. Data mining is great but it needs a perspective on just what we are trying to find out.

I don’t think Learning Analytics will go into the trough of disillusionment. But I think that there are very real problems in working out how best we can use data – and particularly how we can use data to support learning. Learning Analytics needs to be more solidly grounded in what is already known about teaching and learning. Stakeholders, including teachers, learners and the wider community, need to be involved in the development and implementation of learning analytics tools. Overall, more evidence is needed to show which approaches work in practice and which do not.

Finally, we already know a great deal about formal learning in institutions, or at least by now we should do. Of course we need to work at making it better. But we know far less about informal learning and learning which takes place in everyday living and working environments. And that is where I ultimately see Learning Analytics making a big difference. Learning Analytics could potentially help us all to become self-directed learners and to achieve the learning goals that we set ourselves. But that is a long way off. Perhaps if Learning Analytics is falling off the peak of expectations, that will provide the space for longer-term, more clearly focused research and development.

 

Online disinhibition and the ethics of researching groups on Facebook

April 19th, 2016 by Graham Attwell

There seems to be a whole spate of papers, blogs and reports published lately around MOOCs, Learning Analytics and the use of Labour Market Information. One possible reason is that it takes some time for more considered research to be commissioned, written and published around emerging themes and technologies in teaching and learning. Anyway, I’ve spent an interesting time reading at least some of these latest offerings and will try to write up some notes on what (I think) they are saying and mean.

One report I particularly liked is ‘A murky business: navigating the ethics of educational research in Facebook groups’ by Tony Coughlan and Leigh-Anne Perryman. The article, published in the European Journal of Open, Distance and e-Learning, is based on a reflection of their own experiences of researching in Facebook. And as they point out, any consideration of ethical practices will almost inevitably run foul of Facebook’s Terms and Conditions of Service.

Notwithstanding that issue, they summarise the problems as “whether/how to gain informed consent in a public setting; the need to navigate online disinhibition and confessional activity; the need to address the ethical challenges involved in triangulating data collected from social media settings with data available from other sources; the need to consider the potential impact on individual research participants and entire online communities of reporting research findings, especially when published reports are open access; and, finally, the use of visual evidence and its anonymisation.”

Although obviously the use of social networks, and Facebook in particular, raises its own issues, many of the considerations are more widely applicable to Learning Analytics approaches, especially to using discourse analysis and Social Network Analysis. This discussion came up at the recent EmployID project review meeting. The project is developing a number of tools and approaches to Workplace Learning Analytics, and one idea was that we should attempt to develop a Code of Practice for Learning Analytics in the workplace, similar to the work by Jisc, who have published a Code of Practice for Learning Analytics in UK educational institutions.

As an aside, I particularly liked the section on ‘confessional activity’ and ‘online disinhibition’, based on work by Suler (2004), who identified six factors as prompting people to self-disclose online more frequently or intensely than they would in person:

  • Dissociative anonymity – the fact that ‘when people have the opportunity to separate their actions online from their in-person lifestyle and identity, they feel less vulnerable about self-disclosing and acting out’;

  • Invisibility – overlapping with, but extending beyond, anonymity, physical invisibility ‘amplifies the disinhibition effect’ as ‘people don’t have to worry about how they look or sound when they type a message’ nor about ‘how others look or sound in response to what they say’;

  • Asynchronicity – not having to immediately deal with someone else’s reaction to something you’ve said online;

  • Solipsistic introjection – the sense that one’s mind has become merged with the mind of the person with whom one is communicating online, leading to the creation of imagined ‘characters’ for these people and a consequent feeling that online communication is taking place in one’s head, again leading to disinhibition;

  • Dissociative imagination – a conscious or unconscious feeling that the imaginary characters ‘created’ through solipsistic introjection exist in a ‘make-believe dimension, separate and apart from the demands and responsibilities of the real world’ (Suler, 2004, p.323);

  • The minimization of authority (for people who do actually have some) due to the absence of visual cues such as dress, body language and environmental context, which can lead people to misbehave online.

Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–326. Available from http://www.academia.edu/3658367/The_online_disinhibition_effect [Accessed 10 September 2014]

Rethinking blogging

November 12th, 2014 by Graham Attwell

 

I used to post on this blog almost every day. Lately I haven’t been posting much. I am not worrying too much about it, but have been thinking about why.

I think it is largely to do with changes in my work. In the past, I was primarily a researcher, working on all manner of reports and projects, mostly in the field of elearning and knowledge development. My primary mode of work was desk research: in other words, I read a lot. I can remember, twenty years ago when I first moved to Bremen in Germany, I used to travel about once every four months to the University of Surrey at Guildford (which was the easiest UK university to get to from Gatwick airport). I would spend thirty pounds on a photocopying card, spend an entire day in the university archives and travel back with photocopies of 60 or 70 research papers. I kept these for years before I realised I never looked at them. By 2000, of course, access to research was moving to the web. One of the big changes this heralded was the arrival of grey literature. Interestingly, this term, which was much used at the time, seems to have gone out of fashion, as it has slowly become accepted that web-based materials of all kinds have at least some validity in the research process. So-called grey literature gave access to a wider range of thinking and ideas than could be gained from official journal papers alone, although the debate over how to measure quality is far from resolved.

And to bring this up to date, the emergence of Open Educational Resources, Open Journals and specialist networks like the excellent ResearchGate has increased the discoverability of research ideas and findings.

I used to enjoy the research work. And it was easy to blog. There would always be something in a paper, on a web site, in a network to comment on. I wrote a lot about Personal Learning Networks, a popular subject at the time, and through speaking at conferences and seminars got new ideas for more blog posts. But there were some frustrations to this work. Although we talked a lot about PLEs and the like, it was hard to see much evidence in practice. Our ideas were often just that: ideas which had at best limited evaluation and implementation in the field. Most frustratingly, few of the projects were

A tale of two conferences

September 19th, 2014 by Graham Attwell

In the first week of September, I attended two conferences – the Association for Learning Technology Conference (ALT-C) at Warwick University in the UK and the European Conference on Educational Research (ECER) hosted by Porto University in Portugal.

I guess there were some 500 people at ALT-C. Most seemed to be juggling two devices online at most times. And there were literally thousands of tweets using the ALT-C hashtag. The ECER conference was much bigger, with over 2,600 registered delegates. I didn’t see too many online. And there were very few tweets using the ECER hashtag. It was suggested to me this was because a single hashtag is too broad to encompass the wide range of topics covered in ECER’s different networks. But I don’t think that was the reason. Although for those of us working with technology, online immersion has become a way of life, the culture of educational researchers has not yet embraced such an idea. Of course most – if not all – educational researchers are computer literate, and of course the internet is a key tool for accessing documents and for communication. But for most that is it.

A personal reality check

The challenges of open data: emerging technology to support learner journeys

May 8th, 2014 by Graham Attwell

It’s several years since I have been to the ALT-C conference in the UK. And I have missed the chance to catch up with friends and colleagues working in Technology Enhanced Learning in the UK. One reason I have not been going to the conference is that it usually takes place in September, the high season for conferences, and there always seem to be clashes with something else. The main reason is simple though – the cost. With a conference fee of something over £500, excluding accommodation and travel, without a sponsor it is pretty hard to justify so much expenditure. The saddest thing about that cost is I suspect it excludes many young and emerging researchers, unable to meet the fee from their own pocket and with institutions increasingly limiting conference expenditure. My daughter tells me that, albeit in a different field, her university provides her conference fees of just £500 a year!

Anyway, this year I am lucky enough to have a project to pay and have submitted, together with my colleagues working on the project, the following abstract. You can find out more about the LMIforAll project at www.lmiforall.org.uk

The challenges of open data: emerging technology to support learner journeys

Authors: Attwell, G., Barnes, S-A., Bimrose, J., Elferink, R., Rustemeier, P. & Wilson, R.

Abstract 

People make important decisions about their participation in the labour market every year. This extends from pupils in schools, to students in Further and Higher education institutions and individuals at every stage of their career and learning journeys. Whether these individuals are in transition from education and/or training, in employment and wishing to up-skill, re-skill or change their career, or whether they are outside the labour market wishing to re-enter, high quality and impartial labour market information (LMI) is crucial to effective career decision-making. LMI is at the heart of UK Government reforms of careers service provision.

Linking and opening up careers focussed LMI to optimise access to, and use of, core national data sources is one approach to improving that provision as well as supporting the Open Data policy agenda (see HM Government, 2012). Careers focussed LMI can be used to support people in making better decisions about learning and work and to improve the efficiency of labour markets by helping match supply with demand, and helping institutions in planning future course provision. A major project, funded by the UK Commission for Employment and Skills, is underway led by a team of data experts at the Institute for Employment Research (University of Warwick) with developers and technologists from Pontydysgu and Raycom designing, developing and delivering a careers LMI webportal, known as LMI for All.

The presentation will focus on the challenge of collaborating and collecting evidence at scale between institutions and the social and technological design and development of the database. The database is accessed through an open API, which will be explored during the presentation. Through open competition, developers, including students in FE, have been encouraged to develop their own applications based on the data. Early adopters and developers have developed targeted applications and websites that present LMI in a more engaging way, which are targeted at specific audiences with contrasting needs. The web portal is innovative, as it seeks to link and open up careers focused LMI with the intention of optimising access to, and use of, core national data sources that can be used to support individuals in making better decisions about learning and work. It has already won an award from the Open Data Institute. The presentation will highlight some of the big data and technological challenges the project has addressed. It will also look at how to organise collaboration between institutions and organisations in sharing data to provide new services in education and training. Targeted participants include developers and stakeholders from a range of educational and learning settings. The session will be interactive with participants able to test out the API, provide feedback and view applications.

Reference

HM Government (2012). Open Data White Paper: Unleashing the Potential. Norwich: TSO.
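As an aside for developer readers, a client consuming an open LMI web API of this kind might look something like the sketch below. The base URL, endpoint path and response fields here are illustrative assumptions, not taken from the LMI for All documentation, so check the real API reference before relying on any of them.

```python
import json
from urllib.parse import urlencode

# Hypothetical base URL -- the real LMI for All endpoints may differ.
BASE = "https://api.lmiforall.org.uk/api/v1"

def build_query(endpoint, **params):
    """Compose a GET URL for an (assumed) careers-LMI endpoint."""
    return f"{BASE}/{endpoint}?{urlencode(params)}"

# e.g. search occupations by keyword (the endpoint name is an assumption)
url = build_query("soc/search", q="plumber")

# A made-up JSON payload, just to show the shape a client might parse.
sample = '[{"soc": 5314, "title": "Plumbers and heating engineers"}]'
occupations = json.loads(sample)
titles = [o["title"] for o in occupations]
```

An application would fetch `url` with any HTTP client and feed the JSON body through the same parsing step; the point of an open API like this is that a student developer needs nothing more than this to start building.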

Storytelling with cartoons

February 13th, 2014 by Angela Rees

Always on the lookout for practical ways to use technology in the classroom, Pontydysgu were scoping out new ideas at Bett 2014.

We liked the new Lego storytelling kit. One set gives you a tray of Lego bits; there are minifigs, cats, frogs, brooms, Christmas trees and more. You also get a book of lesson plans and ideas and the accompanying software. There’s also a spinner to help choose a genre or character for storytelling inspiration. The idea is that children work in groups to tell a story. Each group has a kit with enough Lego bits to recreate the same scene five times, only each one is slightly different as their stories progress. They then take photos of their scenes and upload them to a computer, where they can drag and drop the photos into a comic strip style template, add backgrounds and captions, and print their story.

The software is nice and simple to use, the Lego kit has been carefully selected for optimum storyline coverage, and it has the Lego brand – guaranteed to spark some interest in even the most reluctant of storytellers.

Now, here at Pontydysgu we like a good idea, but what we like even more is a free idea. So in the tradition of those catwalk-fashion-at-high-street-prices magazine articles, I bring you “BETT on a budget”.

 

To create your own comic strip you will need:

A collection of small-world-play or dolls house characters and accessories.

A camera/ webcam/ cameraphone with the ability to transfer your photos to a computer.

Internet access.

An app or web based tool for comic strip creation using photographs.

Here are some I’ve been trying out this week:

Web based

Toondoo – Free – You need to create an account, but it is easy to do. Upload photos, edit, cut shapes out and save, then go to the cartoon creator and choose a comic strip layout; you can put your own images into a cartoon, choose a layout template, and drag and drop backgrounds, clip art, callouts and thought bubbles to create a story.

Downloads

Lego StoryStarter software – for creating comics and other styles (newspaper, old manuscript) – £107.99 inc VAT (the whole kit, based on a class of 30, is £779.99 inc VAT)

Comic Life – Cost £11.99 for a single user license or £1,049 for a site license.

Apps for iOS/Android

Comic Touch – Free – From the creators of Comic Life, this app cartoonises one photo at a time, with no comic strip mode, so you would have to print the photos and reassemble them into a comic strip, or download the pictures after editing and then use a different tool to put your story together.

 

 



    News Bites

    Innovation is male dominated?

    Times Higher Education reports that in the UK only one in 10 university spin-out companies has a female founder, analysis suggests. And these companies are much less likely to attract investment too, raising concerns that innovation is becoming too male-dominated.


    Open Educational Resources

    BYU researcher John Hilton has published a new study on OER, student efficacy, and user perceptions – a synthesis of research published between 2015 and 2018. Looking at sixteen efficacy and twenty perception studies involving over 120,000 students or faculty, the study’s results suggest that students achieve the same or better learning outcomes when using OER while saving a significant amount of money, and that the majority of faculty and students who’ve used OER had a positive experience and would do so again.


    Digital Literacy

    A National Survey in Wales in 2017-18 showed that 15% of adults (aged 16 and over) in Wales do not regularly use the internet. However, this figure is much higher (26%) amongst people with a limiting long-standing illness, disability or infirmity.

    A new Welsh Government programme has been launched which will work with organisations across Wales, in order to help people increase their confidence using digital technology, with the aim of helping them improve and manage their health and well-being.

    Digital Communities Wales: Digital Confidence, Health and Well-being, follows on from the initial Digital Communities Wales (DCW) programme which enabled 62,500 people to reap the benefits of going online in the last two years.

    See here for more information


    Zero Hours Contracts

    Figures from the UK Higher Education Statistics Agency show that in total almost 11,500 people – both academics and support staff – working in universities on a standard basis were on a zero-hours contract in 2017-18, out of a total staff head count of about 430,000, reports the Times Higher Education. A zero-hours contract means the employer is not obliged to provide any minimum working hours.

    Separate figures that only look at the number of people who are employed on “atypical” academic contracts (such as people working on projects) show that 23 per cent of them, or just over 16,000, had a zero-hours contract.


