Archive for the ‘AI’ Category

Careers identities in the Lockdown

March 30th, 2020 by Graham Attwell

Graham Attwell will be speaking at an online webinar – LiveCareerChat@Lockdown – on 6 April. The webinar, organised by DMH Associates, will focus on the future challenges for careers identities and careers advice and guidance.

Deirdre Hughes says “During these turbulent times, we all have an opportunity for reflection, sharing ideas and offering practical advice on how best to manage career identity and changing work practices. This webinar is designed to bring people together and to listen and/or share experiences of careers support mechanisms at a time of crisis. ”

Graham Attwell will talk about the changing international labour markets and the challenges of new technologies, including AI and automation.

The webinar takes place at 16.30 – 17.30 CEST on Monday 6 April and is free. You can register at https://dmhassociates.easywebinar.live/event-registration-3

The future of work, Artificial Intelligence and automation: Innovation and the Dual Vocational Education and training system

March 2nd, 2020 by Graham Attwell


I am speaking at a seminar on Vocational Education and Training’s Role in Business Innovation at the Ramon Areces Foundation in Madrid tomorrow. The title of my presentation is ‘The future of work, Artificial Intelligence and automation: Innovation and the Dual Vocational Education and training system in Valencia’ which is really much too long for a title and I have much too much to say for my allotted 20 minutes.

Anyway, this is what I told them I was going to talk about:
The presentation looks at the future of work, linked to the challenges of Artificial Intelligence, automation and the new green economy. It considers and discusses the various predictions on future jobs and occupations from bodies including CEDEFOP, the OECD and the World Bank. It concludes that although some new jobs will be created and some occupations displaced by new technologies, the greatest impact will be in terms of the tasks performed within jobs. It further discusses future skills needs, including the need for higher level cognitive competences as well as the demand for so-called lower skilled work in services and the caring professions.
It considers the significance of these changes for vocational education and training, including the need for new curricula, and increased provision of lifelong learning and retraining for those affected by the changing labour market.
Artificial Intelligence may also play an important role in the organisation and delivery of vocational education and training. This includes the use of technologies such as machine learning and Natural Language Processing for learner engagement, recruitment and support, Learning Analytics and ‘nudge learning’ through a Learning Record Store, and the creation and delivery of learning content. It provides examples such as the use of chatbots in vocational education and training schools and colleges. It is suggested that the use of AI technologies can allow a move from summative assessment to formative assessment. The use of these technologies will reduce the administrative load for teachers and trainers and allow them to focus on coaching, particularly benefiting those at the top and lower ends of the student cohort.
To benefit from this potential will require new and enhanced continuing professional development for teachers and trainers. Finally, the presentation considers what this signifies for the future of the Dual VET system in Spain, looking at findings from both European projects and research undertaken into Dual training in Valencia.
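As a concrete illustration of the ‘nudge learning’ idea mentioned above: a Learning Record Store typically ingests xAPI statements describing learner activity, which analytics rules can then use to trigger nudges. The sketch below just constructs such a statement; the learner, activity and verb IDs are invented examples, not from any real system.

```python
# Minimal sketch of an xAPI-style statement of the kind a Learning
# Record Store (LRS) ingests. All names and IDs are invented examples.

def make_statement(learner_email, activity_id, verb="completed"):
    """Build an xAPI-style statement dict for sending to an LRS."""
    return {
        "actor": {"mbox": f"mailto:{learner_email}", "objectType": "Agent"},
        "verb": {
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",
            "display": {"en-GB": verb},
        },
        "object": {"id": activity_id, "objectType": "Activity"},
    }

statement = make_statement("learner@example.org",
                           "http://example.org/vet/welding-module-1")
# An analytics rule might then nudge learners who have *not* yet
# produced a 'completed' statement for module 1.
print(statement["verb"]["id"])
```

In practice such a statement would be POSTed to the LRS endpoint; the point here is only the shape of the activity record that nudge rules run over.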
And I will report back here after the event.

AI and the future of Education

February 20th, 2020 by Graham Attwell

More as promised in my last post from the interviews we are doing on AI and Education.

One implication of AI and automation is changes in curriculum content and pedagogy. I talked with Chris Percy about this.

Chris pointed out that for school leavers, qualifications in maths and English at GCSE level are a requirement even for vocational students, and he thinks this is unlikely to change. However, he thinks that programmes in these subjects will move to adaptive personal learning environments.

Furthermore, he says the flipped classroom model will change the role of teachers. “It has proved impossible to improve the staff student ratio – general courses have 20 – 40 students or 7 to 10 on niche courses. This needs 3 / 4 way differentiation. Teachers are more conductors than coaches.” However, Chris added a caveat – research suggests the flipped classroom model has limits. “It only really works for those who want to learn. It is possible that adults know what they want to learn but lack the motivation for self learning. Peers and teachers are important for extrinsic motivation. Disengaged teenagers are frequently not sufficiently motivated. Self taught learning, even with a mentor, will only go so far.” Chris also says that learning has a social element and questions whether avatars can really replace the social role played by teachers. As he points out, generalised AI is still out of reach. “Chatbots cannot replace teachers at the front of a classroom. Students will have no respect for a chatbot. Teachers are skilled in developing engagement. Chatbots are good for students with a base level of motivation.”

The issue of motivation has come up in most of the interviews I have undertaken as part of the AI and Vocational Education and Learning project. I will talk more about this in a short podcast this weekend, talking about my experiences as a language learner using the popular and heavily gamified Duolingo application.

 

AI, automation, the future of work and vocational education and training

February 17th, 2020 by Graham Attwell

Regular readers will know I am working on a project on AI and Vocational Education and Training (VET). We are looking both at the impact of AI and automation on work and occupations and the use of AI for teaching and learning. Later in the year we will be organizing a MOOC around this: at the moment we are undertaking interviews with teachers, trainers, managers and developers (among others) in Italy, Greece, Lithuania, Germany and the UK.

The interviews are loosely structured around five questions:

  • What influence do you think AI and automation is going to have on occupations that you or your institution provide training for?
  • Do you think AI is going to affect approaches to teaching and learning? If so, could you tell us how?
  • Have you or your institution any projects based around AI? If so, could you tell us about them?
  • How can curricula be updated quickly enough to respond to the introduction of AI?
  • Do you think AI and automation will result in fewer jobs in the future, or will they generate new jobs? If so, what do you think the content of those jobs will be?

Of course it depends on the work role and interests of the interviewee as to which questions are most discussed. And rather than an interview, with the people I have talked with it tends to be more of a discussion.

While the outcomes of this work will be published in a report later this spring, I will publish here some of the issues which have come up.

Last week I talked with Chris Percy, who describes himself as a business strategy consultant and economist.

Chris sees AI and technology as driving an increasing pace of change in how work is done. He says the model for vocational education is to attend college to get skills and enter a trade for ten or twenty years – albeit with refreshers and licenses to update knowledge. This, he says, has been the model for the last 50 years, but it may not hold if knowledge is so fast changing. He is not an AI evangelist and thinks changes feed through more slowly. With this change, new models for vocational education and training are needed, although what that model might be is open. It could be to spend one year learning in every seven years, or one day a week for three months every year.

The main issue for VET is not how to apply AI but how we structure jobs, Lifelong Learning and pedagogy.

One problem, at least in the UK, has been a reduction in the provision of lifelong learning. In this he sees a disconnect between policy and the needs of the economy. But it may also be that if change is slower than the discourse suggests, it just has not had an impact yet. Tasks within a job are changing rather than jobs as a whole. We need to update knowledge for practices we do not yet have. A third possible explanation is that although there are benefits from new technologies and work processes, the benefits from learning are not seen as important enough to drive the provision of new skills.

New ways of learning are needed – responsive learning based on AI could help here – but there is not enough demand to overcome inertia. The underpinning technologies are there but have not yet translated into schools to benefit retraining.

Relatively few jobs will disappear in their entirety – but a lot of jobs in logistics, front of store, restaurants and so on will be transformed. It could be that there will be a lower tier of services based on AI and automation and a higher tier with human provision. Regulators can inhibit the pace of change – which is uneven in different countries and cities, e.g. self-driving cars.

In most of the rest of the economy, people will change as tasks change. For example, digital document search in the legal industry has traditionally been done by students, interns and paralegals because someone has to do it – now that due diligence is AI enabled, students can progress faster to more interesting parts of the work.

Chris thinks that although AI and automation will impact on jobs, global economic developments will still be a bigger influence on the future of work.

More from the interviews later this week. In the meantime, if you would like to contribute to the research – or just would like to contribute your ideas – please get in touch.

 

 

Changing the role of Assessment

February 11th, 2020 by Graham Attwell

Formative assessment should play a key role in all education and particularly in vocational education and training. Formative assessment can give vital feedback to learners and guidance in the next steps of their learning journey. It can also help teachers to know what is effective and what is not, and where the gaps are, and help in planning learning interventions.

Yet all too often it does not. Assessment is all too often seen at best as something to overcome and at worst as a stress-inducing nightmare. With new regulations in England requiring students in further education to pass tests in English and mathematics, students are condemned to endlessly retaking the same exams regardless of their achievement in vocational subjects.

For all these reasons a new report published by Jisc today is very welcome.

Jisc say:

Existing and emerging technologies are starting to play a role in changing assessment and could help address these issues, both today and looking further ahead into the future, to make assessment smarter, faster, fairer and more effective.

The report sets five targets for the next five years to progress assessment towards being more authentic, accessible, appropriately automated, continuous and secure.

  • Authentic: Assessments designed to prepare students for what they do next, using technology they will use in their careers

  • Accessible: Assessments designed with an accessibility-first principle

  • Appropriately automated: A balance found of automated and human marking to deliver maximum benefit to students

  • Continuous: Assessment data used to explore opportunities for continuous assessment to improve the learning experience

  • Secure: Authorship detection and biometric authentication adopted for identification and remote proctoring
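The ‘appropriately automated’ target above can be pictured as a simple routing step: objective items are machine-scored while free-text answers are queued for human judgement. A minimal sketch, with an invented question format for illustration:

```python
# Sketch of mixed automated/human marking: auto-score multiple-choice
# items, queue free-text responses for a human marker.

def triage_marking(responses):
    """Split responses into machine-scored results and a human queue."""
    auto_scores, human_queue = {}, []
    for qid, resp in responses.items():
        if resp["type"] == "mcq":
            auto_scores[qid] = 1 if resp["answer"] == resp["key"] else 0
        else:  # free text: needs human judgement
            human_queue.append(qid)
    return auto_scores, human_queue

responses = {
    "q1": {"type": "mcq", "answer": "b", "key": "b"},
    "q2": {"type": "mcq", "answer": "a", "key": "c"},
    "q3": {"type": "free_text", "answer": "Assessment should give feedback..."},
}
auto, queue = triage_marking(responses)
print(auto, queue)  # {'q1': 1, 'q2': 0} ['q3']
```

The design point is that automation handles the mechanical part so human marking time goes where judgement is actually needed – the ‘balance’ the Jisc target describes.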

The report: ‘The future of assessment: five principles, five targets for 2025’ can be downloaded from the Jisc website.

 

Good jobs, bad jobs, skills and gender

February 3rd, 2020 by Graham Attwell

I have written before about the issues of interpreting and making sense of labour market data, and the difference between Labour Market Information and Labour Market Intelligence.

This is exposed dramatically in the article in Social Europe by German Bender entitled ‘The myth of job polarisation may fuel populism’. As German explains “It has become conventional wisdom since the turn of the century that labour markets are rapidly becoming polarised in many western countries. The share of medium-skilled jobs is said to be shrinking, while low- and high-skilled jobs are growing in proportion.” But as German points out: “In a research report published last May by the Stockholm-based think tank Arena Idé, Michael Tåhlin, professor of sociology at the Swedish Institute for Social Research, found no job polarisation—rather, a continuous upgrading of the labour market.”

German goes on to explain:

The main reason is that the research, as is to be expected from studies rooted in economics, has used wages as a proxy for skills: low-paying jobs are taken to be low-skilled jobs and so on. But there are direct ways of measuring skill demands in jobs, and Arena Idé’s report is based on a measure commonly used in sociology—educational requirements as classified by the International Labour Organization’s ISCO (International Standard Classification of Occupations) scheme. Using this methodology to analyse the change in skill composition yields strikingly different results for the middle of the skill distribution.

The study found that while jobs with relatively low skill demands but relatively high wages—such as factory and warehouse workers, postal staff and truck drivers—have diminished, others with the same or slightly higher skill demands but lower wages—nursing assistants, personal-care workers, cooks and kindergarten teachers—have increased.

The reason is that the former jobs are male dominated, whilst the jobs which have grown have a majority of female workers. Research in most countries has shown that women – and jobs in which women are the majority – are paid less than men, regardless of skills levels.

“Put simply”, says German: “wages are a problematic way to measure skills, since they clearly reflect the discrimination toward women prevalent in most, if not all, labour markets across the world.”
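The difference between the two proxies is easy to see with toy numbers. In the sketch below the occupation names echo the article's examples, but the wages and skill levels are entirely invented for illustration:

```python
# Toy illustration: classifying the same occupations by wage versus by
# educational requirement (an ISCO-style skill level) gives different
# pictures of the 'middle' of the labour market. All numbers invented.

occupations = {
    # name: (monthly_wage, skill_level 1-4)
    "warehouse worker":     (2800, 2),
    "truck driver":         (3000, 2),
    "nursing assistant":    (2200, 2),
    "personal-care worker": (2100, 2),
    "software engineer":    (5200, 4),
    "cleaner":              (1800, 1),
}

def band_by_wage(wage):
    """Wage-as-proxy banding (the approach the article criticises)."""
    return "low" if wage < 2500 else "medium" if wage < 4000 else "high"

def band_by_skill(level):
    """Banding by educational requirement instead of wage."""
    return "low" if level == 1 else "medium" if level == 2 else "high"

for name, (wage, level) in occupations.items():
    w, s = band_by_wage(wage), band_by_skill(level)
    if w != s:
        print(f"{name}: wage proxy says {w}, skill requirement says {s}")
```

With these made-up figures the two female-dominated occupations drop out of the ‘middle’ under the wage proxy, while the banding by educational requirement keeps them there – the mechanism behind the diverging polarisation findings German describes.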

A further review of two British studies from 2012 and 2013, showed a change in the composition, but not the volume, of intermediate-level jobs. “Perhaps the most important conclusion”, German says “was that ‘the evidence shows that intermediate-level jobs will remain, though they are changing in nature’.”

The implications of this interpretation of the data are profound. If lower and medium skilled jobs are thought to be declining, there is little incentive to invest in vocational education and training for those occupations. Furthermore, young people may be put off entering such careers, and careers advisers may further mislead school leavers.

There has been a trend in many European countries towards higher level apprenticeships, rather than providing training with the skills needed to enter such medium skilled jobs. But even a focus on skills, rather than wages, may also be misleading. It is interesting that jobs such as social care and teaching appear more resistant to automation and job replacement from technologies such as Artificial Intelligence. But those who argue that we should be teaching so-called soft skills such as team building, empathy and communication are talking about the very skills increasingly demanded in the female dominated low and middle skilled occupations. It may be that we need not only to look again at how we move away from wages as a proxy for skills, but also at how we measure skills.

German references research by Daniel Oesch and Giorgio Piccitto, who studied occupational change in Germany, Spain, Sweden and the UK from 1992 to 2015, characterising good and bad jobs according to four alternative indicators: earnings, education, prestige and job satisfaction.

They concluded that occupations with high job quality showed by far the strongest job growth, whereas occupations with low job quality showed weak growth regardless of indicator used.


Does AI mean we no longer need subject knowledge?

January 15th, 2020 by Graham Attwell

I am a little bemused by the approach of many of those writing about Artificial Intelligence in education to knowledge. The recently released Open University Innovation Report, Innovating Pedagogy, is typical in that respect.

“Helping students learn how to live effectively in a world increasingly impacted by AI also requires a pedagogy”, they say, “that, rather than focusing on what computers are good at (e.g. knowledge acquisition), puts more emphasis on the skills that make humans uniquely human (e.g. critical thinking, communication, collaboration and creativity) – skills in which computers remain weak.”

I have nothing against critical thinking, collaboration or creativity, although I think these are hard subjects to teach. But I find it curious that knowledge is being downplayed on the grounds that computers are good at it. Books have become very good at knowledge over the years but it doesn’t mean that humans have abandoned it to the books. What is striking though is the failure to distinguish between abstracted and applied knowledge. Computers are very good at producing (and using) information and data. But they are not nearly as good at applying that knowledge in real world interactions. Computers (in the form of robots) will struggle to open a door. Computers may know all about the latest hair styles but I very much doubt that we will be trusting them to cut our hair in the near future. But of course, the skills I am talking about here are vocational skills – not the skills that universities are used to teaching.

As opposed to the emergent Anglo-Saxon discourse around “the skills that make humans uniquely human”, in Germany the focus on Industry 4.0 is leading to an alternative idea. There, AI and automation are seen as requiring new and higher levels of vocational knowledge and skills in areas such as the preventative maintenance of automated production machinery. This seems to me a far more promising area of development. The problem, I suspect, for education researchers in the UK is that they have to start thinking about education outside the sometimes rarefied world of the university.

Equally, I do not agree with the report's assertion that most AI applications for education are student-facing and designed to replace some existing teacher tasks. “If this continues”, they say, “while in the short run it might relieve some teacher burdens, it will inevitably lead to teachers becoming side-lined or deprofessionalised. In this possible AI-driven future, teachers will only be in classrooms to facilitate the AI to do the ‘actual’ teaching.”

The reality is that there is an increasing number of AI applications which assist teachers rather than replace them – applications that allow teachers to get on with their real job of teaching and supporting learning, rather than undertaking an onerous load of administration. There is no evidence of the inevitability of teachers being either sidelined or deprofessionalised. And those experiments from Silicon Valley trying to ‘disrupt’ education by a move to purely online and algorithm-driven learning have generally been a big failure.

 

 

Artificial Intelligence, ethics and education

January 2nd, 2020 by Graham Attwell

I guess we are going to be hearing a lot about AI in education in the next year. As regular readers will know, I am working on a European Commission Erasmus Plus project on Artificial Intelligence and Vocational Education and Training. One subject which is constantly appearing is the issue of ethics. Apart from UK universities' requirements for ethical approval of research projects (more about this in a future post), the issue of ethics rarely appears in education as a focus for debate. Yet it is all over the discussion of AI and how we can or should use AI in education.

There is an interesting (and long) blog post – ‘The Invention of “Ethical AI”’ – recently published by Rodrigo Ochigame on the Intercept website.

Ochigame worked as a graduate student researcher on AI ethics in the group of Joichi Ito, the former director of the MIT Media Lab. He left in August last year, immediately after Ito published his initial “apology” regarding his ties to Epstein, in which he acknowledged accepting money from the disgraced financier both for the Media Lab and for Ito's outside venture funds.

The quotes below provide an outline of his argument, although for anyone interested in this field the article merits a full read.

The emergence of this field is a recent phenomenon, as past AI researchers had been largely uninterested in the study of ethics

The discourse of “ethical AI,” championed substantially by Ito, was aligned strategically with a Silicon Valley effort seeking to avoid legally enforceable restrictions of controversial technologies.

This included working on

the U.S. Department of Defense’s “AI Ethics Principles” for warfare, which embraced “permissibly biased” algorithms and which avoided using the word “fairness” because the Pentagon believes “that fights should not be fair.”

corporations have tried to shift the discussion to focus on voluntary “ethical principles,” “responsible practices,” and technical adjustments or “safeguards” framed in terms of “bias” and “fairness” (e.g., requiring or encouraging police to adopt “unbiased” or “fair” facial recognition).

it is helpful to distinguish between three kinds of regulatory possibilities for a given technology: (1) no legal regulation at all, leaving “ethical principles” and “responsible practices” as merely voluntary; (2) moderate legal regulation encouraging or requiring technical adjustments that do not conflict significantly with profits; or (3) restrictive legal regulation curbing or banning deployment of the technology. Unsurprisingly, the tech industry tends to support the first two and oppose the last. The corporate-sponsored discourse of “ethical AI” enables precisely this position.

the corporate lobby’s effort to shape academic research was extremely successful. There is now an enormous amount of work under the rubric of “AI ethics.” To be fair, some of the research is useful and nuanced, especially in the humanities and social sciences. But the majority of well-funded work on “ethical AI” is aligned with the tech lobby’s agenda: to voluntarily or moderately adjust, rather than legally restrict, the deployment of controversial technologies.

I am not opposed to the emphasis being placed on ethics in AI and education, and the debate and practice around Learning Analytics show the need to think clearly about how we use technology. But we have to be careful, firstly, that we do not just end up paying lip service to ethics and, secondly, that academic research does not become a cover for the practices of the ed tech industry. Moreover, I think we need a clearer understanding of just what we mean when we talk about ethics in the educational context. For me the two biggest ethical issues are the failure to provide education for all and the gross inequalities in educational provision based on things like class and gender.

 

Readings on AI and Education

November 18th, 2019 by Graham Attwell

In an early activity in our new project on Artificial Intelligence in Vocational Education and Training, we are undertaking a literature review. Although there seems to be little about AI and VET, the issue of AI in education is this year's hot trend. Of course, there seems to be more talk than actual practice. Anyway, here is a quick summary (just notes really) of things I stumbled on last week.

Perhaps most interesting was an online webinar organised by the European Distance Education Network (EDEN) as part of European Distance Learning Week. According to the online platform there were 49 of us present and four presentations. Sadly the recording is not yet available, but I will link to it once it is online. What was most interesting was that almost everyone who spoke – and I recognised quite a few prominent researchers among the contributors – was pretty much opposed to AI. Too dangerous, no benefit, just hype, developers with no idea about learning, etc. Really only one speaker, Alexandra Cristea from Durham University, could see potential.

I found the following publication by her. Demographic Indicators Influencing Learning Activities in MOOCs: Learning Analytics of FutureLearn Courses (PDF), by Alexandra I. Cristea and Lei Shi from Liverpool University, looks at pre-course survey data and online learner interaction data collected from two MOOCs delivered by the University of Warwick in 2015, 2016 and 2017. The data is used to explore how learner demographic indicators may influence learner activities. Recommendations for educational information system development and instructional design, especially when a course attracts a diverse group of learners, are provided.

Meanwhile in the UK, NESTA are continuing to promote AI. However, they too emphasise ethical issues with the use of the technology. In ‘Educ-AI-tion rebooted? Exploring the future of artificial intelligence in schools and colleges’ they say:

Although challenges for the ethical and responsible use of artificial intelligence and the sharing of data are common to many sectors, schools and colleges present a distinct combination of properties and considerations. The sharing of data needs to be governed in a manner that realises benefit for the public, and AIEd must be used ethically and responsibly.

AIEd's potential and risks are reflected in the views of parents. 61% of parents anticipate that AI will be fairly or very important to the classroom of the near future. However, many are fairly or very concerned about consequences of determinism (77%), accountability (77%) and privacy and security (73%).

Finally, I had a look at the X5GON project website. X5GON is a large scale European research project, bringing together a number of leading European universities. It appears to be developing AI-driven tools, particularly focused on Open Educational Resources. The project website says:

This new AI-driven platform will deliver OER content from everywhere, for the students’ need at the right time and place. This learning and development solution will use the following solutions to accomplish this goal:

  • Aggregation: It will gather relevant content in one place, from the projects case studies as well as external providers and other preferred resources.
  • Curation: AI and machine learning will be key to curate relevant and contextual content for students at the right time and point of need.
  • Personalization: It will make increasingly personalized recommendations for learning content to suit students’ needs, based on the analysis of relevant factors.
  • Creation: Large, small and medium-sized universities have tacit knowledge that can be unlocked and re-used. This approach will allow any organization to release and build their own content libraries quickly and conveniently to share with the world and vice versa.
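At their simplest, the curation and personalisation steps described above amount to matching content to a learner profile. The sketch below is not X5GON's actual method – just an illustrative bag-of-words cosine-similarity recommender with invented resources:

```python
# Minimal sketch of content recommendation by cosine similarity over
# bag-of-words vectors. Not X5GON's actual algorithm; for illustration.
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(learner_interests, resources):
    """Rank resource titles by similarity to a learner-interest string."""
    profile = Counter(learner_interests.lower().split())
    scored = [(cosine(profile, Counter(text.lower().split())), title)
              for title, text in resources.items()]
    return [title for score, title in sorted(scored, reverse=True) if score > 0]

resources = {  # invented OER descriptions
    "Welding basics": "introduction to welding safety and metal joints",
    "Care skills": "communication and empathy in personal care work",
    "Python intro": "programming basics in python for beginners",
}
print(recommend("welding and metal work", resources))
```

A real platform would of course work from richer signals than keyword overlap (viewing history, transcripts, cross-lingual models), but the ranking-by-relevance shape is the same.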

I’ll keep writing up my findings, in the form of notes on this site. And if anyone has any recommendations of what else I should be reading please add in the comments below.

AI needs diversity

November 6th, 2019 by Graham Attwell

As promised, another AI post. One of the issues we are looking at in our project on AI and education is that of ethics. It seems to me that the tech companies have set up all kinds of ethical frameworks, but I am not sure about the ethics! They seem to be trying to allay fears that the robots will take over: this is not a fear I share. I am far more worried about what humans will do with AI. In that respect I very much like this TEDxWarwick talk by Kriti Sharma.

She says AI algorithms make important decisions about you all the time – like how much you should pay for car insurance or whether or not you get that job interview. But what happens when these machines are built with human bias coded into their systems? Kriti Sharma explores how the lack of diversity in tech is creeping into our AI, offering three ways we can start making more ethical algorithms. I wonder, too, how much the lack of diversity in educational technology is holding back opportunities for learning.
