Archive for the ‘AI’ Category

Understanding the changing Covid-19 labour market

August 26th, 2020 by Graham Attwell

geralt (CC0), Pixabay

Yesterday I attended a webinar organized by the UK Association of Colleges in their Labour Market Observatory Series. The subject of the webinar was Using Job Posting Analytics to understand the changing Covid-19 labour market.

Understanding labour markets is a hard job at the best of times, and the Covid-19 pandemic and the resulting lockdown have disrupted the economy with unprecedented speed and scale. As Duncan Brown, Senior Economist at Emsi, explained, traditional labour market statistics take time to emerge, especially for understanding what is going on at regional and local level, so real-time indicators become all-important. He talked through what Emsi's Job Posting Analytics – derived from collecting (or scraping) around 200,000 new, unique job postings from job boards across the internet every week – can tell us about where and how the labour market is changing and what to look for as we move into the recovery.

First, though, he explained how the data is collected using bots before being cleaned and de-duplicated, prior to using algorithms to analyse it. He pointed out that there are limitations to data derived from job adverts, but compared with the time taken for official labour market data to emerge, for instance through the UK Office for National Statistics Labour Force Survey (LFS), job posting analytics can provide an almost real-time snapshot of the labour market, and is easily projected at a local level.
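
Emsi has not published the details of this pipeline, but the de-duplication step can be illustrated with a minimal sketch: normalise a few key fields of each scraped posting and hash them, so that the same advert collected from several job boards collapses into one record. All field and function names below are illustrative assumptions, not Emsi's actual schema.

```python
import hashlib
import re

# Hypothetical postings scraped from job boards; fields are illustrative only.
postings = [
    {"title": "Software Engineer", "company": "Acme Ltd", "location": "Leeds"},
    {"title": "Software  Engineer ", "company": "ACME Ltd.", "location": "Leeds"},
    {"title": "Care Assistant", "company": "Northern Care", "location": "Bolton"},
]

def normalise(text: str) -> str:
    """Lower-case, strip punctuation and collapse whitespace."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def fingerprint(posting: dict) -> str:
    """Hash the normalised title/company/location so near-identical adverts
    collected from different boards collapse into one record."""
    key = "|".join(normalise(posting[f]) for f in ("title", "company", "location"))
    return hashlib.sha1(key.encode()).hexdigest()

unique = {fingerprint(p): p for p in postings}
print(f"{len(postings)} scraped postings -> {len(unique)} unique")
```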

My notes on the webinar are somewhat patchy but here are a few take home points, particularly from a question and answer session that followed Duncan Brown’s presentation.

There was a huge fall in online job adverts in April and May with the lockdown – as high as 80 per cent in some sectors and localities. Since then there has been a steady recovery in the number of jobs being advertised online but this recovery is uneven between different sectors and different cities and regions.

For example, offers of employment in the food and hospitality industries remain dire, and aerospace is also still badly hit. On the other hand, job advert volumes in manufacturing have substantially recovered and, perhaps understandably, there is an increase in job adverts in health care.

There are considerable differences in how far the volume of job adverts has recovered (or otherwise) in different cities. In general, it would appear that those cities with the largest percentage of office work and of commuters are doing worse: London in particular.

One area of the labour market that Emsi is focusing on is skills demand. They have developed their own skills directory, which Duncan Brown said now contains over 3,000 skills, and they are running a project funded by Nesta to see if these skills can be clustered around different occupations. Yet despite the so-called pivot to skills, he said there were few signs that employers were moving away from the traditional emphasis on qualifications. However, qualification demands often did not appear in job adverts but rather tended to be assumed by both employers and job applicants. For instance, someone applying for a job as an accountant would presume that they needed formal qualifications.
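
Neither Emsi's nor Nesta's methodology is described in the webinar notes, but one common way to cluster skills around occupations is to count how often skills co-occur in the same advert and group those that co-occur frequently. The toy sketch below does this with a crude threshold and union-find grouping; the adverts, skills and threshold are all invented for illustration.

```python
from collections import defaultdict
from itertools import combinations

# Toy adverts with extracted skill tags; purely illustrative, not Emsi's taxonomy.
adverts = [
    {"python", "sql", "machine learning"},
    {"sql", "python", "data visualisation"},
    {"welding", "fabrication", "blueprint reading"},
    {"fabrication", "welding", "health and safety"},
]

# Count how often each pair of skills appears in the same advert.
co_occurrence = defaultdict(int)
for skills in adverts:
    for a, b in combinations(sorted(skills), 2):
        co_occurrence[(a, b)] += 1

# Group skills that co-occur at least twice (a crude stand-in for clustering);
# skills below the threshold remain as singleton clusters.
MIN_PAIRS = 2
parent = {}

def find(x):
    parent.setdefault(x, x)
    if parent[x] != x:
        parent[x] = find(parent[x])
    return parent[x]

def union(a, b):
    parent[find(a)] = find(b)

for (a, b), count in co_occurrence.items():
    if count >= MIN_PAIRS:
        union(a, b)

clusters = defaultdict(set)
for skill in {s for ad in adverts for s in ad}:
    clusters[find(skill)].add(skill)

for members in clusters.values():
    print(sorted(members))
```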

Although there have long been predictions about the impact of automation and AI on employment, Duncan Brown said there was little evidence of this. His feeling is that, at least in the UK, the existence of relatively cheap labour in many sectors where it would be relatively easy to automate tasks was a disincentive to the necessary investment. He thought that labour costs may have been kept down by immigration. He pointed to car washes as an example of an area where, far from advancing, automation had actually gone backwards.

The slides from the presentation and a recording of the webinar will be available from 27 August on the Association of Colleges website.

 

AI and Algorithms: the UK examination debacle

August 20th, 2020 by Graham Attwell

This article was originally published on the Taccle AI web site.

There’s a lot to think about in the ongoing debacle over exam results in the UK. A quick update for those who have not been following the story. Examinations for young people, including the GCSE and A level academic exams and the more vocationally oriented BTECs, were cancelled this year due to the Covid-19 pandemic. Instead, teachers were asked firstly to provide an estimated grade for each student in each subject and secondly to rank order the students in their school.

These results were sent to a central government agency, the Office of Qualifications and Examinations Regulation, known as Ofqual. But instead of awarding qualifications based on the teachers’ predicted grades, Ofqual decided, seemingly in consultation with, or more probably under pressure from, the government, to use an algorithm to calculate grades. This was based largely on the previous results achieved by the school in each subject, with adjustments made for small class cohorts and according to the rankings.
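
Tom Haines's analysis (linked at the end of this post) describes the real model in detail; as a rough illustration of the core idea only, the sketch below simply walks down the teacher's rank order and fills the school's historical grade distribution, ignoring Ofqual's adjustments for small cohorts and prior attainment. The data and function are hypothetical.

```python
def allocate_grades(ranked_students, historical_distribution):
    """ranked_students: list of names, best first.
    historical_distribution: share of the school's past entrants awarded
    each grade, best grade first, summing to 1.0."""
    n = len(ranked_students)
    results = {}
    position = 0
    for grade, share in historical_distribution:
        quota = round(share * n)
        for student in ranked_students[position:position + quota]:
            results[student] = grade
        position += quota
    # Any students left over through rounding get the lowest grade in the table.
    lowest = historical_distribution[-1][0]
    for student in ranked_students[position:]:
        results[student] = lowest
    return results

# Invented example: a school that historically awarded 20% A, 30% B, 30% C, 20% D.
history = [("A", 0.2), ("B", 0.3), ("C", 0.3), ("D", 0.2)]
ranking = ["Asha", "Ben", "Carys", "Dev", "Efa", "Finn", "Gwen", "Hari", "Iona", "Jac"]
print(allocate_grades(ranking, history))
```

Note that the teachers' predicted grades play no part in this sketch: only the ranking and the school's past distribution determine the outcome, which is exactly why individual results could diverge so sharply from expectations.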

The results from the A levels were released last week. They showed massive irregularities at an individual level, with some students seemingly downgraded from a predicted A* (the highest grade) to a C or D. Analysis also showed that students from expensive private schools tended to do better than expected, whilst students from state schools in working class areas did proportionately worse than predicted. In other words, the algorithm was biased.

As soon as the A level results were announced there were protests from teachers, schools and students. Yet the government stuck to its position, saying there would be no changes. The Prime Minister, Boris Johnson, said: “Let’s be in no doubt about it, the exam results we’ve got today are robust, they’re good, they’re dependable for employers”. However, concern quickly grew about the potential for numerous appeals, and indeed about the time it would take teachers to prepare such appeals. Meanwhile the Scottish government (which is autonomous in education policy) announced that it would revert to the teachers’ predicted grades. In England, while the government stood firm, demonstrations by school students broke out in most cities. By the weekend it was clear that something had to change, and on Monday the UK government, responsible for exams in England, announced that it too would respect teacher predicted grades.

The political fallout goes on. The government is trying to shift the blame to Ofqual, despite clear evidence that it knew what was happening. Meanwhile some of the universities which rely on the grades to decide who to offer places to are massively oversubscribed as a result of the upgrades.

So, what does this mean for the use of AI in education? One answer may be that there needs to be careful thinking about how data is collected and used. As one newspaper columnist put it at the weekend, “shit in, shit out”. Essentially the data used was the exam results of students at a collective school level in previous years. This has little or no relevance to how an individual student might perform this year. In fact, the algorithm was designed not with the purpose of awarding an appropriate grade to reflect each student’s learning and work, but of preventing what is known as grade inflation: increasing numbers of students getting higher grades each year, which the government sees as a major problem.

But this in turn has sparked off a major debate, with suspicions that the government does in fact support a bias in results, aiming to empower the elite to attend university with the rest heading for second-class vocational education and training provision. It has also been pointed out that the Prime Minister’s senior advisor, Dominic Cummings, has in the past written articles appearing to suggest that upper class students are inherently more intelligent than those from the working class.

The algorithm, although blunt in terms of impact, merely replicated processes that have been followed for many years (and certainly preceded big data). Many years ago, I worked as a project officer for the Welsh Joint Education Committee (WJEC), the examination board for Wales. At that time there were quite a number of recognised examination boards, although since then the number has been reduced by mergers. I was good friends with a senior manager in the exam board, and he told me that every year, about a week before the results were announced, each exam board shared their results, including the number of students to be awarded each grade. The results were then adjusted to fit the figures that the boards had agreed to award that year.

And this gets to the heart of the problems with the UK assessment system. Of course, one issue is the ridiculous importance placed on formal examinations. But it also reflects the approach to assessment. Basically, there are three assessment systems. Criterion-based assessment means that any student achieving a set criterion is awarded accordingly. Ipsative assessment assesses achievement against the individual’s own previous performance. But in the case of UK national exams the system followed is norm referenced, which means that a norm is set for passing and for grading. This is fundamentally unfair, in that if the cohort for one year is high achieving, the norm will be raised to ensure that the numbers achieving any particular grade meet the desired target. The algorithm applied by Ofqual was essentially designed to ensure results complied with the norm, regardless of individual attainment. It has always been done this way; the difference this year was the blatant crudeness of the system.
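
As a toy illustration of the difference, using invented marks and thresholds: under criterion referencing everyone who clears a fixed mark passes, while under norm referencing the boundary moves so that only a fixed share of the cohort can pass, whatever their marks.

```python
# Toy contrast between criterion-referenced and norm-referenced passing.
marks = {"P1": 82, "P2": 74, "P3": 71, "P4": 65, "P5": 58, "P6": 40}

# Criterion-referenced: everyone reaching a fixed threshold passes.
PASS_MARK = 60
criterion_passes = {name for name, mark in marks.items() if mark >= PASS_MARK}

# Norm-referenced: only a fixed share of the cohort may pass, so the
# boundary moves with the cohort. Here only the top 50% pass.
PASS_SHARE = 0.5
ranked = sorted(marks, key=marks.get, reverse=True)
norm_passes = set(ranked[: int(len(ranked) * PASS_SHARE)])

print("criterion:", sorted(criterion_passes))   # P1-P4 pass
print("norm:     ", sorted(norm_passes))        # only P1-P3 pass
```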

So, there is a silver lining, despite the stress and distress caused to thousands of students. At last there is a focus on how the examination system works, or rather does not. And there is a focus on the class-based bias which has always been there. However, it would be a shame if the experience prevents people from looking at the potential of AI, not for rigging examination results, but for supporting the introduction of formative assessment for students to support their learning.

If you are interested in understanding more about how the AI based algorithm worked there is an excellent analysis by Tom Haines in his blog post ‘A levels: the Model is not the Student’.

 

What is Artificial Intelligence?

July 27th, 2020 by Graham Attwell

What is artificial intelligence? from Nesta UK on Vimeo.

This video is from Nesta in the UK. Nesta say:

“The choices we make now about how AI is steered and directed will shape the lives of future generations. Many institutions are focussed on developing the technology, but very few are attending to the human, social and public dimensions of AI in a serious way. This is where Nesta is focusing its work, bringing together research and practical programmes to explore the potential of AI as a force for social good – from research into global trends in AI, to experiments that combine human and machine intelligence.”

Digitalisation, Artificial Intelligence and Vocational Occupations and Skills

July 27th, 2020 by Graham Attwell

geralt (CC0), Pixabay

The Taccle AI project on Artificial Intelligence and Vocational Education and Training has published a preprint version of a paper which has been submitted for publication to the VET network of the European Educational Research Association.

The paper, entitled Digitalisation, Artificial Intelligence and Vocational Occupations and Skills: What are the needs for training Teachers and Trainers, seeks to explore the impact AI and automation have on vocational occupations and skills and to examine what that means for teachers and trainers in VET. It looks at how AI can be used to shape learning and teaching processes, for example through digital assistants which support teachers. It also focuses on the transformative power of AI, which promises profound changes in employment and work tasks. The paper is based on research being undertaken through the EU Erasmus+ Taccle AI project. It presents the results of an extensive literature review and of interviews with VET managers, teachers and AI experts in five countries. It asks whether machines will complement or replace humans in the workplace before going on to look at developments in using AI for teaching and learning in VET. Finally, it proposes extensions to the EU DigCompEdu Framework for training teachers and trainers in using technology.

The paper can be downloaded here.

Taccle AI project – Interviews

July 22nd, 2020 by Graham Attwell

As part of the Taccle AI project, which is examining the impact of AI on vocational education and training in Europe, we have undertaken interviews with managers, teachers, trainers and developers in five European countries (the report of the interviews, and of an accompanying literature review, will be published next week). One of the interviews I conducted was with Aftab Hussein, the ILT manager at Bolton College in the north west of England. Aftab describes himself on Twitter (@Aftab_Hussein) as “exploring the use of campus digital assistants and the computer assisted assessment of open-ended questions.”

Ada, Bolton College’s campus digital assistant, has been supporting student enquiries about college services and their studies since April 2017. In September 2020, the college is launching a new crowdsourcing project which seeks to teach Ada about subject topics. They are seeking the support of teachers to teach Ada about their subjects.

According to Aftab: “Teachers will be able to set up questions that students typically ask about subject topics and they will have the opportunity to compose answers against each of these questions. No coding experience is required to set up questions and answers. Students of all ages will have access to a website where they will be able to select a subject chatbot and ask it questions. Ada will respond with answers that incorporate the use of text, images, links to resources and embedded videos.

The service will be free to use by teachers and students.”
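
Ada's implementation is not described here, but the crowdsourcing model Aftab outlines, in which teachers author question-and-answer pairs and students query them in free text, can be illustrated with a minimal keyword-overlap matcher. The subject content and matching rule below are invented for illustration; a production digital assistant would use far more robust natural language understanding.

```python
import re
from collections import Counter

# Hypothetical teacher-authored question/answer pairs for one subject chatbot.
faq = [
    ("What is Ohm's law?",
     "Ohm's law states that current equals voltage divided by resistance (I = V/R)."),
    ("How do I calculate resistance in a series circuit?",
     "Add the individual resistances together: R = R1 + R2 + ..."),
]

def tokens(text: str) -> Counter:
    """Bag of lower-case word tokens."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def answer(student_question: str) -> str:
    """Return the teacher answer whose question shares most words with the query."""
    query = tokens(student_question)
    best = max(faq, key=lambda qa: sum((tokens(qa[0]) & query).values()))
    return best[1]

print(answer("how do i work out resistance in a series circuit"))
```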

If you are interested in supporting the project complete the online Google form.

AI and Young People

July 17th, 2020 by Graham Attwell

Last December, the Youth Department of the Council of Europe organised a seminar on Artificial Intelligence and its Impact on Young People. The aim of the seminar was to explore the issues, role and possible contributions of the youth sector in an effort to ensure that AI is responsibly used in democratic societies and that young people have a say about matters that concern their present and future. The seminar looked, among other things, into three dimensions of AI:

  • AI and democratic youth participation (including young people’s trust/interest in democracy)
  • AI and young people’s access to rights (including social rights)
  • AI and youth policy and youth work

According to the report of the seminar, the programme enabled the participants to put together their experience and knowledge in proposing answers to the following questions:

  • What are the impacts of AI on young people and how can young people benefit from it?
  • How can the youth sector make use of the capacities of AI to enhance the potential of youth work and youth policy provisions for the benefit of young people?
  • How to inform and “educate” young people about the potential benefits and risks of AI, notably in relation to young people’s human rights and democratic participation and the need to involve all young people in the process?
  • How does AI influence young people’s access to rights?
  • What should the youth sector of the Council of Europe, through the use of its various instruments and partners, do about AI in the future?

Not only is there a written report of the seminar but also an excellent illustrated report. Sadly it is not in a format that can be embedded, but it is well worth going to the Council of Europe’s web page on AI and scrolling to the bottom to see the report.

European Union, AI and data strategy

July 9th, 2020 by Graham Attwell

geralt (CC0), Pixabay

Miapetra Kumpula-Natri is the rapporteur for the industry committee for the European Parliament’s own-initiative report on data strategy and a standing rapporteur on the World Trade Organization e-commerce negotiations in the European Parliament’s international trade committee.

Writing in Social Europe she says:

Building a human-centric data economy and human-centric artificial intelligence starts from the user. First, we need trust. We need to demystify the data economy and AI: people tend to avoid, resist or even fear developments they do not fully understand.

Education plays a crucial role in shaping this understanding and in making digitalisation inclusive. Although better services—such as services used remotely—make life easier also outside cities, the benefits of digitalisation have so far mostly accrued to an educated fragment of citizens in urban metropoles and one of the biggest obstacles to the digital shift is lack of awareness of new possibilities and skills.

Kumpula-Natri draws attention to the Finnish-developed, free online course, ‘Elements of AI’. This started as a course for students at the University of Helsinki but has extended its reach to over 1 per cent of Finnish citizens.

Kumpula-Natri points out that in the Nordic countries, the majority of participants on the ‘Elements of AI’ course are female, and in the rest of the world the proportion exceeds 40 per cent—more than three times as high as the average ratio of women working in the technology sector. She says that after the course had been running in Finland for a while, the number of women applying to study computer science at the University of Helsinki increased by 80 per cent.

Learning about surveillance

July 3rd, 2020 by Graham Attwell

GDJ (CC0), Pixabay

I found this on the Social Media Collective website. The Social Media Collective is a network of social science and humanistic researchers, part of the Microsoft Research labs in New England and New York.

Yesterday the Wayne County Prosecutor publicly apologized to the first American known to be wrongfully arrested by a facial recognition algorithm: a black man arrested earlier this year by the Detroit Police. The statement cited the unreliability of software, especially as applied to people of color.

With this context in mind, some university and high school instructors teaching about technology may be interested in engaging with the Black Lives Matter protests by teaching about computing, race, and surveillance.

I’m delighted that thanks to the generosity of Tawana Petty and others, ESC can share a module on this topic developed for an online course. You are free to make use of it in your own teaching, or you might just find the materials interesting (or shocking).

The lesson consists of a case study of Detroit’s Project Green Light, a new city-wide police surveillance system that involves automated facial recognition, real-time police monitoring, very-high-resolution imagery, cameras indoors on private property, a paid priority response system, a public/private partnership, and other distinctive features. The system has allegedly been deployed to target peaceful Black Lives Matter protesters.

Here is the lesson:

Race, Policing, and Detroit’s Project Green Light

Artificial Intelligence for and in Vocational Education and Training

June 30th, 2020 by Graham Attwell

Last week the Taccle AI project organised a workshop at the European Distance and E-Learning Network (EDEN) Annual Conference. The conference, which had been scheduled to be held in Romania, was moved online due to the Covid-19 pandemic. There were four short presentations followed by an online discussion.

Graham Attwell introduced the workshop and explained the aims of the Taccle AI project. “In the next years,” he said, “AI will change learning, teaching, and education. The speed of technological change will be very fast, and it will create high pressure to transform educational practices, institutions, and policies.” He was followed by Vidmantas Tulys, who focused on AI and human work. He put forward five scenarios for the future of work in the light of AI and the fourth industrial revolution. Ludger Deitmer looked at the changes in the mechatronics occupation due to the impact of AI. He examined how training was being redesigned to meet new curriculum and occupational needs and how AI was being introduced into the curriculum. Finally, Sofia Roppertz focused on AI for VET, exploring how AI can support access to education, collaborative environments and intelligent tutoring systems to support teachers and trainers.

AI cloud computing to support formative assessment in vocational education and training

June 30th, 2020 by Graham Attwell

geralt (CC0), Pixabay

I have written before about the great work being done around AI by Bolton College in the UK, and particularly their Ada chatbot.

One of my main interests in the use of AI in vocational education and training is the potential for freeing up teachers to provide more personalised learning support, both for those students who are struggling and for the more advanced students. At the moment too many teachers are forced by workloads to teach to the middle.

My second big hope is around assessment. Vocational students need, I think, regular feedback, and that can come from formative assessment. However, at present teachers do not have the time to prepare and provide feedback on regular formative assessments. With AI this becomes possible.

Bolton College previously received VocTech seed funding to prove the concept of using Artificial Intelligence (AI) to analyse short and long form answers and to demonstrate that real-time feedback can be offered to vocational learners as they respond to online open-ended formative assessment tasks.

Their FirstPass tool provided an initial introduction to AI cloud computing technologies which are able to support vocational students and their teachers with open-ended formative assessment tasks.

Now, according to Ufi, who provide VocTech funding, a new project “will provide further development of FirstPass to ensure that it is effective and robust in use and can demonstrably improve the teaching, learning and assessment experience of vocational learners. It will provide teachers with a richer medium for assessing students due to its ability to pose open-ended questions that can be automatically analysed and assessed by a computer, giving students real-time feedback and the opportunity to qualify and clarify their responses.”
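
The post does not describe how FirstPass analyses answers, but the underlying idea of real-time formative feedback on open-ended responses can be sketched very simply: compare a student's free-text answer against the key points a teacher expects and return prompts for whatever is missing. The key points and matching rule below are invented for illustration; FirstPass itself will use more sophisticated text analysis.

```python
import re

# Illustrative key points a teacher might expect in an answer about preparing
# to repair an electrical appliance, with a formative hint for each.
KEY_POINTS = {
    "risk assessment": "Mention carrying out a risk assessment first.",
    "protective equipment": "Say which personal protective equipment is needed.",
    "isolate the supply": "Explain that the electrical supply must be isolated.",
}

def feedback(answer: str) -> list[str]:
    """Return formative prompts for any expected points missing from the answer."""
    text = re.sub(r"\s+", " ", answer.lower())
    return [hint for phrase, hint in KEY_POINTS.items() if phrase not in text]

student_answer = "I would isolate the supply and wear protective equipment before starting."
for hint in feedback(student_answer):
    print("Consider:", hint)
# -> Consider: Mention carrying out a risk assessment first.
```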
