Archive for the ‘AI’ Category

FutureLearn team up with Microsoft for online AI course

November 18th, 2020 by Graham Attwell

As many of you will know, FutureLearn is the UK Open University's MOOC arm, run in conjunction with an international consortium of universities. But, I guess like everyone else, FutureLearn is under pressure to make some money. Their first attempt was offering paid-for certificates for course completion. Another has been to persuade people to sign up for an annual subscription, which keeps courses open for a year.

The latest is to partner with industry on courses providing micro accreditation, in some cases industry recognised. So in December FutureLearn is launching “Artificial Intelligence on Microsoft Azure: Machine Learning and Python Basics”, created by CloudSwft in conjunction with Microsoft. “On this microcredential”, they say, “you’ll address this challenge by developing key AI skills that can serve as the first steps towards becoming an AI engineer, business analyst, or AI professional.” And, “Yes. If you successfully complete this microcredential, you’ll receive a voucher to sit a separate exam to earn the Microsoft Azure AI Fundamentals (AI-900) and Microsoft Azure AI Engineer Associate (AI-100) certification.”

Why would FutureLearn be giving away vouchers for sitting Microsoft exams? It could be because the 15-week course costs 584 Euros to enrol. Much as I like microcredentialing, this seems a long way from FutureLearn’s past MOOCs, which were free to participate in. And if, as the course information claims, “artificial intelligence skills are frequently listed among the most in-demand workplace skills in the current and future job market, as organisations seek to harness AI to revolutionise their operations” and “employers are faced with a shortfall of qualified candidates”, surely this is an area where public education and training services should be providing online courses, rather than restricting access to those who can afford to pay for learning new skills.

 


Workshop on AI and Vocational Education as part of European Vocational Skills Week

November 9th, 2020 by Graham Attwell

geralt (CC0), Pixabay

This week is European Vocational Skills Week.

And as a partner of the European Vocational Skills Week, the Taccle AI project is organising an online workshop on “Artificial Intelligence for and in VET” on Tuesday 10 November, 15:00 – 16:30 CET. Our Taccle AI project partners from five European countries will welcome you.

About the Workshop:

AI is particularly important for vocational education and training (VET) as it promises profound changes in employment and work tasks. Not only are some jobs vulnerable and new jobs likely to be created, but there will also be changing tasks and roles within jobs, requiring changes in initial and continuing training for those in work as well as those seeking employment. This will require changes in existing VET content, new programmes such as the design of AI systems in different sectors, and adaptation to new ways of cooperative work with AI.

For VET teachers and trainers there are many possible uses of AI, including new opportunities for adapting learning content to students’ needs, new processes for assessment, analysis of possible bottlenecks in learners’ domain understanding, and improved guidance for learners.

In our workshop we will explore these issues with short inputs and breakout sessions for discussion by participants around key issues.

Register now (here)!


Finishing the last reports for TACCLE4 CPD project – Handing over the torch for other runners

October 31st, 2020 by Pekka Kamarainen

During the last few months I have surprised myself by producing three new reports for the ongoing EU-funded TACCLE4 CPD project. As regular readers of this blog will know, the project has been working with strategies for promoting digital competences of teachers and trainers in different educational sectors. And, in contrast to the three earlier TACCLE projects, the fourth one had the task of shaping models and concepts for continuing professional development in educational establishments and training organisations. Furthermore, my role in the project has been to address this task for the field of vocational education and training (VET) and to bring into the project the legacy of the Learning Layers project (in which I had been working for many years).

Now, the project is in its final phase and at the same time my career as an active researcher has come to an end. During the last few weeks I have surprised myself and others by producing three new reports for the project – in addition to the five that I had produced by the end of last year. Below, I want to make some comments on these newer reports and on how they enrich the group picture of the earlier VET-related reports.

Report 5: Promoting digital competences beyond the accustomed realm of ICT skills

During the project I had been writing blogs on innovative activities to promote digital competences via civic learning and via the introduction of specific applications of the Learning Toolbox (the main product developed in the Learning Layers project). To me, the Finnish idea of developing a broad-based introductory course on artificial intelligence (AI) and its impact on the entire civil society was very inspiring. Equally, the recent progress in using the Learning Toolbox (LTB) in different contexts was impressive. In the initial pilot context, vocational trainers made new efforts to support vocational learning during the lockdown, and after the closure period they added new features to their use of LTB in apprentice training.

Parallel to this, other applications of LTB – mainly the use of LTB to shape ePosters for conferences (which are shifting from face-to-face into online events) – are becoming widespread. Here, it is worthwhile to note the spread from regular conferences into other kinds of online events – such as multiplier events of EU-funded projects. Furthermore, I was informed of the progress with LTB showcases. Firstly, they had been shaped for particular conferences to give a group picture of the ePosters for the respective conference. In a more mature phase, the LTB showcase was used to give a group picture of all conferences and online events that were working with ePosters – of which some exemplars were portrayed in this ‘all stars’ showcase. I was happy to give visibility to the start-up company Kubify (founded by the former LTB developers) due to their latest achievements. Altogether, this report demonstrated the progress of promoting digital competences beyond the accustomed realms of ICT skills.

Activity Report on the German Multiplier Event at the Training Centre Bau-ABC Rostrup

During my working visit to Bremen (after a long break) I had a chance to organise a Multiplier Event that took place as a face-to-face event in the training centre Bau-ABC Rostrup (with whom we had worked intensively in the Learning Layers project). For me it was a pleasant opportunity to meet several trainers of the centre – some of whom had been our counterparts for years, whilst others were newcomers. In my input I gave an overview of how the co-design processes (that led to the shaping of the LTB) and the parallel training activities (that were completed with training based on Theme Rooms) worked hand in hand in the Learning Layers project. Then I gave some insights into the idea of different innovation paths (for introducing digital competences in the field of VET) and how they can be addressed in a revisited framework for Theme Room Training 2020. In addition, I presented the new TACCLE4 CPD Showcase that pulled together the VET-related work in the Learning Layers project and in the current project.

It was interesting to note that the discussion moved on from my inputs to the practical challenges (regarding the use of LTB as support for training) and to the possibilities to take further steps with their internal training. So, in this respect the session became a genuine multiplier event. Indeed, it was concluded with a commitment to start a new round of theme room sessions – based on the ideas and needs of the active trainers and with focus on improving the use of LTB in their context.

Report 2b: The TACCLE4 CPD Showcase as a new collection of online resources

My final report for the TACCLE4 CPD project presented the above-mentioned LTB showcase for this project and how it came into being. As pre-history, I presented the earlier pilot activity to introduce the work with ePosters into the European Conference on Educational Research (ECER) in 2018 in Bolzano. This was a limited pilot project with some ePosters for the network programme of VETNET (the European Research Network for Vocational Education and Training). As a further step from this phase, the report introduced an overarching showcase for the TACCLE4 CPD project. On the one hand, this showcase provided comprehensive LTB stacks that presented all VET-related reports for the Learning Layers project and for the TACCLE4 CPD project. Then it introduced thematic stacks that focused on the use of Open Educational Resources (OER) in the field of VET and outlined the Theme Room Training 2020 framework.

Altogether, this report gave a picture of a new kind of online resource environment. At the same time it invited users to consider their own innovation paths and to think of their own ways to shape training with Theme Rooms.

I guess this is enough on the latest reports and on the messages I want to pass on at the final phase of the project. Now that I am finishing my career as an active researcher, I have the feeling of being a runner with the torch of the Olympic Games. I am coming to the point at which I have to hand over the torch to new runners – those in research, those in software development and those developing their training approaches. My message to them is the following: You need to take the fire from the past, not the ashes. And: You need not go back down the long and winding roads of your predecessors – you need to find each other in the present-day circumstances.

More blogs to come … (but from a different perspective)


Data Driven Science

September 29th, 2020 by Graham Attwell

This diagram is from a tweet by Data Driven Science (@DrivenScience).

Artificial Intelligence, they say, is the broad discipline of creating intelligent machines.

Machine Learning refers to systems that can learn from experience.

Deep Learning refers to systems that learn from experience on large data sets.
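The middle layer of that nesting can be made concrete with a toy example. Below is a minimal, hand-rolled sketch of “learning from experience”: instead of hard-coding a rule, the program estimates a linear relationship from example data using ordinary least squares. All numbers and names are illustrative, not taken from the tweet.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b to the sample points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x and y, and variance of x, give the slope.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# "Experience": noisy observations of an underlying y = 2x + 1 rule.
xs = [1, 2, 3, 4, 5]
ys = [3.1, 4.9, 7.2, 9.0, 10.8]

a, b = fit_line(xs, ys)
print(f"learned rule: y ≈ {a:.2f}x + {b:.2f}")  # close to y = 2x + 1
```

A deep learning system differs in degree rather than kind: many more parameters than the two here, learned from far larger data sets.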


The State of Data 2020

September 28th, 2020 by Graham Attwell

geralt (CC0), Pixabay

One result of the Covid-19 pandemic is that there now seem to be free events every day. This week is no exception, and this conference looks great. I can’t make all of it – too many other meetings – but I hope to dip in and out (another advantage of online conferences).

On Tuesday September 29 and Wednesday September 30, 2020 the State of Data event will bring together researchers, practitioners, and anyone with an interest in why data matters in state education in England.

You can choose to register if you want to use the calendar functionality and accept the privacy terms of Hopin, to see the events as they go live. Or simply watch in your own time, without registering, after the event via the links below.

Between algorithmic fairness in exam moderation and the rush to remote learning in response to the COVID-19 pandemic, 2020 has raised questions about children’s digital rights like never before in England’s education system. For defenddigitalme, the organisers, the event is a call to action.

The conference has a vision of a rights-respecting environment in the state education sector in England, helping to build a future of safe, fair and transparent use of data across the public sector. The event will coincide with the launch of defenddigitalme’s report The State of Data 2020: mapping the data landscape in England’s state education system.

There is a range of content and discussion for practitioners in education and data protection, senior leadership and DPOs, local authority staff, developers, vendors and the edTech community, academics and activists, policy advisors and politicians —they say they want to create opportunities for questions and answers across silos. As the conference web site says: “We need to start a conversation about changing policy and practice when it comes to children’s data rights in education.”


Racial bias in algorithms

September 25th, 2020 by Graham Attwell

From the UK Open Data Institute’s Week in Data newsletter

This week, Twitter apologised for racial bias within its image-cropping algorithm. The feature is designed to automatically crop images to highlight focal points – including faces. But, Twitter users discovered that, in practice, white faces were focused on, and black faces were cropped out. And, Twitter isn’t the only platform struggling with its algorithm – YouTube has also announced plans to bring back higher levels of human moderation for removing content, after its AI-centred approach resulted in over-censorship, with videos being removed at far higher rates than with human moderators.


Accountability and algorithmic systems

September 3rd, 2020 by Graham Attwell

geralt (CC0), Pixabay

There seems to be a growing awareness of the uses of, and problems with, algorithms – at least in the UK, where what Boris Johnson called “a rogue algorithm” caused chaos in students’ exam results. It is becoming very apparent that there needs to be far more transparency about what algorithms are being designed to do.

Writing in Social Europe, Christina Colclough says: “Algorithmic systems are a new front line for unions as well as a challenge to workers’ rights to autonomy.” She draws attention to the increasing surveillance and monitoring of workers at home and in the workplace. She says strong trade union responses are immediately required to balance out the power asymmetry between bosses and workers and to safeguard workers’ privacy and human rights. She also says that improvements to collective agreements as well as to regulatory environments are urgently needed.

Perhaps her most important argument is about the use of algorithms:

Shop stewards must be party to the ex-ante and, importantly, the ex-post evaluations of an algorithmic system. Is it fulfilling its purpose? Is it biased? If so, how can the parties mitigate this bias? What are the negotiated trade-offs? Is the system in compliance with laws and regulations? Both the predicted and realised outcomes must be logged for future reference. This model will serve to hold management accountable for the use of algorithmic systems and the steps they will take to reduce or, better, eradicate bias and discrimination.

Christina Colclough believes the governance of algorithmic systems will require new structures, union capacity-building and management transparency. I can’t disagree with that. But what is also needed is a greater understanding of the use of AI and algorithms – for good and for bad. This means an education campaign – in trade unions, but also for the wider public – to ensure that developments are for the good and not just another step in the progress of Surveillance Capitalism.


Algorithmic bias explained

August 27th, 2020 by Graham Attwell

Yesterday, the UK Prime Minister blamed last week’s fiasco with public examinations on a “mutant algorithm”. This video by the Institute for Public Policy Research provides a more rational view of why algorithms can go wrong. Algorithms, they say, risk magnifying human bias and error on an unprecedented scale. Rachel Statham explains how they work and why we have to ensure they don’t perpetuate historic forms of discrimination.


Understanding the changing Covid-19 labour market

August 26th, 2020 by Graham Attwell

geralt (CC0), Pixabay

Yesterday I attended a webinar organized by the UK Association of Colleges in their Labour Market Observatory Series. The subject of the webinar was Using Job Posting Analytics to understand the changing Covid-19 labour market.

Understanding labour markets is a hard job at the best of times, and the Covid-19 pandemic and the resulting lockdown have disrupted the economy with unprecedented speed and scale. As Duncan Brown, Senior Economist from Emsi, explained, traditional labour market statistics take time to emerge, especially for understanding what’s going on at regional and local level, and real-time indicators become all-important. Duncan Brown talked through what their Job Posting Analytics – derived from collecting (or scraping) around 200,000 new, unique job postings from job boards across the internet every week – can tell us about where and how the labour market is changing and what to look for as we move into the recovery.

First, though, he explained how the data is collected using bots before being cleaned and de-duplicated, prior to using algorithms to analyse it. He pointed out that there are limitations to data derived from job adverts, but compared to the time taken for official labour market data to emerge – for instance through the UK Office for National Statistics Labour Force Survey (LFS) – job posting analytics can provide an almost real-time snapshot of the labour market, and one easily projected at a local level.
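To make the cleaning and de-duplication step a little more concrete, here is a hedged sketch in Python of how duplicate job adverts might be dropped after scraping. The field names and normalisation rules are my own assumptions for illustration, not Emsi's actual method.

```python
import re

def normalise(posting):
    """Reduce a posting to a comparable key: lowercase, strip
    punctuation and collapse whitespace in title and employer."""
    def clean(text):
        no_punct = re.sub(r"[^\w\s]", "", text.lower())
        return re.sub(r"\s+", " ", no_punct).strip()
    return (clean(posting["title"]),
            clean(posting["employer"]),
            posting["location"].lower())

def deduplicate(postings):
    """Keep the first posting seen for each normalised key."""
    seen = set()
    unique = []
    for p in postings:
        key = normalise(p)
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique

postings = [
    {"title": "Data Analyst", "employer": "Acme Ltd.", "location": "Leeds"},
    {"title": "data analyst", "employer": "ACME Ltd", "location": "leeds"},  # same ad, reposted
    {"title": "Care Worker", "employer": "Sunrise Care", "location": "Leeds"},
]
print(len(deduplicate(postings)))  # prints 2: the repost is dropped
```

In practice a real pipeline would match far fuzzier variations (reworded titles, agency reposts), but the principle of normalising to a comparison key is the same.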

My notes on the webinar are somewhat patchy but here are a few take home points, particularly from a question and answer session that followed Duncan Brown’s presentation.

There was a huge fall in online job adverts in April and May with the lockdown – as high as 80 per cent in some sectors and localities. Since then there has been a steady recovery in the number of jobs being advertised online but this recovery is uneven between different sectors and different cities and regions.

As examples, offers of employment in the food and hospitality industries remain dire, and aerospace is also still badly hit. On the other hand, job advert volumes in manufacturing have substantially recovered and, perhaps understandably, there is an increase in job adverts in health care.

There are considerable differences in how far the volume of job adverts has recovered (or otherwise) in different cities. In general, it would appear that those cities with the largest percentage of office work and of commuters are doing worst: London in particular.

One area of the labour market that Emsi is focusing on is skills demand. They have developed their own skills directory, which, Duncan Brown said, now contains over 3,000 skills, and are running a project funded by Nesta to see if these skills can be clustered around different occupations. Yet despite the so-called pivot to skills, he said there were few signs that employers were moving away from the traditional emphasis on qualifications. However, qualification demands often did not appear in job adverts but rather tended to be assumed by both employers and job applicants. For instance, someone applying for a job as an accountant would presume that they needed formal qualifications.

Although there have long been predictions about the impact of automation and AI on employment, Duncan Brown said there was little evidence of this. His feeling is that, at least in the UK, the existence of relatively cheap labour in many sectors where it would be relatively easy to automate tasks was a disincentive to the necessary investment. He thought that labour costs may have been kept down by immigration. He pointed to car washes as an example of an area where, far from advancing, automation had actually gone backwards.

The slides from the presentation and a recording of the webinar will be available from 27 August on the Association of Colleges website.

 


AI and Algorithms: the UK examination debacle

August 20th, 2020 by Graham Attwell

This article was originally published on the Taccle AI web site.

There’s a lot to think about in the ongoing debacle over exam results in the UK. A quick update for those who have not been following the story. Examinations for young people, including the O level and A level academic exams and the more vocationally oriented BTEC, were cancelled this year due to the Covid-19 pandemic. Instead, teachers were asked firstly to provide an estimated grade for each student in each subject and secondly to rank order the students in their school.

These results were sent to a central government agency, the Office of Qualifications, known as Ofqual. But instead of awarding qualifications based on the teachers’ predicted grades, Ofqual decided – seemingly in consultation with, or more probably under pressure from, the government – to use an algorithm to calculate grades. This was based mainly on the previous results achieved by the school in each subject, with adjustments made for small class cohorts and according to the rankings.

The results from the A levels were released last week. They showed massive irregularities at an individual level, with some students seemingly downgraded from a predicted A* (the highest grade) to a C or D. Analysis also showed that students from expensive private schools tended to do better than expected, whilst students from public sector schools in working-class areas did proportionately worse than predicted. In other words, the algorithm was biased.

As soon as the A level results were announced there were protests from teachers, schools and students. Yet the government stuck to its position, saying there would be no changes. The Prime Minister, Boris Johnson, said: “Let’s be in no doubt about it, the exam results we’ve got today are robust, they’re good, they’re dependable for employers”. However, concern quickly grew about the potential for numerous appeals, and indeed about the time it would take teachers to prepare such appeals. Meanwhile the Scottish government (which is autonomous in education policy) announced that it would revert to the teachers’ predicted grades. In England, while the government stood firm, demonstrations by school students broke out in most cities. By the weekend it was clear that something had to change, and on Monday the UK government, responsible for exams in England and Wales, announced that it too would respect teacher-predicted grades.

The political fallout goes on. The government is trying to shift the blame to Ofqual, despite clear evidence that it knew what was happening. Meanwhile some of the universities that rely on the grades to decide who to offer places to are massively oversubscribed as a result of the upgrades.

So, what does this mean for the use of AI in education? One answer may be that there needs to be careful thinking about how data is collected and used. As one newspaper columnist put it at the weekend: “Shit in, shit out”. Essentially, the data used was the exam results of students at a collective school level in previous years. This has little or no relevance to how an individual student might perform this year. In fact, the algorithm was designed not to award an appropriate grade reflecting a student’s learning and work, but to prevent what is known as grade inflation – increasing numbers of students getting higher grades each year – which the government sees as a major problem.

But this in turn has sparked off a major debate, with suspicions that the government does in fact support a bias in results, aiming to empower the elite to attend university with the rest heading for a second-class vocational education and training provision. It has also been pointed out that the Prime Minister’s senior adviser, Dominic Cummings, has in the past written articles appearing to suggest that upper-class students are inherently more intelligent than those from the working class.

The algorithm, although blunt in its impact, merely replicated processes that have been followed for many years (and certainly preceded big data). Many years ago, I worked as a project officer for the Welsh Joint Education Committee (WJEC), the examination board for Wales. At that time there were quite a number of recognised examination boards, although since then the number has been reduced by mergers. I was good friends with a senior manager in the exam board, and he told me that every year, about a week before the results were announced, each exam board shared its results, including the number of students to be awarded each grade. The results were then adjusted to fit the figures that the boards had agreed to award in that year.

And this gets to the heart of the problems with the UK assessment system. Of course, one issue is the ridiculous importance placed on formal examinations. But it also reflects the approach to assessment. Basically, there are three assessment systems. Criterion-based assessment means that any students achieving a set criterion are awarded accordingly. Ipsative assessment assesses achievement against the individual’s own previous performance. But in the case of UK national exams the system followed is norm-referenced, which means that a norm is set for passing and for grading. This is fundamentally unfair, in that if the cohort for one year is high achieving, the norm will be raised to ensure that the numbers achieving any particular grade meet the desired target. The algorithm applied by Ofqual was essentially designed to ensure results complied with the norm, regardless of individual attainment. It has always been done this way; the difference this year was the blatant crudeness of the system.
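Norm referencing of this kind can be sketched in a few lines: rank the students, then hand out grades to match a fixed historical distribution, regardless of what anyone actually achieved. This is a deliberately simplified toy to show the principle, not Ofqual's actual model; the quotas and names are invented.

```python
def norm_reference(ranked_students, grade_quotas):
    """Assign grades down the ranking until each grade's quota is filled.

    ranked_students: names in rank order, best first.
    grade_quotas: (grade, proportion) pairs, best grade first,
                  with proportions summing to 1.
    """
    n = len(ranked_students)
    results = {}
    i = 0
    allocated = 0.0
    for grade, proportion in grade_quotas:
        allocated += proportion * n
        # Fill this grade's share of the cohort from the ranking.
        while i < round(allocated):
            results[ranked_students[i]] = grade
            i += 1
    return results

students = ["Ana", "Ben", "Cai", "Dee", "Eli", "Fay", "Gus", "Hal", "Ivy", "Jo"]
quotas = [("A", 0.2), ("B", 0.3), ("C", 0.3), ("D", 0.2)]
grades = norm_reference(students, quotas)
print(grades)
```

With ten students and these quotas, exactly two students get an A and two get a D, however well the cohort as a whole performed: the unfairness described above is built into the allocation, not an accident of any one year's data.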

So, there is a silver lining, despite the stress and distress caused to thousands of students. At last there is a focus on how the examination system works, or rather does not. And there is a focus on the class-based bias that has always been there. However, it would be a shame if the experience prevents people from looking at the potential of AI, not for rigging examination results, but for supporting the introduction of formative assessment for students to support their learning.

If you are interested in understanding more about how the AI-based algorithm worked, there is an excellent analysis by Tom Haines in his blog post ‘A levels: the Model is not the Student’.

 
