Archive for the ‘Learning Analytics’ Category

Readings on AI and Education

November 18th, 2019 by Graham Attwell

In an early activity in our new project on Artificial Intelligence in Vocational Education and Training, we are undertaking a literature review. Although there seems to be little written about AI and VET specifically, AI in education is this year's hot trend. Of course, there seems to be more talk than actual practice. Anyway, here is a quick summary (just notes really) of things I stumbled on last week.

Perhaps most interesting was an online webinar organised by the European Distance Education Network (EDEN) as part of European Distance Learning Week. According to the online platform there were 49 of us present and four presentations. Sadly the recording is not yet available but I will link to it once it is online. What was most interesting was that almost everyone who spoke, and I recognised quite a few prominent researchers in the contributions, was pretty much opposed to AI: too dangerous, no benefit, just hype, developers with no idea about learning and so on. Really only one speaker, Alexandra Cristea from Durham University, could see potential.

I found the following publication by her. Demographic Indicators Influencing Learning Activities in MOOCs: Learning Analytics of FutureLearn Courses (PDF) by Alexandra I. Cristea and Lei Shi from Liverpool University looks at pre-course survey data and online learner interaction data collected from two MOOCs, delivered by the University of Warwick in 2015, 2016 and 2017. The data is used to explore how learner demographic indicators may influence learner activities. Recommendations for educational information system development and instructional design, especially when a course attracts a diverse group of learners, are provided.

Meanwhile in the UK, NESTA are continuing to promote AI. However, they too emphasise ethical issues with the use of the technology. In ‘Educ-AI-tion rebooted? Exploring the future of artificial intelligence in schools and colleges’ they say:

Although challenges for the ethical and responsible use of artificial intelligence and the sharing of data are common to many sectors, schools and colleges present a distinct combination of properties and considerations. The sharing of data needs to be governed in a manner that realises benefit for the public, and AIEd must be used ethically and responsibly.

AIEd’s potential and risks is reflected in the views of parents. 61% of parents anticipate that AI will be fairly or very important to the classroom of the near future. However, many are fairly or very concerned about consequences of determinism (77%), accountability (77%) and privacy and security (73%).

Finally, I had a look at the X5GON project website. X5GON is a large scale European research project, bringing together a number of leading European universities. It appears to be developing AI-driven tools, particularly focused on Open Educational Resources. The project website says:

This new AI-driven platform will deliver OER content from everywhere, for the students’ need at the right time and place. This learning and development solution will use the following solutions to accomplish this goal:

  • Aggregation: It will gather relevant content in one place, from the project's case studies as well as external providers and other preferred resources.
  • Curation: AI and machine learning will be key to curate relevant and contextual content and external students at the right time and point of need.
  • Personalization: It will make increasingly personalized recommendations for learning content to suit students’ needs, based on the analysis of relevant factors.
  • Creation: Large, small and medium-sized universities have tacit knowledge that can be unlocked and re-used. This approach will allow any organization to release and build their own content libraries quickly and conveniently to share with the world and vice versa.

I’ll keep writing up my findings, in the form of notes on this site. And if anyone has any recommendations of what else I should be reading please add in the comments below.

Is this the right way to use machine learning in education?

September 2nd, 2019 by Graham Attwell

An article, ‘Predicting Employment through Machine Learning‘, by Linsey S. Hugo on the National Association of Colleges and Employers website confirms some of my worries about the use of machine learning in education.

The article presents a scenario which, it is said, “illustrates the role that machine learning, a form of predictive analytics, can play in supporting student career outcomes.” It is based on a recent study at Ohio University (OHIO) which leveraged machine learning to forecast successful job offers before graduation with 87 percent accuracy. “The study used data from first-destination surveys and registrar reports for undergraduate business school graduates from the 2016-2017 and 2017-2018 academic years. The study included data from 846 students for which outcomes were known; these data were then used in predicting outcomes for 212 students.”

A key step in the project was “identifying employability signals”, based on the idea that “it is well-recognized that employers desire particular skills from undergraduate students, such as a strong work ethic, critical thinking, adept communication, and teamwork.” These signals were adapted as proxies for the “well recognised” skills.

The data were used to develop numerous machine learning models, from commonly recognized methodologies, such as logistic regression, to advanced, non-linear models, such as a support-vector machine. Following the development of the models, new student data points were added to determine if the model could predict those students’ employment status at graduation. It correctly predicted that 107 students would be employed at graduation and 78 students would not be employed at graduation—185 correct predictions out of 212 student records, an 87 percent accuracy rate.

Additionally, this research assessed sensitivity, identifying which input variables were most predictive. In this study, internships were the most predictive variable, followed by specific majors and then co-curricular activities.
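The headline arithmetic is easy to check from the counts the article gives; the short sketch below uses only those reported numbers:

```python
# Reproduce the reported accuracy from the counts in the article.
correct_employed = 107      # predicted employed, and were employed at graduation
correct_not_employed = 78   # predicted not employed, and were not
total_students = 212        # size of the hold-out set used for prediction

correct = correct_employed + correct_not_employed   # 185 correct predictions
accuracy = correct / total_students

print(f"{correct} correct out of {total_students}")
print(f"accuracy = {accuracy:.1%}")   # ~87%, matching the reported figure
```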

As in many learning analytics applications, the data could then be used as a basis for interventions to support students' employability on graduation. If they have not already undertaken a summer internship then they could be supported in doing so, and so on.

Now on the one hand this is an impressive use of learning analytics to support overworked careers advisers and to improve the chances of graduates finding a job. The detailed testing of different machine learning and AI approaches is also both exemplary and unusually well documented.

However, I still find myself uneasy with the project. Firstly, it reduces the purpose of degree level education to employment. Secondly, it accepts that employers call the shots, through proxies based on unquestioned and unchallenged “well recognised” skills demanded by employers. It may be “well recognised” that employers are biased against certain social groups or have a preference for upper class students. Should that be incorporated in the algorithm? Thirdly, it places responsibility for employability on individual students, rather than looking more closely at societal factors in employment. Participation in unpaid internships is an increasing factor in employment in the UK: fairly obviously, the financial ability to undertake such unpaid work is the preserve of the more wealthy.

And suppose all students were assisted in achieving the predictive input variables. Does that mean they would all achieve employment on graduation? Graduate unemployment is not only predicated on individual student achievement (whatever variables are taken into account) but also on the availability of graduate jobs. In the UK many graduates are employed in what are classified as non-graduate jobs (the classification system is something I will return to in another blog). But is that because they fail to develop their employability signals, or simply because there are not enough jobs?

Having said all this, I remain optimistic about the role of learning analytics and AI in education and in careers guidance. But there are many issues to be discussed and pitfalls to overcome.

 

Travel to university time a factor in student performance

August 14th, 2019 by Graham Attwell

My summer morning’s work is settling into a routine. First I spend about half an hour learning Spanish on DuoLingo. Then I read the morning newsletters – OLDaily, WONKHE, The Canary and Times Higher Education (THE).

THE is probably the most boring of them. But this morning they led on an interesting and important research report. In an article entitled ‘Long commutes make students more likely to drop out’, Anna McKie says:

Students who have long commutes to their university may be more likely to drop out of their degrees, a study has found.

Researchers who examined undergraduate travel time and progression rates at six London universities found that duration of commute was a significant predictor of continuation at three institutions, even after other factors such as subject choice and entry qualifications were taken into account.

THE reports that the research, commissioned by London Higher, which represents universities in the city, found that “at the six institutions in the study, many students had travel times of between 10 and 20 minutes, while many others traveled for between 40 and 90 minutes. Median travel times varied between 40 and 60 minutes.”

At one university, every additional 10 minutes of commuting reduced the likelihood of progression beyond end-of-first-year assessments by 1.5 per cent. At another, the prospect of continuation declined by 0.63 per cent with each additional 10 minutes of travel.

At yet another institution, a one-minute increase in commute was associated with a 0.6 per cent reduction in the chances of a student’s continuing, although at this university it was only journeys of more than 55 minutes that were particularly problematic for younger students, and this might reflect the area these students were traveling from.
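Taken at face value, the effect sizes reported above imply a simple linear relationship between extra commute time and the drop in progression likelihood. A minimal sketch, purely to make the quoted figures concrete (the function and sample values are illustrative, not from the study):

```python
def progression_penalty(extra_commute_min: float, rate_per_10min: float) -> float:
    """Percentage-point reduction in progression likelihood for a given amount
    of extra commuting, assuming the linear relationship the study reports."""
    return extra_commute_min / 10 * rate_per_10min

# The two universities quoted: 1.5 pp and 0.63 pp per additional 10 minutes.
print(progression_penalty(30, 1.5))    # 30 extra minutes at the first rate: 4.5 pp
print(progression_penalty(30, 0.63))   # same commute at the second rate: ~1.89 pp
```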

I think there are a number of implications from this study. It is highly probable that the students travelling the longest distances are either living with their parents or cannot afford the increasingly expensive accommodation in central London. Thus this is effectively a barrier to less well off students. But it is also worth noting that much work in Learning Analytics has been focused on predicting which students are likely to drop out. Most reports suggest that failing to complete, or to succeed in, initial assignments is the most reliable predictor. Yet it may be that Learning Analytics needs to take a wider look at the social, cultural, environmental and financial context of student study, with a view to providing more practical support for students.

I work on the LMI for All project, which provides an API and open data for Labour Market Information, mainly for use in careers counselling, advice and guidance and to help young people choose their future careers or education. We already provide data on travel to work distances, based on the 2011 UK census. But I am wondering if we should also provide data on housing costs, possibly on a zonal basis around universities (although I am not sure if there is reliable data). If the distance (and time) travelling to college is so important in student attainment, this may be a factor students need to include in their choice of institution and course.

 

Learning Analytics and AI for Future-Focused Learning

August 7th, 2019 by Graham Attwell

I’ve tended to be skeptical about Learning Analytics, seeing it as of limited relevance to pedagogy and more concerned with managing learners (reducing dropouts) than having anything to say about learning. Even more, Learning Analytics research has tended to focus on higher education and formal learning, having little to say about workplace learning and vocational education and training. But things are changing, especially through the integration of AI with Learning Analytics. I’m especially interested in this since we have had a project approved under the Erasmus Plus programme on AI and vocational education and training teachers and trainers.

This presentation by Simon Buckingham Shum at the EduTECH conference in Australia in June of this year introduces some of the work at the UTS Connected Intelligence Centre, where, he says “the team has been refining (for the last 3 years) automated, personalised feedback to students on higher order transferable competencies (Graduate Attributes in university-speak, or General Capabilities in the schools sector) – namely, high performance face-to-face teamwork (exemplar: nursing simulation exercises), and critical, reflective thinking (as revealed in students’ writing).”

Simon says “Learning Analytics bring the power of data science and human-centred design to educational data, while AI makes new forms of timely assessment and feedback possible. Tech researchers, developers, educators and learners can co-design formative feedback on 21st century competencies such as critical reflective writing, teamwork, self-regulated learning, and dispositions for lifelong learning. Such tools are being coherently integrated into teaching practice and aligned with curriculum outcomes at UTS, and could be in schools.

Getting the technology’s capabilities and the user experience right is impossible without meaningful engagement with educators and students. So, this talk is organised around our emerging understanding of how to align the different elements of the whole sociotechnical infrastructure. To use the language of the framework – the ‘cogs’ can be tuned to different contexts, and must synchronise and drive in the same direction to create a coherent learning experience.”

A number of things strike me about the presentation (and the videos within the presentation).

The first is the integration of the LA framework with more traditional educational frameworks, including competences and assessment rubrics. These provide a much broader reference point for proxies for achievement and reflection than the prevalent proxy used in LA (and indeed in many areas of education): achievement in examinations and other assessments. The design process is intended to develop a data map to these proxies.

Secondly, rather than seeking to provide feedback to students on attainment (and likely attainment, or otherwise) or to serve as the basis for intervention by teachers, the focus is on reflection. The feedback is seen as “a provocation to deeper discussion” and as “scaffolding reflection”.

Finally – and as part of the reflection process – the LA is designed to provide agency for the student, who, says Simon, “can push back against the machine” if they think it is wrong.

All in all, there is much content for reflection here. The slides, which contain a number of references, can be downloaded here (PDF).

Learning Analytics Cymru

June 25th, 2019 by Graham Attwell

Jisc report that “Learning Analytics Cymru is generating interest across the world.” The service, which has every higher education institution in Wales signed up and is supported by the Welsh Government, is the focus of a new article for US edtech organisation Educause.

In the piece, Jisc consultant Niall Sclater discusses the Learning Analytics Cymru model and how it provides a blueprint for delivering such services on a national scale.

By pooling resources, institutions are benefiting from opportunities to share experiences and learn collaboratively in the emerging field of learning analytics.

Foresight and the use of ICT for Learning

January 3rd, 2019 by Graham Attwell

Time to return to the Wales Wide Web after something of a hiatus in November and December. And I am looking forward to writing regular posts here again.

New year is a traditional time for reviewing the past year and predicting the future. I have never really indulged in this game, but I have spent the last two days undertaking a “landscape study” as part of an evaluation contract I am working on. One section of it is around emerging technologies and foresight, so here is that section. I lay no claim to scientific methodology or indeed to comprehensiveness – this is just my take on what is going on, or not, and what might go on. In truth, I think the main conclusion is that very little is changing in the use of ICT for learning (perhaps more on that tomorrow).

There are at any time a plethora of innovations and emerging developments in technology with the potential to impact on education, both in terms of curriculum and skills demands and in their potential for teaching and learning. At the same time, educational technology has a tendency towards a ‘hype’ cycle, with prominence for particular technologies and approaches rising and fading. Some technologies, such as virtual worlds, fade and disappear; others retreat from prominence only to re-emerge in the future. For that reason, foresight must be considered not just in terms of emerging technologies but also in terms of likely future uses in education of technologies, some of which have been around for some time.

Emerging innovations on the horizon at present include the use of Big Data for Learning Analytics in education and the use of AI for Personalised Learning (see below); and MOOCs continue to proliferate.

VLEs and PLEs

There is renewed interest in a move from VLEs to Personal Learning Environments (PLEs), although this seems to be reflected more in functionality for personalising VLEs than in the emergence of new PLE applications. In part, this may be because self-directed learning demands more skills and competence from learners than the managed learning environment provided by VLEs. Personal Learning Networks have tended to be reliant on social networking applications such as Facebook and Twitter. These have been adversely affected by concerns over privacy and fake news, as well as realisation of the echo chamber effect such applications engender. At the same time, there appears to be a rapid increase in the use of WhatsApp to build personal networks for exchanging information and knowledge. Indeed, one area of interest in foresight studies is the appropriation of commercial and consumer technologies for educational purposes.

Multimedia

Although hardly an emerging technology, the use of multimedia in education is likely to continue to increase, especially given the ease of making video. Podcasting is also growing rapidly and is likely to have an increasing impact in the education sector. Yet another relatively mature technology is the provision of digital e-books, which, despite declining commercial sales, offer potential savings to educational authorities and can provide enhanced access for those with disabilities.

The use of data for policy and planning

The growing power of ICT-based data applications, and especially big data and AI, is of increasing importance in education.

One use is in education policy and planning: providing near real-time intelligence in a wide number of areas, including future numbers of school age children, school attendance, attainment, financial and resource provision, and demand and provision in different subjects for TVET and Higher Education, as well as providing insights into outcomes through, for instance, post-school trajectories and employment. A more controversial issue is the use of educational data for comparing school performance, and by parents in choosing schools for their children.

Learning Analytics

A further rapid growth area is Learning Analytics (LA). LA has been defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” [Reference] It is seen as assisting in informing decisions in education systems, promoting personalized learning and enabling adaptive pedagogies and practices. At least in the initial stages of development and use, universities and schools have tended to harvest existing data drawn from Virtual Learning Environments (VLEs) and to analyse that data both to predict individual performance and to undertake interventions which can, for instance, reduce drop-out rates. Other potential benefits are that LA can allow teachers and trainers to assess the usefulness of learning materials, to increase their understanding of the learning environment in order to improve it, and to intervene to advise and assist learners. Perhaps more importantly, it can assist learners in monitoring and understanding their own activities, interactions and participation in individual and collaborative learning processes, and help them to reflect on their learning.

Pardo and Siemens (YEAR?) point out that “LA is a moral practice and needs to focus on understanding instead of measuring.” In this understanding:

“learners are central agents and collaborators, learner identity and performance are dynamic variables, learning success and performance is complex and multidimensional, data collection and processing needs to be done with total transparency.”

Although initially LA has tended to be based on large data sets already available in universities, school-based LA applications are being developed using teacher-inputted data. This can give teachers an understanding of the progress of individual pupils and possible reasons for barriers to learning.
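As a rough illustration of the kind of early-warning rule such applications often begin with, here is a minimal sketch. The record fields and thresholds are invented for illustration; a real service would calibrate them against historical progression data rather than hard-code them:

```python
from dataclasses import dataclass

@dataclass
class VleRecord:
    """One learner's recent VLE activity (field names are illustrative)."""
    logins_last_30_days: int
    assignments_submitted: int
    assignments_due: int

def at_risk(record: VleRecord, min_logins: int = 4) -> bool:
    """Flag a learner for a tutor conversation, not an automatic intervention.
    The login threshold is an assumption, not an established cut-off."""
    missed_work = record.assignments_submitted < record.assignments_due
    disengaged = record.logins_last_30_days < min_logins
    return missed_work or disengaged

# A learner behind on assignments is flagged; an engaged, up-to-date one is not.
print(at_risk(VleRecord(logins_last_30_days=2, assignments_submitted=1, assignments_due=3)))   # True
print(at_risk(VleRecord(logins_last_30_days=10, assignments_submitted=3, assignments_due=3)))  # False
```

Even a toy rule like this makes the proxy problem visible: logins and submissions stand in for engagement, which stands in for learning.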

Gamification

Educational games have been around for some time. The gamification of educational materials and programmes is still in its infancy and likely to continue to advance. Another educational technology due for a revival is the development and use of e-portfolios, as lifelong learning becomes more of a reality and employers seek evidence of job seekers' current skills and competence.

Bite sized Learning

A further response to the changing demands of the workplace and the need for new skills and competences is “bite-sized” learning through very short learning modules. A linked development is micro-credentialing, be it through Digital Badges or other forms of accreditation.

Learning Spaces

As ICT is increasingly adopted within education there will be a growing trend for redesigning learning spaces to reflect the different ways in which education is organised and new pedagogic approaches to learning with ICT. This includes the development of “makerspaces”. A makerspace is a collaborative work space inside a school, library or separate public/private facility for making, learning, exploring and sharing. Makerspaces typically provide access to a variety of maker equipment including 3D printers, laser cutters, computer numerical control (CNC) machines, soldering irons and even sewing machines.

Augmented and Virtual Reality

Despite the hype around Augmented Reality (AR) and Virtual Reality (VR), the present impact on education appears limited although immersive environments are being used for training in TVET and augmented reality applications are being used in some occupational training. In the medium-term mixed reality may become more widely used in education.

Wearables

Similarly, there is some experimentation in the use of wearable devices for instance in drama and the arts but widespread use may be some time away.

Blockchain

Blockchain was developed for storing cryptocurrencies and is attracting interest from educational technologists. A blockchain is basically a secure ledger allowing the secure recording of a chain of data transactions. It has been suggested as a solution to the verification and storage of qualifications and credentials in education, and even for recording the development and adoption of Open Educational Resources. Despite this, usage in education is presently very limited and there are quite serious technical barriers to its development and wider use.
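To make the ledger idea concrete, here is a minimal sketch of hash-chained credential records in Python. It illustrates only the tamper-evidence principle; it is not a distributed system and bears no relation to any production blockchain, and the credential fields are invented:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents, which include the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_credential(chain: list, credential: dict) -> None:
    """Add a credential, chaining it to the hash of the previous block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"credential": credential, "prev_hash": prev}
    block["hash"] = block_hash({"credential": credential, "prev_hash": prev})
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash; tampering with any earlier record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        expected = block_hash({"credential": block["credential"], "prev_hash": prev})
        if block["prev_hash"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain: list = []
append_credential(chain, {"holder": "A. Learner", "award": "Level 3 Diploma"})
append_credential(chain, {"holder": "A. Learner", "award": "Open Badge: Welding"})
print(verify(chain))                     # True: chain is intact
chain[0]["credential"]["award"] = "PhD"  # tamper with an early record
print(verify(chain))                     # False: verification now fails
```

This is also where the practical barriers show: the ledger proves a record has not changed, but says nothing about whether the issuing institution should be trusted in the first place.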

Artificial Intelligence

In research undertaken for this report, a number of interviewees raised the importance of Artificial Intelligence in education (although a number also believed it to be over hyped).

A recent report from the EU Joint Research Centre (2018) says that:

“in the next years AI will change learning, teaching, and education. The speed of technological change will be very fast, and it will create high pressure to transform educational practices, institutions, and policies.”

It goes on to say AI will have:

“profound impacts on future labour markets, competence requirements, as well as in learning and teaching practices. As educational systems tend to adapt to the requirements of the industrial age, AI could make some functions of education obsolete and emphasize others. It may also enable new ways of teaching and learning.”

However, the report also considers that “How this potential is realized depends on how we understand learning, teaching and education in the emerging knowledge society and how we implement this understanding in practice.” Most importantly, the report says, “the level of meaningful activity—which in socio-cultural theories of learning underpins advanced forms of human intelligence and learning—remains beyond the current state of the AI art.”

Although AI systems are well suited to collecting informal evidence of skills, experience, and competence from open data sources, including social media, learner portfolios, and open badges, this creates both ethical and regulatory challenges. Furthermore, there is a danger that AI could actually replicate bad pedagogic approaches to learning.

The greatest potential of many of these technologies may be for informal and non-formal learning, raising the challenge of how to bring together informal and formal learning and to recognise the learning which occurs outside the classroom.

Data and the future of universities

August 2nd, 2018 by Graham Attwell

I’ve been doing quite a lot of thinking about how we use data in education. In the last few years two things have combined – the computing ability to collect and analyse large datasets, allied to the movement by many governments and administrative bodies towards open data.

Yet despite all the excitement and hype about the potential of using such data in education, it isn’t as easy as it sounds. I have written before about issues with Learning Analytics – in particular that it tends to be used for student management rather than for improving learning.

With others, I have been working on how to use data in careers advice, guidance and counselling. I don’t envy young people today trying to choose a university or college course and career. Things got pretty tricky with the great recession of 2009: I think just before the banks collapsed we had been putting out data showing that banking was one of the fastest growing jobs in the UK. Add to unstable economies and labour markets the increasing impact of new technologies such as AI and robotics on future employment, and it is very difficult for anyone to predict the jobs of the future. And the main impact may well be not so much in new occupations emerging, or occupations disappearing, but in the changing skills and knowledge required in different jobs.

One reaction to this from many governments, including the UK's, has been to push the idea of employability. To make their point, they have tried to measure the outcomes of university education. But once more, just as student attainment is used as a proxy for learning in many learning analytics applications, pay is being used as a proxy for employability. Thus the Longitudinal Education Outcomes (LEO) survey, an experimental survey in the UK, uses administrative data to measure the pay of graduates after 3, 5 and 10 years, per broad subject grouping, per university. The trouble is that the survey does not record where graduates are working. And one thing we know for a certainty is that pay in most occupations in the UK varies greatly between regions. The LEO survey presents a wealth of data, but it is pretty hard to make any sense of it. A few things stand out. First, UK labour markets look pretty chaotic. Secondly, there are consistent gender disparities for graduates of the same subject group from individual universities. Thirdly, prior attainment before entering university seems a pretty good predictor of future pay, post graduation. And we already know that prior attainment is closely related to social class.
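The missing location data matters because a single national figure can mask large regional differences. A small sketch with entirely invented pay figures for one hypothetical subject group shows the effect:

```python
from statistics import median

# Entirely invented illustrative figures: graduate pay (GBP) for one subject
# group, split by region -- the kind of breakdown LEO does not publish.
pay = {
    "London":     [34000, 36000, 38000],
    "North East": [24000, 25000, 27000],
}

all_grads = [p for salaries in pay.values() for p in salaries]
print(f"national median: {median(all_grads)}")   # sits between the two regions
for region, salaries in pay.items():
    print(f"{region}: median {median(salaries)}")
```

On these made-up numbers the national median lands nowhere near either region's typical pay, which is exactly the distortion a subject-by-university table without location invites.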

A lot of this data is excellent for research purposes, and it is great that it is being made available. But the collection and release of different data sets may also be ideologically determined, in terms of what we want potential students to be able to find out. In the same way, the collection of particular data is designed to give a strong steer to the directions universities take in planning for the future. It may well be that a broader curriculum and more emphasis on process and learning would most benefit students. Yet the steer towards employability could be seen to encourage a narrower focus on the particular skills and knowledge employers say they want in the short term, and to inhibit the wider debates we should be having around learning and social inclusion.

 

Proxies, learning, deschooling society and annotation

May 11th, 2018 by Graham Attwell

Sipping a glass of wine on the terrace last night, I thought about writing an article about proxies. I've become a bit obsessed with proxies, ever since looking at the way learning analytics seems so often to equate learning with achievement in examinations.

But then by chance this morning I ended up looking at the text of Ivan Illich's 1971 book 'Deschooling Society'. And I found that in the first chapter Illich talks about how we "confuse teaching with learning, grade advancement with education, a diploma with competence, and fluency with the ability to say something new."

He goes on to say that pupils' "imagination is 'schooled' to accept service in place of value. Medical treatment is mistaken for health care, social work for the improvement of community life, police protection for safety, military poise for national security, the rat race for productive work. Health, learning, dignity, independence, and creative endeavour are defined as little more than the performance of the institutions which claim to serve these ends, and their improvement is made to depend on allocating more resources to the management of hospitals, schools, and other agencies in question."

This seems an apposite comment on how the use and analysis of big data is being developed in the present period.

I stumbled on the Illich quote from a Twitter link to an exercise on the CLMOOC ('let's be creative together') website. They ask "What would Ivan Illich think about CLMOOC?" and go on to suggest "we find activities like this all the more enjoyable and enriching when a variety of voices join the conversation. So this is an open invitation to the internet to join us as we use Hypothes.is to annotate an online copy of Deschooling Society together."

I had not seen Hypothes.is before, but it looks pretty nifty. I have never understood why collective annotation has never quite taken off. It seems to me a great format for sharing and developing knowledge together. And I think Illich would have liked it.
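One nice thing about Hypothes.is is that public annotations are accessible over its open search API, so a page's annotation layer can be read programmatically. A minimal sketch using only the standard library; the endpoint and the `uri`/`limit` parameters are from the public Hypothes.is API, but the example page URL is made up and the response fields beyond `rows`, `user` and `text` are not explored here.

```python
import json
import urllib.parse
import urllib.request

API = "https://api.hypothes.is/api/search"

def build_search_url(page_url, limit=20):
    """Build a query against the public Hypothes.is search endpoint."""
    return f"{API}?{urllib.parse.urlencode({'uri': page_url, 'limit': limit})}"

def fetch_annotations(page_url, limit=20):
    """Return (annotator, annotation text) pairs for public annotations
    anchored to the given page."""
    with urllib.request.urlopen(build_search_url(page_url, limit)) as resp:
        data = json.load(resp)
    return [(row["user"], row.get("text", "")) for row in data["rows"]]

# Example usage (requires network access; the URL is hypothetical):
# for user, text in fetch_annotations("https://example.com/deschooling-society"):
#     print(user, ":", text[:80])
```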

Designing Learner Dashboards

May 2nd, 2018 by Graham Attwell


The UK Jisc are really good at producing online reports of workshops and meetings (something I am not!). This is one of the presentations from the Student Experience Experts Group meeting, one of two events held every year to share the work of the student experience team at Jisc and to offer opportunities for feedback and consultation on current activities. The Jisc web page provides a brief summary of the meeting and all of the presentations. I picked this one, by Liz Bennett from the University of Huddersfield, because how to design dashboards is an issue that perplexes me at the moment.

Student satisfaction unrelated to learning behaviour and academic performance

March 13th, 2018 by Graham Attwell

I seem to spend a lot of time lately moaning about bad data practices. About approaches to learning analytics which appear to be based on looking at what data is available and then trying to work out what the question is. And particularly about the different proxies we use for learning.

So I particularly liked the report in THE of the inaugural lecture by Professor Rienties at the UK Open University's Institute of Educational Technology. Professor Rienties outlined the results of a study that examined data on 111,256 students across 151 different modules at his institution. He found that student satisfaction, one of the most commonly used proxies for learning and achievement, is "unrelated" to learning behaviour and academic performance. According to THE:

Significantly higher student satisfaction was found in modules in which students received large amounts of learning materials and worked through them individually, than in courses where students had to collaborate and work together.

However, the best predictor for whether students actually passed the module was whether there were collaborative learning activities, such as discussion forums and online tuition sessions.

Students who were “spoon-fed” learning materials also spent less time in the virtual learning environment, were less engaged, and were less likely to remain active over time than their peers engaged in more collaborative activities.