Archive for the ‘Initial assessment; the student experience’ Category

Students’ perspective

February 2nd, 2011 by Angela Rees

I wrote in my dissertation draft (Version 1037.2) that I was surveying students about their experience of initial assessment because I thought they were more likely to give candid responses than their lecturers.

I believe that academic staff would have answered my questions about assessment based on what they think they should say, what their organisation’s policy says is done, and what they think they’ve done, whereas a student would simply say “we did this” or “we didn’t do that”.

Now of course I have to find some evidence to back up my assumption.

Filed under: Initial assessment; the student experience

Digital storytelling stops plagiarism!

June 21st, 2010 by Graham Attwell

There’s an interesting aside in an article in today’s Guardian newspaper on the so-called problems of plagiarism. Why do I say so-called? Whilst I would agree that the practices of buying and selling essays are a problem, these practices have always gone on. When, many years ago in pre-internet days, I was a student at Swansea University, it was always possible to buy an essay in a bar. And I would also argue that a side benefit of cut-and-stick technologies is that standards of referencing in universities today are much higher than they were in my time as a student. Indeed, at that time you were expected to buy your tutors’ textbooks and to paraphrase (plagiarise) their work. Plagiarism is as much a social construct as it is a technological issue.

But coming back to today’s article, reporting on a three-day international conference on plagiarism at Northumbria University, the Guardian reports that “The conference will also hear that the problem of plagiarism at university could be reduced if students used ‘digital storytelling’ – creating packages of images and voiceovers – rather than essays to explain their learning from an imagined personal perspective.”

Phil Davies, senior lecturer at Glamorgan University’s computing school, said he had been using the technique for two years and had not seen any evidence of cheating. “Students find it really hard but it’s very rewarding, because they’re not copying and writing an essay, they have to think about it and bring their research into a personal presentation.”

Another approach is to focus on authentic assessment – or rather assessment of authentic learning tasks. In this case students are encouraged to use the internet for research but have to reflect on and re-purpose materials for reporting on their own individual research.

In both cases this goes beyond dealing with plagiarism – it is good practice in teaching and learning. And I wonder if that might be a better starting point for the efforts of researchers, developers and teachers.

Numeracy and Literacy initial assessments are apparently not happening

April 20th, 2010 by Angela Rees

After throwing out the non-completers and the time wasters, I have a set of 31 survey results. Admittedly this is not a huge sample, but since each result represents a student at a separate institution, some interesting patterns are emerging.

As expected, all of the students provided their university or college with their name, address and date of birth before their course commenced. All but one reported that they had also provided information about their existing qualifications, and the grades they had attained. All but one respondent provided their university or college with a personal statement.

The next most common information provided prior to the course starting was references: 17 respondents reported giving this information. Fewer than a third of respondents gave any further information before the course started, and even fewer gave any of the information once the course had commenced.

Information about numeracy ability was provided prior to the course starting in 8 cases and afterwards in 1. Information about literacy ability was provided prior to the start of the course in 7 cases and not at all once courses had started. This suggests that 23 of the colleges or universities sampled did not carry out initial assessment in numeracy or literacy. Either that, or 23 students didn’t realise they were being assessed!
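
As an aside, these counts come from nothing more sophisticated than tallying the survey responses. The sketch below shows, in Python, the kind of tally involved; the file name, column names and response wording are all hypothetical, since the raw survey export isn’t published here.

```python
# A minimal tallying sketch. Assumptions: a CSV export named
# "screening_survey.csv" with hypothetical columns "numeracy_assessment"
# and "literacy_assessment", whose values record when (if ever) the
# respondent says the information was provided.
import csv
from collections import Counter

with open("screening_survey.csv", newline="") as f:
    rows = list(csv.DictReader(f))            # one row per completed response

total = len(rows)                             # 31 after removing non-completers

# How many respondents reported each timing for each kind of assessment
numeracy = Counter(r["numeracy_assessment"] for r in rows)   # e.g. "before course": 8, "after": 1
literacy = Counter(r["literacy_assessment"] for r in rows)   # e.g. "before course": 7

# Respondents reporting no numeracy or literacy assessment at any point
never = sum(
    1 for r in rows
    if r["numeracy_assessment"] == "not at all"
    and r["literacy_assessment"] == "not at all"
)

print(total, dict(numeracy), dict(literacy), never)
```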

Filed under: Initial assessment; the student experience

Schoolboy errors

February 12th, 2010 by Angela Rees

Analysing the results of my dissertation survey, I realised that I’d forgotten to ask my respondents some basic information. At the time, age and sex didn’t seem like relevant information, but once the results were in and I discovered that I had around 60 postgrads with literacy problems, I started to question why: were they male or female? Was English even their first language? So here goes Mark 2 – http://www.surveymonkey.com/s/screeningstudents – fingers crossed!

Filed under: Initial assessment; the student experience

Information overload

June 12th, 2009 by Angela Rees

Every year, colleges and universities invest large amounts of time and money in the recruitment of students. The process can involve numerous steps, from the obligatory form filling, interviews, and conditional and unconditional offers to open day events and induction weeks. The purpose of these stages runs in two directions: the institution wants to fill its courses with suitable candidates, and the student wants to make sure that they have chosen a suitable institution. During this period, vast quantities of information change hands. I want to find out just how much of this information is used to shape the student learning experience.

Much of the data collected is not used to benefit the teaching or learning process; instead it is collated and used to produce statistics for institutions and to obtain funding. There may be little information in application and enrolment forms of any worth to subject teachers (they were never intended to be used in that way), but if that is the case then the whole process could be a missed opportunity.

Many institutions do collect Initial Assessment data, the theory being that a series of short tests can gauge a student’s level of literacy and numeracy, and that those falling below a certain level can then be offered further assistance. Having recently been at both ends of this process, as a teacher and a student, I found that Initial Assessment was merely a box-ticking exercise. I received no feedback about my own initial assessment, and I am still waiting, at the end of the school year, for the required further assistance to be granted to my own students.

I am curious to learn about other students’ experiences and hope to uncover a more positive view of initial assessment in the UK. I’m not asking the institutions because I want to focus on practice, not policy. I would also like to find examples of good practice and ways of improving the system for everyone.

Posted in Initial assessment; the student experience

Survey away!

June 8th, 2009 by Angela Rees

Finally got around to finishing the survey last week and tried it out on my sisters and a student friend (thanks, all), who suggested a few tweaks to the language.

The final version is here.

Now I’ve got to get it to as many students in the UK as possible. Hope they don’t mind breaking from revision for five minutes!

Posted in Initial assessment; the student experience

Initial Assessment Tools

May 22nd, 2009 by Angela Rees

The Search for Spock initial assessment tools:

The DfES recommends the following resources via its Read Write Plus campaign: http://www.dcsf.gov.uk/readwriteplus/

The Basic Skills Agency has produced “skills check” tools as a way of screening students to see if further initial assessment is required. http://www.excellencegateway.org.uk/toolslibrary

Cambridge Training and Development (CTAD) “Target Skills”: http://www.targetskills.net/ – not much info here, but it leads to publications pages.

West Nottinghamshire College has produced a Basic Keyskillbuilder, although the links all lead to a password-protected Moodle site where you can’t even log in as a guest user – so it’s no use for anyone wishing to follow examples of good practice.

The only useful one here so far is the Basic Skills Agency. Its available resources include the “Good Practice Guidelines” (http://www.excellencegateway.org.uk/pdf/Good_Practice_Guide.pdf), subtitled “for literacy, language and numeracy teachers, subject support staff and adult learner supporters”. Who are these guidelines designed to help?

“Skills for Life is an ambitious strategy that is designed to address literacy, language and numeracy needs of all adults and young people. It covers all post-16 learners on learning programmes at levels from pre-Entry up to and including Level 2.”

So not all adults and young people, then, as it doesn’t include Level 3+ courses?

New search:

“Initial assessment in HE” returned the following document, written in 2001: http://readingroom.lsc.gov.uk/pre2005/quality/goodpractice/initial-assessment-of-learning-and-support-needs-and-planning-learning-to-meet-needs.pdf – it highlights some good points (this is from 2001; I haven’t found anything more recent yet…)

The trend appears to be to assess basic skills – literacy and numeracy – upon commencing a course; those who need help with either are picked up and offered support, such as testing for dyslexia. There are relatively few references to screening mainstream students, and where they are screened it is only for “basic skills”.

Otherwise, existing qualifications are used as the starting point against which the gaining of new skills and knowledge is compared. But this doesn’t help determine what additional support the students need.

Posted in Initial assessment; the student experience

What questions to ask?

May 10th, 2009 by Angela Rees

The plan is to get the student perspective: find out what assessment is carried out and how the results benefit them. I have students who sat through literacy screening in September and still haven’t had the follow-up support they need eight months later. Maybe I work for a particularly bad example, or maybe initial assessment is just another hoop to jump through, but why bother collecting the data if you’re not going to use it for the benefit of your students?

If I use the Ofsted criteria as a guide, what questions should I be asking the students?

  • Specialist staff? Who carries out the assessment: course lecturer, personal tutor, specialist, online, self-assessment?
  • Appropriate time? Start of the course? Interview?
  • Starting point, learning plan? Is it relevant, are the targets attainable, is it updated regularly?
  • Screening for literacy/numeracy?
  • Results, feedback?
  • Appropriate support?

What tools already exist for evaluating an initial assessment policy? This one from QIA helps staff to discuss initial assessment: http://excellence.qia.org.uk/media/attachments/108534/Initial-Assessment-Tool-v6.pdf

I’m off to play around with surveymonkey.com

Posted in Initial assessment; the student experience

Searching for literature to review

May 10th, 2009 by Angela Rees

The majority of available literature on the topic of initial assessment consists of policy documents aimed at providers of ESOL and Basic Skills education. There have been a number of government initiatives in the UK aimed at improving standards in literacy and numeracy.

The QCA says that schools with half-decent admissions procedures are the ones which will adequately identify support needs: http://www.qca.org.uk/qca_7143.aspx

Ofsted (via www.excellencegateway.org.uk) recommends “STL” as an example of good practice… so off I went to Google STL (http://www.stltraining.co.uk/) and then emailed someone from STL to enquire about their assessment policy. One month on, I’ve heard nothing.

Anyway, the general guidance from Ofsted is that good practice in initial assessment must have:

  • Team approach to support
  • Detailed initial assessment
  • ILPs (individual learning plans)

What do their inspectors look out for?…

http://excellence.qia.org.uk/page.aspx?o=11F8361F-F327-4D61-BF27-43190802FECD

So now we know.

Posted in Initial assessment; the student experience
