PLE2010 – reflections on the review process

April 25th, 2010 by Graham Attwell

A quick update in my series of posts on our experiences in organising the PLE2010 conference. We received 82 proposals for the conference – far more than we had expected. The strong response, I suspect, was due to three factors: the interest in PLEs in the Technology Enhanced Learning community, the attraction of Barcelona as a venue, and our success in using applications like Twitter to publicise the conference virally.

Having said that – in terms of format it seems to me that some of the submissions made as full conference papers would have been better suited to other formats. However, present university funding requirements demand full papers and inhibit submissions of work in progress or developing ideas in more appropriate formats.

For the last two weeks I have been organising the review process. We promised that each submission would be blind reviewed by at least two reviewers. For this we are reliant on the freely given time and energy of our Academic Committee. And whilst reviewing can be a learning process in itself, it is time consuming.

Submissions have been managed through the open source EasyChair system, hosted by the University of Manchester. The system is powerful, but the interfaces are far from transparent and the help somewhat minimalist! I have struggled to get the settings in the system right and some functions seem buggy – for instance, the function to show missing reviews seems not to be working.

Two lessons for the future seem immediately apparent. Firstly, we set a maximum abstract length of 350 words. Many of the reviewers have commented that this is too short to judge the quality of the submission.

Secondly, there is the fraught issue of criteria for the reviews. We produced detailed guidelines for submissions based on the Creative Commons licensed Alt-C guidelines.

The criteria were:

  • Relevance to the themes of the conference, although this does not exclude other high quality proposals.
  • Contribution to scholarship and research into the use of PLEs for learning.
  • Reference to the characteristics and needs of learners.
  • Contribution to the development of learning technology policy or theory in education.
  • Links that are made between theory, evidence and practice.
  • Appropriate reflection and evaluation.
  • Clarity and coherence.
  • Usefulness to conference participants.

However, when I sent out the papers for review, whilst I provided a link to those guidelines, I failed to copy them into the text of the emails asking for reviews. In retrospect, I should have attempted to produce a review template in EasyChair incorporating the guidelines.

Even with such explicit guidelines, there is considerable room for different interpretation by reviewers. I am not sure that in our community we have a common understanding of what might be relevant to the themes of the conference or a contribution to scholarship and research into the use of PLEs for learning. I suspect this is the same for many conferences: however, the issue may be more problematic in an emergent area of education and technology practice.

We also set a scale for scoring proposals:

  • 3 – strong accept
  • 2 – accept
  • 1 – weak accept
  • 0 – borderline
  • -1 – weak reject
  • -2 – reject
  • -3 – strong reject

In addition we asked reviewers to state their degree of confidence in their review ranging from 4, expert, to 0, null.
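
By way of illustration, and purely as a hypothetical sketch rather than anything EasyChair itself provides, the score and confidence could be treated as a pair of numbers and combined into a single confidence-weighted figure for each submission, along these lines:

    # Hypothetical sketch only: EasyChair does not expose reviews in this form.
    # Each review is a (score, confidence) pair, with the score on the -3..3
    # scale above and confidence on the 0 (null) to 4 (expert) scale.

    def weighted_score(reviews):
        """Return the confidence-weighted mean score, or None if every
        reviewer declared null confidence."""
        total_confidence = sum(confidence for _, confidence in reviews)
        if total_confidence == 0:
            return None
        return sum(score * confidence for score, confidence in reviews) / total_confidence

    # An 'accept' from an expert alongside a 'weak reject' from a low-confidence
    # reviewer still comes out clearly positive:
    print(weighted_score([(2, 4), (-1, 1)]))  # 1.4

Whether weighting reviews by confidence in this way would be fair to less experienced reviewers is, of course, another question.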

In over half the cases where we have received two reviews, the variation between the reviewers is no more than 1. But there are also a number of reviews with significant variation. This suggests significant differences in understanding by reviewers of the criteria – or of what the criteria mean. It could also just be that different reviewers have different standards.

In any case, we will organise a further review procedure for those submissions where there are significant differences. But I wonder whether the scoring process is the best approach. To have no scoring seems to be a way of avoiding the issue. I wonder if we should have scoring for each criterion, although this would make the review process even more complicated.
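
Picking out the submissions that need that further look is straightforward once the scores are in one place. A rough sketch, assuming the scores could be exported from EasyChair as a simple list per submission (the submission names and scores below are invented):

    # Hypothetical sketch: flag submissions whose reviews diverge by more than
    # a chosen threshold so they can be sent for a further review.
    reviews = {
        "submission-12": [2, 1],
        "submission-27": [3, -2],
        "submission-31": [0, -1],
    }

    def needs_further_review(scores, threshold=1):
        """True if the gap between the highest and lowest score exceeds the threshold."""
        return max(scores) - min(scores) > threshold

    for submission, scores in reviews.items():
        if needs_further_review(scores):
            print(f"{submission}: scores {scores} diverge, send for a further review")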

I would welcome any comments on this. Whilst it is too late for this conference, as a community we are reliant on peer review as a quality process, and collective learning and reflection may be a way of improving our work.

3 Responses to “PLE2010 – reflections on the review process”

  1. A couple of thoughts, as someone who did submit reviews (I know you’ve touched on some of this; I’m just offering comments, as requested).

    – the stated criteria, as listed in the post above, are actually longer than many of the abstract submissions. As such, the criteria were overkill for what was actually being evaluated.

    – the criteria do not reflect academic merit. They are more like a check-off list that a non-skilled intake worker could complete. The purpose of having academics do the review is that the academics can evaluate the work on its own merit, not against a check-off list.

    – the criteria reflect a specific theoretical perspective on the subject matter which is at odds with the subject matter. They reflect an instructivist perspective, and a theory-based (universalist, abstractivist) perspective. Personal learning environments are exactly the opposite of that.

    In other words, it is not appropriate to ask academic reviewers to bring their expertise to the material, and then to neuter that expertise with an overly prescriptive statement of criteria.

  2. Graham Attwell says:

    Thanks Stephen. I agree with your comments but still worry about how we get agreed meanings and standards for reviewers. Maybe we should be facilitating some kind of ongoing discussion between reviewers as part of the process?

