This week marks the end of a seven-year hiatus as the results of the sixth and final Research Assessment Exercise (RAE) are due to be announced. The fate of academic careers and university finances hangs in the balance. For the higher education sector, Thursday’s announcement is as momentous as election night: careers can be made or lost overnight. The RAE has also been accused of causing misery for academics and of distorting research and university life. As well as revealing the quality of British research, the results will determine how more than £1.5bn of no-strings public funding is allocated to universities each year from 2009. A new and improved “Research Excellence Framework” (REF) is planned to replace the RAE in 2013. Despite attempts to stop them, universities have played games to win the research race: some have excluded staff they thought would not score highly, while others have poached the best researchers in deals reminiscent of the football transfer market. This week they will find out whether their gambles paid off, though funding decisions based on the results will not be made until March next year.
Panels of approximately 1,100 academics, covering 67 subjects and overseen by 15 main panels to ensure parity, have spent the past year reviewing research papers to judge their peers’ work. In evaluating submissions, the panels considered the quality of the research, the environment in which it was produced, and the esteem in which the researchers are held; the weighting of these components in the overall score will not be made available until March. Research was rated on a scale of 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised) or 0 (sub-standard). Each university submitted a maximum of four pieces of research per full-time academic, and the ratings were aggregated to show what percentage of its research fell into each band. The result is a “quality profile” of each university’s research strengths: the proportion of its work that is world-beating versus sub-standard.
Observers expect more turbulence in the league tables that newspapers, including The Guardian, will inevitably draw up than there was after the previous exercise. There are bound to be surprises. In contrast to the single overall grade of 5*, 5, 4, 3a, 3b, 2 or 1 given to departments in 2001, the quality profiles produced on Thursday will show 1* and 2* research mingling with Nobel prize-winning work at the likes of Cambridge, Oxford, Manchester and Warwick.
In general, more departments are expected to slump in the rankings than to soar this time around, because the new marking system will expose poor research. Institutions with many departments that scraped a 5* last time are likely to drop down once grade point averages are calculated from their profiles. Vice-chancellors are already speculating about who might overtake whom in Thursday’s results. Some suspect Imperial College London will cede its place in the rankings to University College London. There will also be keen interest in how the University of Manchester performs after its 2003 merger, two years after the last RAE.
Despite the anticipation surrounding Thursday’s results, the real story will come in March, when funding allocations based on them are announced. Funding will be handed out per full-time-equivalent member of staff submitted. If, say, half of a university’s physics research is rated 4*, a quarter 3* and the last quarter 2*, funding will be distributed in those proportions rather than according to a department’s overall grade, as it was in 2001. Last time, a department needed only half its staff’s work to be of 5* standard to win an overall 5* rating, yet it then received funding at the full 5* rate.
Last month, there was controversy when the Higher Education Statistics Agency (HESA) abandoned a data collection that would have revealed the extent to which universities concealed staff from the RAE. The collection was dropped because HESA and Hefce used mismatched definitions of which staff were eligible.
To compensate, our tables aim to provide a proxy measure of the “intensity” of research, based on the latest staff figures submitted to HESA by universities. This gives an idea of which universities can truly claim to be leading in research, and which may have weaknesses or something to hide.
When the REF replaces the RAE in 2013, funding allocations will be based on metrics rather than panel reviews by academics. The aim is to reduce the vast amount of time universities spend preparing for and carrying out the RAE. Instead, the REF will rely primarily on counting the research papers published and measuring their impact, along with the number of research students and the grant income departments manage to secure.
Peer review will still play a part, however, in judging research in the arts, humanities and social sciences. So although the RAE is ending, its replacement aims to evaluate research, and distribute funding, more efficiently.