

About the REF

The Research Excellence Framework (REF) is the successor to the Research Assessment Exercise (RAE), which was first run in the UK in 1986, primarily as a means of collecting the information on which decisions about the allocation of research funding to higher education institutions (HEIs) could be made. A further four RAE exercises followed, in 1992, 1996, 2001 and 2008. Each successive exercise represented an evolution of the last, placing an increasing burden - and hence an increasing cost - on the institutions submitting to it, on the experts who made up the peer review panels, and on HEFCE, which was responsible for administering the process.

In March 2006, as part of the Budget, the Government published a discussion paper, Science and Innovation Strategy 2004-2014: next steps. This stated that "recognising some of the burdens imposed on universities by the existing Research Assessment Exercise (RAE), the Government has a firm presumption that after the 2008 RAE the system for assessing research quality and allocating “quality-related” (QR) research funding to universities from the Department for Education and Skills [then the Department for Innovation, Universities and Skills, now the Department for Business, Innovation and Skills] will be mainly metrics-based" and announced a consultation on this proposal.

The consultation, Reform of Higher Education Research Assessment and Funding, closed in October 2006. During the consultation process, for example, the Arts and Humanities Research Council and HEFCE established an expert group to examine the use of research metrics in the humanities, and various bodies with an interest in the proposals, not least HEIs themselves, had the opportunity to submit their views. The Government then announced, in the December 2006 Pre-Budget Report, a new framework for research assessment, which would be "a single system ... apply[ing] to all institutions and across all disciplines ... [with] differences in the applicability of current metrics across disciplines ... For science, engineering, technology (SET) and medicine a combination of research income, postgraduate research student data and a bibliometric indicator of quality will be used to assess research. The process will be overseen by seven advisory groups with representation from UK academics, research users and international advisors... For all other disciplines, including mathematics and statistics, there will be a significantly reduced, light-touch peer review process informed by a range of discipline-specific indicators. This will be substantially less onerous for universities." The intention at this stage was to undertake an assessment of SET subjects during the 2009-10 academic year, the results of which would be phased into funding allocations from September 2010; and for all other subjects, for an assessment to take place during the 2013-14 academic year, to determine funding allocations from 2014.

HEFCE was charged by the government with developing these plans into a working process, and it spent much of 2007 developing its proposals, launching a Consultation on the Assessment and Funding of Higher Education Research post-2008 in November of that year. The key aims of what it named the Research Excellence Framework (REF) as outlined were "to produce robust UK-wide indicators of research excellence for all disciplines which can be used to benchmark quality against international standards and to drive our funding for research; to provide a basis for distributing funding primarily by reference to research excellence, and to fund excellent research in all its forms wherever it is found; to reduce significantly the administrative burden on institutions in comparison to the RAE; to avoid creating undesirable behavioural incentives; to promote equality and diversity; and to provide a stable framework for our continuing support of a world-leading research base within HE." As with the Government's consultation, the key stakeholders were able to offer their views on the proposals for consideration. The consultation closed in February 2008 and, as previously, a summary of the results was made available.

The outcome of the consultation encouraged HEFCE to modify its plans, which it outlined in a circular letter to institutions in May 2008. While development of the REF would continue, it was decided that there would be a "unified framework" covering all subjects, rather than a split between STEM and non-STEM disciplines; that this "unified framework ... would combine bibliometrics and other quantitative indicators with light-touch peer review within a variable geometry of assessment. Bibliometric indicators of research quality will be a key element in quality assessment wherever this is appropriate, with light-touch review of research outputs operating where it is not. In all cases we will use other indicators of quality and impact appropriate to the disciplinary area and determined after advice from expert panels." The timetable for the development and testing of the REF proposals would be extended by 12 months. This also meant a slight delay for the implementation of REF, which would now "take place during calendar year 2013 to drive quality-related research funding from academic year 2014-15. For subjects where bibliometric indicators play a leading role in quality assessment, these will start to influence HEFCE funding phased in from 2011-12. A full exercise to produce bibliometric indicators of research quality will therefore take place, for appropriate subjects, in 2010."

The next stage was a pilot exercise, which ran from July to December 2008 and involved 22 higher education institutions from across the UK, largely working with data collected for the RAE augmented by further information collected directly from Web of Science and SCOPUS. It was designed to "explore which subjects should use bibliometric indicators under the new framework, assess which categories of staff and publications should be included in future bibliometric exercises; test the main sources of citation data and any requirements for cleaning the data before analysis (we will explore the use of both the Web of Science and SCOPUS); develop the process for collecting and managing bibliographic data; develop and test methods for analysing citations and benchmarking against international norms; identify our preferred means of constructing the indicator in the form of a citation profile, and how this can be used within the REF; and explore what supplementary information the process can usefully generate". An interim report on the pilot exercise was published in June 2009, and a final report on the bibliometrics pilot exercise was published in September.

The pilot exercise concluded that "bibliometrics are not sufficiently robust at this stage to be used formulaically or to replace expert review in the REF. However there is considerable scope for citation information to be used to inform expert review. The robustness of the bibliometrics varies across the fields of research covered by the pilot, lower levels of coverage decreasing the representativeness of the citation information. In areas where publication in journals is the main method of scholarly communication, bibliometrics are more representative of the research undertaken." Broadly speaking, the pilot demonstrated that in some science-based subjects citation data was reliable enough to be a useful tool to aid peer review, but in many other subjects it was not, thereby confirming the value of peer review as the only real means of assessing the quality of research. HEFCE's plans for "a sector-wide bibliometrics process in an appropriate range of subjects" in 2010 were therefore abandoned. Problems had already been recognised - HEFCE had acknowledged that "not all institutions will [...] have systems in place that will have systematically recorded all relevant research outputs" - and the intention had been that "this first sector-wide bibliometrics process starting in 2010 will be developmental to some extent" and used "to inform a small element of funding only if found to be sufficiently robust." The pilot results must therefore have been something of a relief.

While the bibliometrics pilot was underway, of course, the crisis in the financial sector came to a head and triggered the economic downturn. In this climate, it is not surprising that questions began to be asked about whether publicly-funded research undertaken in the UK was not merely of high quality, but was also having an impact - whether in economic, social, public policy, cultural or quality of life terms - and it was therefore decided that the impact of research should also form part of the assessment. To this end, HEFCE began a pilot exercise on impact to test and inform an approach to the assessment of research impact.

In late September 2009, HEFCE published its proposals on all of the key features of the Research Excellence Framework, as part of its Second Consultation on the Assessment and Funding of Research. This suggested a framework in many ways much more similar to the 2008 Research Assessment Exercise than previous proposals, but with some significant differences, most intended to reduce the burden of the exercise. These proposals included simplification of staff categories, most notably the removal of Category C staff; supplying generic templates for narratives on the research environment, and aligning data requirements with those of HESA and other bodies so that institutions would not have to compile data more than once; and significantly reducing the 67 units of assessment used in RAE 2008, proposing 30 new units amalgamating the old. They also proposed to allow unit of assessment panels to determine the use they would make of bibliometrics in informing their judgements; introduced the assessment of impact alongside outputs and environment, suggesting that impact might be assessed on the basis of the submitting department and relate to research carried out by it over the preceding ten years; and provided definitions of quality on the unclassified to 4* scale, both overall and for the outputs and impact sub-profiles (but left panels to determine the environment sub-profile). HEIs and other stakeholder groups were invited to comment on the proposals, with the consultation closing in mid-December 2009. The outcomes of that consultation were published in March 2010.

The outcome of that consultation broadly upheld the proposals HEFCE had made, but a significant minority of responses expressed concerns with the proposals for the assessment of impact (though not with the principle of assessing impact itself). Following the 2010 General Election, David Willetts MP, the Minister for Universities and Science, made it clear that he shared the concerns raised about the assessment of impact and, following discussions with HEFCE, announced in a speech at the Royal Institution on 9 July 2010 that the REF would be delayed by a further year, to allow more time for the impact element of the assessment to be considered and developed. Later in the same month, HEFCE began its recruitment process for unit of assessment (UoA) panel chairs and members, alongside confirmation of the reduction in the number of UoA panels from 67 in RAE 2008 to 36 for the REF.

HEFCE published the report of the main panels involved in assessing impact during the pilot exercise in November 2010, which made a number of key points and recommendations. Impact, which retained a broad definition around social, economic, cultural, environmental, health and quality of life benefits, was considered to be assessable and should therefore form part of the REF. It was confirmed that impact purely within academic disciplines would remain outside the scope of the exercise, but additional ways of conceiving of impact were recognised, around the benefits of public engagement activity and around impact as changes that have come about which have benefited, or contributed to, the economy and society. Impact should be actual rather than anticipated or future impact, based on high quality research carried out up to fifteen years previously, though panels would have some scope to extend this period where appropriate to their discipline. Impact would be assessed through the submission of case studies, while the previously suggested impact summary was recast as a section of the environment element of the submission covering institutional support for generating impact. A number of clarifications and improvements to the guidance and templates used for the pilot exercise were suggested, to help institutions put forward more meaningful examples.

Final decisions on the assessment of impact in the REF, made by the four Funding Councils for higher education, were published in March 2011. These endorsed the findings of the pilot exercise and set the weighting of impact in the first REF at 20%, alongside outputs at 65% and environment at 15%. They also confirmed that the impact weighting would rise in future rounds of the REF.
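To illustrate how these weightings translate into an overall quality profile, the short sketch below (in Python) combines hypothetical outputs, impact and environment sub-profiles using the 65/20/15 split announced above. The sub-profile figures are invented purely for illustration, and the weighted-average calculation is an assumption about how the weightings would be applied, not a statement of HEFCE's published method.

```python
# Illustrative sketch only: combining hypothetical REF sub-profiles into an
# overall quality profile using the announced weightings (outputs 65%,
# impact 20%, environment 15%). The figures below are invented, and the
# weighted-average approach is an assumption made for illustration.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

# Each sub-profile: percentage of activity judged at each level, 4* down to unclassified.
sub_profiles = {
    "outputs":     {"4*": 30, "3*": 40, "2*": 20, "1*": 8, "u/c": 2},
    "impact":      {"4*": 25, "3*": 45, "2*": 25, "1*": 5, "u/c": 0},
    "environment": {"4*": 50, "3*": 30, "2*": 15, "1*": 5, "u/c": 0},
}

levels = ["4*", "3*", "2*", "1*", "u/c"]
overall = {
    level: sum(WEIGHTS[element] * profile[level]
               for element, profile in sub_profiles.items())
    for level in levels
}

for level in levels:
    # The five weighted percentages together sum to 100.
    print(f"{level}: {overall[level]:.1f}%")
```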

HEFCE published general guidance on submissions on 14 July 2011, and produced draft panel criteria and working methods on 29 July. After considering the results of a consultation process on the latter (which closed on 5 October 2011), final versions of the panel criteria and working methods were published on 30 January 2012.

The first REF assessment will take place during 2014, and a deadline for submissions of 29 November 2013 has been set.

More information

This is a potted history drawn from a number of government and HEFCE publications, many of which are linked to in the main text. For further information, see HEFCE's REF website.

This page will be periodically updated. Last updated January 2012. 

Latest News


29 January 2013: UEL’s Code of Practice on the Preparation of Submissions and Selection of Staff for REF 2014 is available here

More REF News and Updates
