
Methodology for MRC portfolio analysis 

The MRC reports widely on the breadth and outcomes of MRC-funded research. Information on the MRC award portfolio and its impact appears in government and other publicly available reports, news articles, our blog and many other MRC publications. In this section, we explain the methodologies and approaches we use to collect, assess and present data about MRC-funded programmes and their outputs. In general, our data analyses follow similar approaches using two main datasets:

  1. Grant management data – all grant and financial data on awards made by the MRC are accessed via dedicated grant management systems. All requests for data on awards made via a specific board/panel/scheme/initiative, including all expenditure and commitment data, are handled by the MRC information and analysis team
  2. researchfish® – the majority of data on outputs and impacts are obtained via the MRC researchfish® database

In addition, the MRC gathers information for analysis or validation from other sources, including citation data and PhD student records.

The MRC uses a range of classification systems, such as the Health Research Classification System (HRCS), to define portfolios of awards associated with certain research themes or areas within the larger ‘all MRC’ portfolio.


Portfolio analysis

In general, our reports focus on the MRC as a whole: reporting across all MRC-funded research since 2006 or in a given period. However, we may also wish to report on specific research topics or dedicated award schemes. In these cases, we provide quantitative data from a sub-set of awards, commonly referred to as an award portfolio. Portfolios are defined by disease area, by academic field or by scheme objective through the use of bespoke search criteria. These can be MeSH or other text analysis terms, or the UKCRC Health Research Classification System (HRCS).

Health Research Classification System (HRCS)

The MRC uses the Health Research Classification System (HRCS) to classify the main focus of all awards. The HRCS is a two-dimensional framework, and coding from both HRCS dimensions (Health Categories and Research Activities) is applied to each award.

In the total MRC portfolio, the majority (60–70 per cent) of the awards are classified as ‘discovery science’ by their HRCS coding to Aetiology and/or Underpinning.

Most of the remainder of the awards are focused on research into Treatment Development, Detection and Diagnosis, Treatment Evaluation or Prevention.

Portfolios of MRC awards can be identified by HRCS classification (e.g. all awards associated with a given ‘research activity’) or by MRC funding initiative (e.g. Stratified Medicine) or by combining the two grouping systems (e.g. Prevention and NPRI awards).
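
To make this concrete, the sketch below (in Python) shows how such a combined portfolio might be selected. It is illustrative only: the award records and field names (‘hrcs_activity’, ‘initiative’) are hypothetical, not the MRC’s actual grant data schema.

    # Illustrative only: records and field names are hypothetical.
    awards = [
        {"id": "MR/A001", "hrcs_activity": "Prevention", "initiative": "NPRI"},
        {"id": "MR/A002", "hrcs_activity": "Aetiology", "initiative": None},
        {"id": "MR/A003", "hrcs_activity": "Prevention", "initiative": None},
    ]

    # Combine the two grouping systems, e.g. Prevention awards funded under NPRI.
    portfolio = [a for a in awards
                 if a["hrcs_activity"] == "Prevention" and a["initiative"] == "NPRI"]
    print([a["id"] for a in portfolio])  # -> ['MR/A001']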


researchfish® 

Since 2009, researchers receiving funding from any MRC initiative have been required to report all research outcomes and outputs annually to researchfish®. All data in researchfish® are self-reported, based on a standardised question set that researchers are asked to complete. Outputs and outcomes include publications and academic collaborations, new products, tools and technologies that advance understanding, engagement activities and evidence for any policy influence arising from research results. Researchers are also expected to report on economic impacts that may arise, such as obtaining further funding for continued research, commercialising intellectual property and establishing spin-outs. As different outputs and impacts take differing periods of time to develop, researchers are required to continue to report outputs for a minimum of five years from the end date of the award, and are requested to continue reporting for longer if new outcomes arise that are associated with the research.

We analyse the researchfish® data from portfolios at both the individual grant level (e.g. the number of instances an output was reported for each grant) and at the level of the portfolio (e.g. publications per year), using distinct methodologies. Researchers provide details of the outputs arising from their work and attribute each output to the award from which it arose. However, a single output, for example a publication or a new collaboration, might have arisen from more than one award and therefore be attributed to multiple awards. Such multiple attributions can lead to duplication when assessing aggregated data. In aggregated analyses at the portfolio level, these outputs are de-duplicated as far as possible, using methods that depend on the type of output. Usually, de-duplication relies on system-generated codes that indicate when a researcher has attributed an output to more than one grant. This may not de-duplicate outputs if different researchers enter similar information independently of one another, so supplemental information is used where available. For example, PubMed identification numbers or Digital Object Identifiers (DOIs) can be used to generate unique sets of records for publications, while the duration, amount of money and funding organisation are used to de-duplicate reports of further funding. In our reporting, the aggregated de-duplicated outputs are sometimes referred to as the number of ‘unique’ outputs.
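
A minimal sketch of this de-duplication logic, in Python, is shown below. The records and field names are invented for illustration; the real researchfish® data model differs.

    # Invented records: the same paper attributed to two awards shares a DOI
    # and a system-generated output code.
    publications = [
        {"award": "MR/001", "doi": "10.1000/abc", "system_id": "out-1"},
        {"award": "MR/002", "doi": "10.1000/abc", "system_id": "out-1"},
        {"award": "MR/003", "doi": None,          "system_id": "out-9"},
    ]

    def dedup_key(record):
        # Prefer a stable external identifier (DOI or PubMed ID) where available,
        # falling back to the system-generated code that marks one researcher
        # attributing an output to several grants.
        return record["doi"] or record["system_id"]

    unique = {dedup_key(r): r for r in publications}
    print(len(publications), "attributions ->", len(unique), "unique outputs")
    # 3 attributions -> 2 unique outputs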

In addition to de-duplication, we also conduct data validation to ensure researchfish® data are as robust as possible. For example, outputs are removed from analyses if the researcher indicated that they occurred before the start of the funding for their award. When reviewing time-based analyses, it is important to note that researchers are asked to indicate only the year (as opposed to the month and the year) of their outputs. As a grant may begin at any point during the year, outcomes identified as occurring within the first year of the grant may have occurred within one month of its start date if the grant begins in December, or within almost two years if the grant begins in January and the outcome does not occur until December of the following year (e.g. January 2012 – December 2013). The MRC also performs further data validation exercises throughout the year, for example to confirm that reports of spin-outs truly reflect new start-up companies rather than new commercial ventures with established biotech firms. However, it is important to note that all data in researchfish® are self-reported and, given the thousands of new outputs reported each year, we cannot validate every reported output.
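
The sketch below illustrates one such validation rule, dropping outputs dated before the start of the award, at the year granularity that researchfish® actually records. All data in the example are invented.

    from datetime import date

    # Invented award and outputs for illustration.
    award_start = date(2012, 1, 15)
    outputs = [
        {"name": "paper A", "year": 2011},    # predates the award: removed
        {"name": "paper B", "year": 2012},    # first 'year': could fall anywhere
                                              # from ~1 month to ~2 years in
        {"name": "dataset C", "year": 2014},
    ]

    # Only the year is reported, so the comparison is at year granularity.
    valid = [o for o in outputs if o["year"] >= award_start.year]
    print([o["name"] for o in valid])  # -> ['paper B', 'dataset C']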

Researchers can enter, amend and update information in researchfish® all year round, but the MRC requires researchers to submit a return in the system once a year to confirm that their output data are up to date and accurate. However, this continuous, retrospective reporting process means that reported outcome figures for any given period may change as researchers continue to ‘back-fill’ information. Hence the research output data for previous years will become increasingly comprehensive as time passes and more reports are added.

It is also important to note that there will be some variation in analyses between reporting periods, as modifications to the researchfish® question sets, data processing and cleaning (de-duplication, disambiguation etc.) and changes in coding practice will affect some data outputs. Therefore, the precise figures for researchfish® data presented in one MRC report may differ from those reported in a previous year.


Analysis of researchfish® outputs

Standard approaches to assessing researchfish® data

Data from researchfish® are used across a number of different MRC reports and, while there are some differences in how the data are displayed, a number of key principles remain constant across all reports:

  • All figures reported are validated (where possible) and, when using aggregated data, we present ‘unique’ numbers of de-duplicated outputs to best reflect the available data.
  • Percentages generated from researchfish® data for MRC reports are rounded up or down to the nearest whole number. Therefore, some values may appear as zero if they represent less than half of one per cent, and not all tables may sum to 100 per cent because of rounding.
  • Where instances of further funding are reported in currencies other than Pounds Sterling, the values are converted using an average exchange rate for each calendar year, as reported on OANDA (see the sketch after this list).
  • ‘New’ awards refer to new commitments made within the specified period, whereas ‘active’ awards refer to all awards still active (i.e. incurring spend) within the specified period.
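
As an illustration of the rounding and currency conversion conventions above, consider the following Python sketch. The category counts and exchange rates are invented placeholders, not real MRC figures or OANDA rates.

    # Rounding percentages to whole numbers: small categories can round to
    # zero, and rounded rows may not sum to exactly 100.
    counts = {"Aetiology": 412, "Prevention": 3, "Other": 610}  # invented
    total = sum(counts.values())
    for category, n in counts.items():
        print(category, round(100 * n / total), "per cent")

    # Converting further funding to Pounds Sterling using an average annual
    # exchange rate (placeholder values).
    avg_rate_gbp_per_usd = {2015: 0.65, 2016: 0.74}
    amount_usd, year = 250_000, 2016
    print("GBP value:", amount_usd * avg_rate_gbp_per_usd[year])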

Analysis of individual output types

Publications

In the grant-level analysis, where more than one grant claims to have contributed to a publication, each is credited equally in the analysis. As a result, several thousand publications are counted multiple times in the all MRC portfolio. In the analysis of the aggregated outputs of the portfolio, publications have been de-duplicated through the use of unique identifiers (PubMed ID, DOI or other identifier) for each publication.
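
The following sketch illustrates the difference between the two counting levels using invented attributions: a paper attributed to two awards is credited once per award at the grant level, but only once in the aggregated, de-duplicated count.

    # Invented (award, DOI) attributions.
    attributions = [("MR/001", "10.1000/abc"),
                    ("MR/002", "10.1000/abc"),   # same paper, second award
                    ("MR/001", "10.1000/xyz")]

    # Grant level: each attribution credits its award.
    grant_level = {}
    for award, doi in attributions:
        grant_level[award] = grant_level.get(award, 0) + 1
    print(grant_level)                            # {'MR/001': 2, 'MR/002': 1}

    # Aggregated portfolio level: de-duplicate by unique identifier.
    print(len({doi for _, doi in attributions}))  # 2 unique publications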


Open access: The MRC has an open access policy that requires electronic copies of any research papers, reviews, or conference proceedings that have been accepted for publication in a peer-reviewed journal to be made freely available from Europe PubMed Central. This policy applies to research that is supported in whole or in part by MRC funding. All deposited publications must be made freely accessible from Europe PMC as soon as possible or at least within six months of the journal publisher’s official date of final publication.

Due to time lags in publishing, ID assignment and Europe PMC processing, one would expect lower absolute numbers of publications and lower proportional compliance in the most recent year, and that these figures would increase by the next data gathering period.

We will work with Europe PMC to obtain further information about whether these papers were openly accessible within six months of publication, and to filter our results with respect to publication types that have to comply with the open access policy.

Collaboration

Principal Investigators were asked to provide information about their collaborators. These responses were then coded for the country and sector (public, private, etc.) of the collaborator to allow an analysis of the number of international MRC collaborations. The frequency is indicative of collaborations, not collaborators; if three different MRC researchers each report a collaboration, that is counted as three collaborations even if the partner is the same in each case.
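
The sketch below illustrates this counting convention with invented data: three researchers reporting the same partner yield three collaborations but only one unique partner.

    # Invented collaboration reports.
    reports = [
        {"pi": "Researcher 1", "partner": "University X", "country": "DE"},
        {"pi": "Researcher 2", "partner": "University X", "country": "DE"},
        {"pi": "Researcher 3", "partner": "University X", "country": "DE"},
    ]
    print("collaborations:", len(reports))                           # 3
    print("unique partners:", len({r["partner"] for r in reports}))  # 1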

Further funding

Different areas of science have different costs associated with them and therefore both the scale and diversity of external funding are of interest. To accommodate these two factors, the analysis is broken down into two parts: instances of Further Funding and the value of that funding.

If a Principal Investigator reported two instances of Further Funding with the same funder, for the same amount, and with the same start and end dates, it was assumed that this was double reporting and only one of the two instances was counted. Many researchers reported receiving further funding from the MRC but, upon further investigation, the overwhelming majority of these reports related to the project’s core funding rather than Further Funding. Therefore, all instances of further funding reported as having been made by the MRC were excluded from this analysis, including renewals of the MRC grants included in this analysis.
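
The sketch below illustrates the two filters with invented reports: exact double reports collapse to a single instance, and instances where the reported funder is the MRC itself are excluded.

    # Invented (funder, amount, start, end) reports.
    reports = [
        ("Wellcome Trust", 500_000, "2014-01-01", "2016-12-31"),
        ("Wellcome Trust", 500_000, "2014-01-01", "2016-12-31"),  # double report
        ("MRC", 750_000, "2015-01-01", "2019-12-31"),             # core/renewal: excluded
    ]
    deduped = set(reports)  # identical tuples collapse to one instance
    further_funding = [r for r in deduped if r[0] != "MRC"]
    print(len(further_funding), "instance(s) counted")  # 1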

The estimated amount of Further Funding spent is calculated by taking the total amount of Further Funding reported and pro-rating it to the portion of the funding period that falls within the period covered by DGP8 (2006-2016) and during which the grant was active. For example, if an investigator reported £100,000 of funding from 1 April 2015 until 1 April 2017, it is estimated that 50 per cent of this grant, or £50,000, will have been spent by 1 April 2016.
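
A minimal sketch of this pro-rating calculation, assuming spend is linear over the life of the grant, reproduces the worked example above (day-based counting makes the figure very slightly over 50 per cent).

    from datetime import date

    # Figures from the worked example above; the cut-off is the end of the
    # reporting period.
    amount = 100_000
    start, end = date(2015, 4, 1), date(2017, 4, 1)
    cutoff = date(2016, 4, 1)

    elapsed = (min(cutoff, end) - start).days
    duration = (end - start).days
    estimated_spend = amount * max(0, elapsed) / duration
    print(f"£{estimated_spend:,.0f}")  # -> roughly £50,000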

Further Funding by country of funder is represented on the maps by circles, with each circle’s size representing the amount of Further Funding reported from that country. Further Funding sources identified as ‘Global’ are also listed. The scale is noted at the bottom left of each map.

Next destination

The next destination section asks the Principal Investigator of a research project to report the next employment role, and the sector of the next employer, of team members on leaving the research group.

Research materials

Research materials include databases, data analysis techniques, cell lines, in vitro and in vivo models of mechanisms or symptoms, and new equipment created as a result of research. Such materials have considerable potential for re-use by other researchers in future applications and are therefore a highly beneficial output of MRC-funded research.

Until 2013, these outputs were reported in researchfish® as a single ‘Research Materials’ output type; they were then subdivided into ‘Databases and Models’, ‘Software and Technical Products’, and ‘Tools and Methods’.


Bibliometric / Citation data

Publication outputs reported in researchfish® are verified through PubMed Central, and the aggregated list of unique publication identification numbers is sent for bibliometric analysis to a commercial provider. Citation impact data, normalised by year and subject area, for all publications associated with MRC funding have historically been provided by either Elsevier or Clarivate Analytics.

Elsevier provided a field-weighted citation impact (FWCI) for each publication reported by an MRC researcher. The FWCI divides the number of citations received by a publication by the average number of citations received by publications in the same field, of the same type, and published in the same year. The indicator is always defined against a world average baseline of 1.0. For the calculation of subject area, or field of research, for the FWCI, a scheme encompassing more than 300 subjects based on the Scopus journal classification has been used.
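
The sketch below shows the calculation for a hypothetical paper with an invented baseline: 24 citations against a comparable-publication average of 12 gives an FWCI of 2.0, twice the world average of 1.0.

    # Invented figures for one publication.
    citations = 24
    baseline = 12.0  # average citations for publications of the same field,
                     # type and year (hypothetical)
    fwci = citations / baseline
    print(fwci)  # 2.0 -> cited twice as often as the world average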

Clarivate provided a field-normalised citation impact (NCIf) score for each publication reported by an MRC researcher. The NCIf accounts for both the field of research and the year of publication. Therefore, an NCIf score of 1 is considered the global average for publications in a given field and year.


Student data

Records of MRC studentships on the Joint Electronic Submission (Je-S) database are provided directly by research organisations receiving MRC studentship programme funding. Please note that these data include MRC Advanced Course Masters, Doctoral Training Partnership (DTP) and CASE PhD studentships, but may not include intramural and certain Centre studentships.

Research councils obtain submission data on students via an annual submission survey completed by the student’s host research organisation. See the Je-S handbook on PhD submission data for more details. Students on research council studentships are encouraged to complete their studies by an expected submission date. At the MRC, expected submission is defined as ‘no more than six months after the funding end date’, with the typical duration of studentship funding between three and four years. However, submission of a thesis can also be affected by career breaks, changes in research direction, changes in supervisory arrangements and other situations outside of the student’s or research organisation’s control. As such, the submitting research organisation can adjust the expected submission date to accommodate such changes.

The Higher Education Funding Council for England (HEFCE)’s Destinations of Leavers from Higher Education (DLHE) survey provides all research councils with information on where our PhD students move on to. The DLHE data provide information about patterns of employment and further study for all graduates six months after they complete their studies. The survey is a condition of funding for HEFCE-supported higher education institutions (HEIs) in England, which individual HEIs must fund and administer themselves, using materials provided by the Higher Education Statistics Agency (HESA). As such, the data provided by HESA to the research councils on their PhD students are limited to those students who completed the survey, and so may not account for all studentships in our portfolio.
