A Local Adaptation in an Output-Based Research Support Scheme (OBRSS) at University College Dublin

Introduction

University College Dublin (UCD), Ireland's largest university, has operated the Output-Based Research Support Scheme (hereafter "OBRSS") since 2016. Adapted from the Norwegian model, the OBRSS rewards individual academic staff using a points system based on their publications and the number of doctoral students they supervise. A major difference between the Norwegian model and the OBRSS is therefore that the Norwegian model is designed to allocate block grants to universities (Schneider, 2009; Sivertsen, 2016), whereas the OBRSS rewards academic staff individually. It should also be noted that the OBRSS is implemented as a university initiative rather than as a component of a national performance-based research funding system as described in Hicks (2012) and Zacharewicz, Lepori, Reale and Jonkers (2018).

In this article, we first describe the design and implementation processes of the OBRSS, including the creation of the ranked publication list and points system, as well as the infrastructure requirements. We then present some results of the OBRSS, focusing on the coverage of publications in the OBRSS ranked publication list and in Scopus, together with some data about spending. Finally, we discuss challenges such as evaluating the OBRSS in terms of fairness, transparency, and effectiveness.

Design and Implementation Processes

The overarching objective of the OBRSS is to incentivise academic staff to publish research output in higher quality outlets. The principles of the OBRSS, defined at the outset, are as follows:

Fair—Academic staff should be actively involved in its creation and define its methodology.

Transparent—Metrics and data used in the scheme are based on accessible and reproducible data.

Easy to understand & implement—Academic staff can play a part in performance improvement.

Underpinned by the strategic objectives—The scheme reinforces the objectives of the University Strategy 2015–2020.

Rewards excellence—The scheme is designed to encourage research excellence.

The design involved the construction of a ranked publication list and a points system. The ranked publication list includes journals, conferences, and monographs; the ranking is based on a basket of indicators including the Norwegian Register for Scientific Journals, Series and Publishers (NSD—National Centre for Research Data, 2018), the Danish BFI (Ministry of Higher Education and Science, 2018), the Finnish Publication Forum (Federation of Finnish Learned Societies), SNIP (Source Normalised Impact per Paper), and CiteScore. Academic staff from across the university are consulted in finalising the ranked publication list each year; they are also requested to update their publication records in the Current Research Information System (CRIS) so that points can be calculated. Only publications with a status of 'Published' in the CRIS are included in the OBRSS. PhD supervision records are maintained in the institutional Student Information System (SIS).

The OBRSS uses the ranked publication list—one section for Publishers and another for Series (Journals, Book Series, and Conference Series)—as the reference for the calculation of points. Each publication channel is assigned one of two levels: Level 1 'Normal' or Level 2 'Prestigious'. Weighted scores are then applied to each publication. Similar to the Norwegian model, points are allocated per type of publication, as summarised in Table 1.

Table 1: Points allocation per publication type.

Publication type | Points Level 1 'Normal' | Points Level 2 'Prestigious'
Book | 5 | 8
Journal Article | 1 | 3
Book Chapter | 1 | 3
Conference Publication | 0.5 | 2
Edited Book | 1 | 3
Other Publication | 0.5 | 2
Published Report | 1 | 3

Figure 1: OBRSS points.

There is a consultation process to ensure that input from academic staff is considered in finalising the ranked publication list. During the consultation period, academic staff can recommend adding publication channels to, or removing them from, the ranked publication list at either of the two levels. The suggestions and recommendations are reviewed by the Office of Research Administration. Given the objectives and scope of the OBRSS, external panels are not used to review the ranked publication list.

Publication points are calculated for each academic staff member's publications in the CRIS over a three-year period (for example, 2015–2017) using the following formula (Table 2):

Table 2: Calculation of publication output points.

Publication output points = B × C × F × N, where

B = base points, allocated according to the type of publication and whether the channel is 'Normal' (Level 1) or 'Prestigious' (Level 2);

C = collaboration factor: 1.25 if there are any international authors on the paper, otherwise 1;

F = UCD author factor: 0.7 if there are two UCD academic staff on the paper, 0.6 if there are three, 0.5 if there are four or more, otherwise 1;

N = large-collaboration factor: 0.1 if the total number of authors on the paper exceeds 100, otherwise 1.
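To make the calculation concrete, the following is a minimal sketch of the formula in Python. The function name, the dictionary layout, and the way a publication is represented are illustrative assumptions, not the scheme's actual implementation:

```python
# Base points per publication type (Table 1):
# (Level 1 'Normal', Level 2 'Prestigious').
BASE_POINTS = {
    "Book": (5, 8),
    "Journal Article": (1, 3),
    "Book Chapter": (1, 3),
    "Conference Publication": (0.5, 2),
    "Edited Book": (1, 3),
    "Other Publication": (0.5, 2),
    "Published Report": (1, 3),
}

def publication_points(pub_type: str, prestigious: bool,
                       international_coauthors: bool,
                       ucd_authors: int, total_authors: int) -> float:
    """Publication output points = B x C x F x N (Table 2)."""
    b = BASE_POINTS[pub_type][1 if prestigious else 0]  # B: base points
    c = 1.25 if international_coauthors else 1.0        # C: collaboration factor
    if ucd_authors >= 4:                                # F: UCD author factor
        f = 0.5
    elif ucd_authors == 3:
        f = 0.6
    elif ucd_authors == 2:
        f = 0.7
    else:
        f = 1.0
    n = 0.1 if total_authors > 100 else 1.0             # N: >100-author factor
    return b * c * f * n

# A journal article in a Prestigious Level 2 channel, with international
# co-authors and two UCD staff among 12 authors: 3 x 1.25 x 0.7 x 1 = 2.625
print(publication_points("Journal Article", True, True, 2, 12))
```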

The total publication points for an individual are the sum of the points for each of their publications in the three-year period. PhD supervision points are calculated by counting the number of PhD students supervised by each academic staff member in the current academic year, with two points awarded per student for acting as either a primary or a secondary supervisor. PhD supervision points are capped at 20. Each point, whether for publications or for PhD supervision, is worth €35.

All academic staff are automatically entered into the OBRSS each year. The total points that an academic staff member has accumulated are communicated in a personalised points statement. Final points statements are issued to academic staff receiving an award in October each year. The minimum value threshold for a research award is €200. There is no maximum research award, but in the first two years of operation, the largest awards, based on the maximum points accumulated by an individual author, were between €10,000 and €15,000.
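Building on the sketch above, the translation of points into an annual award might look as follows. The €35 point value, the two points per supervised student, the 20-point supervision cap, and the €200 threshold are taken from the text, while the function itself is only an illustration:

```python
POINT_VALUE_EUR = 35   # each publication or supervision point is worth €35
SUPERVISION_CAP = 20   # maximum PhD supervision points per year
MIN_AWARD_EUR = 200    # minimum value threshold for a research award

def annual_award(pub_points: list[float], phd_students: int) -> float:
    """Award in euro for one staff member; 0 if below the €200 threshold."""
    total_pub = sum(pub_points)                         # three-year window
    total_sup = min(2 * phd_students, SUPERVISION_CAP)  # 2 points per student
    award = (total_pub + total_sup) * POINT_VALUE_EUR
    return award if award >= MIN_AWARD_EUR else 0.0

# Example: 13.625 publication points plus 4 supervised PhD students
# gives (13.625 + 8) x 35 = EUR 756.875.
print(annual_award([2.625, 8.0, 3.0], 4))
```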

Awards may be used by the academic staff for research support, such as to cover travel expenses, office supplies, equipment, and laboratory supplies. Overall, approximately 1% of the total annual research budget for the university is allocated to the OBRSS.

Results from the First Two Years

The first two years of the OBRSS have provided valuable data for understanding research activities in the university. On the one hand, the more complete record of publications is essential for deliberating research strategies; on the other, the spending pattern gives insights into the types of activities and resources that academic staff consider important for supporting their research. The following sections present a comparison of the coverage of publications in Scopus and in the OBRSS ranked publication list, changes in the publications reported and in the ranked publication list over the first two years, as well as some data about spending so far.

Coverage of Scopus and OBRSS

One of the most significant outcomes of the implementation of the OBRSS is a more complete picture of publication records at University College Dublin. The number of academic staff updating their research profiles in the CRIS has increased each year: in the first year of the OBRSS, 85% of academic staff updated their profiles, compared with 75% over the previous three years.

Using the publication records in the CRIS, we can compare the coverage of research outputs per School and College in Scopus and in the OBRSS ranked publication list. Although many international university ranking organisations use either Scopus or Web of Science as a data source for evaluating the research performance of an institution, these data sources only work well for STEM (Science, Technology, Engineering & Mathematics) disciplines, where coverage can be as high as 94% (Physics). For Arts & Humanities disciplines, coverage of outputs varies between 2% (Irish, Celtic Studies and Folklore) and 18% (English, Drama & Film), as shown in Table 3.

Table 3: Comparison of publications for academic staff only, from 2013 to 2017 inclusive; Scopus data from SciVal, 25 May 2018; CRIS data from UCD RMS Profiles, 22 June 2018.

UCD School Name (Discipline) | Scopus Total 2013–2017 | CRIS Total 2013–2017 | % Coverage in Scopus
Agriculture & Food Science | 868 | 1,085 | 80.0%
Archaeology | 73 | 202 | 36.1%
Architecture, Planning and Environmental Policy | 127 | 580 | 21.9%
Art History & Cultural Policy | 12 | 141 | 8.5%
Biology & Environmental Science | 448 | 572 | 78.3%
Biomolecular & Biomedical Science | 468 | 522 | 89.7%
Biosystems and Food Engineering | 470 | 665 | 70.7%
Business | 459 | 1,047 | 43.8%
Chemical & Bioprocess Engineering | 250 | 266 | 94.0%
Chemistry | 391 | 452 | 86.5%
Civil Engineering | 205 | 470 | 43.6%
Classics | 5 | 53 | 9.4%
Computer Science | 767 | 916 | 83.7%
Earth Sciences | 149 | 390 | 38.2%
Economics | 147 | 186 | 79.0%
Education | 83 | 178 | 46.6%
Electrical & Electronic Engineering | 730 | 860 | 84.9%
English, Drama & Film | 78 | 415 | 18.8%
Geography | 86 | 324 | 26.5%
History | 40 | 291 | 13.7%
Information & Communication Studies | 61 | 149 | 40.9%
Irish, Celtic Studies and Folklore | 3 | 145 | 2.1%
Languages, Cultures and Linguistics | 57 | 367 | 15.5%
Law | 52 | 495 | 10.5%
Mathematics & Statistics | 460 | 617 | 74.6%
Mechanical & Materials Engineering | 478 | 919 | 52.0%
Medicine | 1,867 | 2,451 | 76.2%
Music | 7 | 126 | 5.6%
Nursing, Midwifery & Health Systems | 241 | 534 | 45.1%
Philosophy | 98 | 246 | 39.8%
Physics | 1,325 | 1,403 | 94.4%
Politics & International Relations | 124 | 321 | 38.6%
Psychology | 277 | 622 | 44.5%
Public Health, Physiotherapy and Sports Science | 747 | 977 | 76.5%
Social Policy, Social Work and Social Justice | 128 | 437 | 29.3%
Sociology | 46 | 267 | 17.2%
Veterinary Medicine | 626 | 1,094 | 57.2%
Grand Total | 12,453 | 20,785 | 59.9%

In total, the UCD CRIS records approximately 4,000 publication records per year for academic staff, compared with approximately 2,500 per year in Scopus; see Table 4 below:

Table 4: Comparison of publications for academic staff only, from 2013 to 2017 inclusive; Scopus data from SciVal, 25 May 2018; CRIS data from UCD RMS Profiles, 22 June 2018.

UCD College Name | Scopus Total 2013–2017 | CRIS Total 2013–2017 | % Coverage in Scopus
College of Arts & Humanities | 228 | 1,556 | 14.7%
College of Business | 468 | 1,046 | 44.7%
College of Engineering & Architecture | 2,384 | 3,691 | 64.6%
College of Health and Agricultural Sciences | 4,017 | 6,021 | 66.7%
College of Science | 3,925 | 5,075 | 77.3%
College of Social Sciences & Law | 1,227 | 3,487 | 35.2%
Grand Total | 12,249 | 20,876 | 58.7%
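The coverage figures in Tables 3 and 4 are simply the Scopus count expressed as a percentage of the CRIS count; a quick sketch of the calculation, using two rows from Table 4:

```python
# (Scopus total, CRIS total) per College, 2013-2017, from Table 4.
counts = {
    "College of Arts & Humanities": (228, 1556),
    "College of Science": (3925, 5075),
}

for college, (scopus, cris) in counts.items():
    print(f"{college}: {100 * scopus / cris:.1f}% coverage in Scopus")
# College of Arts & Humanities: 14.7% coverage in Scopus
# College of Science: 77.3% coverage in Scopus
```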

The distribution of publication output in the OBRSS shows that coverage is consistent across Colleges, apart from the College of Science. The anomaly (Figure 2) is due to the large volume of papers produced in the School of Physics through international collaborations. A single paper in Physics can have up to 5,000 authors (Castelvecchi, 2015), so a small number of academic staff in the School of Physics produce a large volume of publications in both Prestigious Level 2 and Normal Level 1 channels.

Figure 2: Shares in percent of total output per College in the OBRSS categories: Prestigious Channel – Level 2; Normal Channel – Level 1; Not recognised in OBRSS publication list.

Trends in Publications and Research Activities

It is expected that the ranked publication list in the OBRSS provides some guidance in the choice of publication outlets. Whilst any increase can be due to many factors and would require careful examination, the initial results from the first two years of the OBRSS show some evidence that academic staff are choosing to publish in higher ranked outlets (Table 5).

Table 5: Number of publications per OBRSS category, per scheme year.

OBRSS categories | 2016 Scheme (Publications 2013–2015) | 2017 Scheme (Publications 2014–2016) | Difference | % Difference
Prestigious Channel – Level 2 | 4,230 | 4,444 | 214 | 5.1%
Normal Channel – Level 1 | 4,267 | 6,323 | 2,056 | 48.2%
Not recognised in OBRSS publication list | 4,515 | 3,202 | −1,313 | −29.1%
Grand Total | 13,012 | 13,969

In the second year of implementation, there was a small increase (5%) in the reported number of publications in Prestigious Level 2 channels, while a much larger increase (48%) was noted in Normal Level 1 channels. At the same time, there appeared to be less publishing activity (−29%) in channels not recognised by the OBRSS. While these figures are indicative, a trend cannot be established: first, the OBRSS has only been in operation for two years; second, academic staff may have altered the publications reported in the CRIS in the second year based on the outcome of the first.

Comparability is also affected by changes in the Prestigious Level 2 and Normal Level 1 channels between 2016 and 2017. As can be seen in Tables 6 and 7 below, the number of Prestigious Level 2 channels was reduced while the overall number of ranked publication channels remained approximately stable.

Table 6: Number of ranked journal, conference, and book series channels per OBRSS category, per scheme year.

Journal list | 2016 | 2017 | Difference | % Difference
Prestigious Channel – Level 2 | 4,485 | 3,958 | −527 | −11.8%
Normal Channel – Level 1 | 38,544 | 39,128 | 584 | 1.5%
Grand Total | 45,045 | 45,103 | 58 | 0.1%

Table 7: Number of ranked publisher channels per OBRSS category, per scheme year.

Publisher list | 2016 | 2017 | Difference | % Difference
Prestigious Channel – Level 2 | 265 | 257 | −8 | −3.0%
Normal Channel – Level 1 | 2,190 | 2,200 | 10 | 0.5%
Grand Total | 2,455 | 2,457 | 2 | 0.1%
Research Funding and Spending

Since its implementation in 2016, over €1.3m in new research funding has been allocated to academic staff to support their research activities. The number of recipients and the average award value both increased from 2016 to 2017.

Figure 3: Number of award recipients per college and average award value.

Interestingly, over 50% of the awardees have not spent their research support funds at all. Of those who have used their funds, claims have been made for travel, office supplies, equipment, and laboratory supplies:

41% of the funds spent were used to cover travel inside and outside the EU, including accommodation, transport, and subsistence expenses;

25% of the funds spent were used to buy office supplies. Examples of the items purchased are books, subscriptions to journals, staff training courses, website design, and copy-printing;

11% of the funds spent were used to buy equipment, such as PCs, laptops, peripherals, and laboratory or office furniture;

8% of the funds spent were used to purchase laboratory supplies, such as chemicals, parts, disposables, glassware & plastics and other general supplies.

Challenges

Implementing the OBRSS requires sound infrastructure, including a Current Research Information System that supports reporting on publication coverage, trends, and spending patterns. Substantial resources were also needed to create and maintain the ranked publication list. Indeed, the construction of the ranked publication list and points system are the foundational steps in making the OBRSS work. When compiling the publication list, suggestions and recommendations from academic staff are essential for gauging the completeness of the list and the appropriateness of the ranks assigned. However, disciplinary differences are sometimes difficult to reconcile: a publisher may be considered prestigious in one discipline but normal in another. Some suggestions also fall into specialised areas where the publication channel is not indexed in Scopus and is not included in the Danish, Finnish, or Norwegian lists. Deciding whether to include or exclude such a channel can be taxing when balancing the credibility and fairness of the list. Nevertheless, as the ranked publication list is updated every year, it is expected that its scope and rankings will be adjusted to reflect quality, impact, and disciplinary norms.

Another challenge is to evaluate the effectiveness of the OBRSS with respect to its objective of increasing publications in high quality outlets. The main reason is that many factors can contribute to changes in publication trends, for example, the research areas of new staff members, national and EU funding priorities, and so on. Any assessment of the OBRSS's effectiveness would therefore be inconclusive, even though the scheme is certainly a contributing factor in steering research outputs. There is also a risk that the OBRSS could result in a higher number of publications at the Normal Level 1 than at the Prestigious Level 2, as Butler (2003, 2004) suggests in her studies of Australia. However, there is evidence from Norway that Level 2 publishing may increase as well (Schneider et al., 2015), which is just as likely at UCD since the amount of the award is rather small. The fact that less than 50% of the awardees have spent their funds is an indication that the award does not provide a strong enough incentive to game the system. It is expected that the ranked publication list will be seen as a guide to high quality publication channels and will alter preferences for outlets accordingly, although whether academic staff are extrinsically motivated by the scheme will require further investigation.

The OBRSS can also be used in ways unintended by the objectives of the scheme. At present, heads of school are provided with the points statements of academic staff in their unit. It is not clear whether and how the incentives might trickle down (Aagaard, 2015) or how the scheme might influence perceptions of university management and policy (Liefner, 2003; Woelert & Yates, 2015). There have been reports that the points have been used for self-evaluation as well as for comparison by academic staff and heads of school. The "constitutive effects" (Dahler-Larsen, 2014) of the scheme will also demand further investigation in the future.

Conclusion

This paper summarises the first two years of implementation of the OBRSS and some data about the coverage of publications and spending patterns. Whilst there have been questions concerning the fairness, transparency, effectiveness, and efficiency of the scheme, the OBRSS has also received encouraging and positive feedback. Since most funding schemes, both within the university and those offered by national and EU funding agencies, are competitive, faculty appreciate the award of discretionary funds. The less positive responses to the scheme largely reflect dissatisfaction or disagreement with the ranked publication list. With the consultation process in place, it is hoped that the list will be updated to reflect quality, impact, and disciplinary norms. Research outputs and publications will be analysed regularly, with the understanding that the OBRSS's contribution to any observed changes cannot be isolated conclusively.

It should also be noted that the points system of the OBRSS is not intended to be used as a tool for research assessment, much less as its sole criterion. Since the OBRSS points system does not capture the impact and quality of all kinds of research output, it does not necessarily reflect an individual's research performance, particularly for those whose research outputs are tailored to and useful for local audiences such as policy makers and businesses. An analysis of research outputs not included in the OBRSS ranked publication list would provide some insights into notions of impact beyond publications.

Nevertheless, the OBRSS has set an example of an output-based support scheme in Ireland. Two other universities are currently considering implementing similar schemes. It is not yet known, however, whether they will adapt the Norwegian model when constructing their publication lists and points systems.

Based on the experiences of the adaptation of the Norwegian model (Aagaard, Bloch, & Schneider, 2015), it will be a few years before publication trends and effects of the OBRSS can be identified and analysed. Nevertheless, the implementation has already provided insight into current publication patterns and preferences.
