originally: http://www.oclc.org/programs/ourwork/researchinfo/workflows/default.htm

Workflows in Research Assessment Program

Total projects: 1


Problem statement:

"The essential game of each university is the reputation game; the essential game of each researcher is competition."
— Professor Sijbolt Noorda (Chairman, VSNU—the Dutch Association of Universities). LIBER Conference presentation, Istanbul, July 2008.

Universities proclaim their dedication to excellence. It is almost impossible to find a university that does not, and the evidence is usually displayed prominently in mission statements. In a competitive environment for higher education, these claims demand to be tested, and the measures used to test them are themselves becoming more stringent and more methodologically robust. Gaining an excellent reputation, whether across an entire institution or in one or more disciplines, is difficult, and often the product of decades or even centuries of effort. Finding new niches in which to build reputation is an objective of institutions seeking either to raise their overall standing or to increase their income through improved reputation.

This concern with institutional reputation is mirrored in a concern with individual academic reputation by researchers themselves. Careers are built upon reputation, acquired through peer respect for one's work as shown in the award of prizes, leadership roles within a discipline and, of course, publications in the best venues. Individual reputation fuels corporate reputation.

Recently, in countries where the accountability of public expenditure on research has become a high-profile matter (such as the UK and Australia), public frameworks for the assessment of higher education research have emerged. These rely, among other measures, on some evaluation of the quality of published output. Largely because of the high cost of sustaining these frameworks through peer review of significant samples of research outputs from large numbers of institutions, there is controversial discussion about moving parts of this evaluation from a peer-review process to one based on metrics such as citation analysis. Whatever balance is struck, research libraries will potentially have critical roles to play in compiling evidence for the measurement regimes that emerge.

At the same time, the environment in which research outputs are developed, delivered, assessed and preserved has changed significantly because the internet has widened the scope of research publishing: it is now relatively simple both for researchers to publish their own outputs and for research funders to demand visibility of those outputs. Open Access publication of research outputs, alongside their publication in peer-reviewed journals, has become an issue not merely of efficiency in research lifecycle management, but also of political and economic import in a system in which research is financed largely from public money.

Two competing pressures are therefore being exerted upon the bodies within universities which record and preserve their research profiles. The first is the pressure to live up to claims of excellence in validly measurable ways that permit education 'consumers' to make informed choices between institutions, subject departments and research groups. The second is the pressure to report findings and publish results instantaneously, in the form of research articles (even if only preprints of commercially published articles). These pressures conflict with each other in ways that expose the roles and missions of different agents within institutions.

The library is one of these agents. On the one hand, it is expected to work with administrative units, such as central and College- or School-level research offices, to compile accurate and effective research records (embracing publications alongside inputs such as research student numbers and grant funds awarded) that give institutions a competitive advantage in the 'game' of building and maintaining reputation; on the other, it is expected to provide tools for the rapid development and publication of research outputs in an open access environment.

Good practice in this area has not emerged internationally, although some countries have developed systems suited to their own research funding regimes; these are likely to come under review as the competition for reputation intensifies across borders. For libraries, some fundamental understandings and clarifications still need to be reached. Libraries have no clear or well-understood way of describing this set of related issues yet, let alone of identifying their ramifications and considering the responses which they should make, individually and collectively.

Impact: This is a developing area in which some early work to establish the state of the art could have useful impact, and should suggest further lines of work. Practice is diverse, but we believe there is considerable opportunity for libraries. Furthermore, it is an area where a comparative perspective would be valuable, given the emerging national research assessment frameworks.

Projects

For more information

John MacColl
European Director, RLG Programs
john_maccoll@oclc.org