Sunday, November 07, 2010

Evaluating Research by Publication

Last month I provided some notes from a forum on measuring research output at the Australian National University in Canberra (October 18, 2010). The transcripts of the presentations are now available:
  1. Overview: Research Excellence Evaluation, Colin Steele, Emeritus Fellow, ANU.
  2. The Australian Research Council - ERA National Perspectives, Andrew Calder, Director, Research Performance and Analysis, ARC.
  3. The ANU Experience, John Wellard, Director ANU Research Office.
  4. Perverse and Beneficial Outcomes From Research Assessment, Professor Andrew Cockburn, Director, College of Medicine, Biology and Environment, ANU.
  5. Scholarly Publishing Economics: an International Research Perspective, Professor John Houghton, Victoria University.
  6. Future Publishing Metrics – The Way Forward?, Dr Danny Kingsley, Manager Scholarly Communications, ANU.
  7. Research Impact – the Wider Dimension, Dr Claire Donovan, Lecturer in Sociology, ANU.
Also, Colin Steele has pointed out that the event "RESEARCH ASSESSMENT AND PUBLICATION METRICS - THE BEGINNING OR END OF AN ERA?" was organised by the ANU Emeritus Faculty, not by the university as a whole, as I had suggested.

Most of the "transcripts" are simply text, but Andrew Calder's "The Australian Research Council -ERA National Perspectives" is a set of slides. Here is a text version of the slides. Note that the FoR codes referred to are the Australian and New Zealand Standard Research Classification (ANZSRC) and "Scopus" is a list of journal articles and citations:



Excellence in Research for Australia

Andrew Calder

Director – Research Performance and Analysis

Australian Research Council

Australian Government


General ERA Principles

  1. Unit of Evaluation is the four‐digit ANZSRC Field of Research code (i.e. 157 possible Units of Evaluation); evaluation occurs at the two‐digit level as well
  2. Evaluation by Research Evaluation Committees in discipline clusters; eight clusters in total
  3. There is a minimum level of output for a discipline to be considered ‘research active’ for evaluation in ERA
  4. Evaluations informed by a ‘dashboard’ of discipline‐specific indicators
  5. Some peer review of outputs accessed through institutional repositories in some clusters
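
A minimal sketch of how the four-digit and two-digit levels in point 1 relate: a four-digit FoR code rolls up to its two-digit division by truncation. The mapping below is illustrative only and is not part of the ERA documentation; the division names are taken from the ANZSRC 2008 classification:

    # Illustrative only: a four-digit ANZSRC FoR code rolls up to its
    # two-digit division by taking the first two digits.
    DIVISIONS = {
        "08": "Information and Computing Sciences",
        "09": "Engineering",
    }

    def division_of(four_digit_code):
        """Return the two-digit division code for a four-digit FoR code."""
        return four_digit_code[:2]

    code = "0801"  # a hypothetical four-digit Unit of Evaluation
    print(code, "->", division_of(code), DIVISIONS.get(division_of(code), "unknown"))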


ERA Process Overview

Metrics Profiles 1–6, together with Peer Review (where included), feed into the Research Evaluation Committee, which produces the final report.

Note - There are no weightings!



Mythbusting ‐ Ranked Outlets

  • Only one of a number of unweighted indicators on the “Dashboard”
  • Ranked conferences are essential for ICT, Engineering & Built Environment
  • Note discipline‐specific practices


Role of the ranked outlets list in ERA

  • Ranked journals
  • Ranked conferences
    • FoR assignments
    • Ranks used for ranked journal/conference profile
    • FoR assignments used to derive discipline‐specific benchmarks for citation analysis

Developing the ranked journal list

Initial development by Learned Academies and Peak bodies

Public consultation (June‐Aug 2008)

Expert Review of the public feedback

Public feedback on omitted journals (Aug–Nov 2009)

Final expert review of consolidated list (Dec 2009 – Jan 2010)


Parameters for inclusion in the lists

  • academic/scholarly
  • publishes original research
  • peer reviewed or equivalent process
  • active during the ERA reference period (2003‐2008)
  • has an ISSN
  • able to withstand international scrutiny
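
As a rough sketch, most of these criteria can be read as a simple yes/no check over journal metadata; the last criterion (withstanding international scrutiny) is a judgment call and is not captured below. The record fields are invented for illustration, and the actual screening was done by expert review:

    # Illustrative sketch of the list-inclusion criteria as a predicate.
    # Field names are invented; "able to withstand international
    # scrutiny" is a judgment call and is not modelled here.
    def eligible_for_list(journal):
        return (journal["scholarly"]
                and journal["publishes_original_research"]
                and journal["peer_reviewed_or_equivalent"]
                and journal["active_2003_2008"]
                and journal["issn"] is not None)

    example = {
        "scholarly": True,
        "publishes_original_research": True,
        "peer_reviewed_or_equivalent": True,
        "active_2003_2008": True,
        "issn": "1234-5678",  # hypothetical ISSN
    }
    print(eligible_for_list(example))  # True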

The 2010 Ranked Outlet Lists

  • The 2010 lists are fixed for this ERA evaluation
  • ERA 2010 Ranked Journal List
    • 20,712 journals included
    • Over 88% agreement on ranks and FoR
  • ERA 2010 Ranked Conference List
    • 1,952 conferences included

ERA 2010

Background Statement

Volume and Activity
  • Staffing Profile
  • Research Outputs
  • Journal articles and listed conferences are apportioned against the FoR codes of the ERA Journal and Conference Lists.
  • All other outputs and eligible researchers can be assigned and apportioned to up to 3 four‐digit FoR codes of the institution’s choice.
  • Staff must be employed on the census date to count towards ‘FTE’.
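
As a rough sketch of what "apportioned" means in these rules: each output is split into fractions across at most three four-digit FoR codes, and the fractions sum to one. The codes and fractions below are invented, not ERA data:

    # Illustrative sketch of apportioning one output across FoR codes.
    # Codes and fractions are invented; the real submission rules are
    # set out in the ERA 2010 guidelines.
    def apportion(shares):
        """shares: four-digit FoR code -> fraction; at most 3 codes,
        fractions summing to 1."""
        if len(shares) > 3:
            raise ValueError("at most 3 four-digit FoR codes per output")
        if abs(sum(shares.values()) - 1.0) > 1e-9:
            raise ValueError("apportioned fractions must sum to 1")
        return shares

    # Example: a book chapter split 60/40 between two hypothetical codes.
    print(apportion({"1608": 0.6, "2002": 0.4}))
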
Ranked Outlets
  • Ranked Journals
  • Ranked Conferences
  • All disciplines use Ranked Journals as an indicator; only some FoRs use Ranked Conferences.
  • Where Ranked Conferences is applicable, institutions must apportion against the FoR codes of the ERA Conference list.
  • Where an output of a listed conference does not belong to the listed FoR, institutions can submit the output as a non‐listed conference and assign up to three FoRs of their choice.
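
A rough sketch of the listed versus non-listed conference rule just described; the conference name, list contents and FoR codes are all invented:

    # Illustrative sketch of the conference submission rule above.
    # Conference names and FoR codes are invented examples.
    CONFERENCE_LIST_FORS = {
        "Hypothetical Conference on Widgets": ["0801", "0806"],
    }

    def submit_conference_output(conference, output_fors):
        """Apportion against the list's FoR codes if the output belongs
        to one of them; otherwise submit as non-listed with up to three
        FoRs of the institution's choice."""
        listed = CONFERENCE_LIST_FORS.get(conference)
        if listed and any(f in listed for f in output_fors):
            return {"type": "listed", "fors": listed}
        return {"type": "non-listed", "fors": output_fors[:3]}

    print(submit_conference_output("Hypothetical Conference on Widgets", ["2002"]))
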
Citation Analysis
  • Relative Citation Impact (RCI) against world and Australian benchmarks
  • RCI Classes
  • Centile Profile
  • Only applies to journal articles
  • Low volume threshold is 50 apportioned indexed journal articles.
  • The citation supplier for 2010 is Scopus.
  • The citation census date is 1 March 2010.
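
Relative Citation Impact is, broadly, citations per article divided by the benchmark citations per article for the same field and year. A minimal worked sketch follows; the counts and benchmark are invented, whereas ERA 2010 derives its benchmarks from Scopus data by FoR code and publication year:

    # Illustrative sketch of a Relative Citation Impact (RCI) calculation.
    # The citation count and benchmark below are invented numbers.
    def rci(citations, benchmark):
        """Citations divided by the world (or Australian) benchmark for
        the same field and year."""
        return citations / benchmark

    # An article with 12 citations in a field/year whose world benchmark
    # is 8 citations per article has an RCI of 1.5.
    print(rci(12, 8))  # 1.5
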
Peer Review
  • Peer review
  • Applies to a range of outputs including journal articles, books, book chapters, creative outputs, etc.
  • Low volume threshold is 30 apportioned outputs (any type).
  • Institutions nominate 20% of total output for a FoR for peer review.
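
A small arithmetic sketch of the peer review volume rules above; the unit size and the rounding convention are invented for illustration:

    # Illustrative sketch of the peer review volume rules above.
    LOW_VOLUME_THRESHOLD = 30   # apportioned outputs (any type)
    NOMINATION_SHARE = 0.20     # institutions nominate 20% of total output

    def peer_review_nomination(apportioned_outputs):
        """Number of outputs to nominate, or None if the unit is below
        the low volume threshold. Rounding here is illustrative only."""
        if apportioned_outputs < LOW_VOLUME_THRESHOLD:
            return None
        return round(apportioned_outputs * NOMINATION_SHARE)

    # Example: a FoR with 85 apportioned outputs nominates about 17.
    print(peer_review_nomination(85))
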
Esteem Measures
  • Editor of prestigious works of reference
  • Recipient of a Category 1 Fellowship, or an Australia Council Grant or Fellowship
  • Membership of a statutory committee or a Learned Academy
  • Institutions can select up to 3 four‐digit FoRs (apportioned) for each individual esteem measure.
  • Esteem must be linked to a listed staff member of the institution.
  • Individual researchers cannot be identified through the esteem measures.
Research Income
  • Category 1‐4
  • Institutions can select as many four‐digit FoRs (apportioned) as they wish for each income item submitted.
  • Number of grants is collected for Category 1 income only.
  • FTE is used as a denominator for all Categories.
  • Category 3 income is disaggregated into its three subcategories (Australian, International A, and International B).
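
Because FTE is the denominator for every category, the income indicators are effectively income-per-FTE figures. An invented worked example:

    # Illustrative sketch: research income per FTE. All figures invented.
    income_by_category = {           # dollars
        "Category 1": 1_200_000,
        "Category 2": 450_000,
        "Category 3": 300_000,       # reported in its three subcategories
        "Category 4": 150_000,
    }
    fte = 24.5  # eligible staff at the census date, apportioned to the FoR

    for category, income in income_by_category.items():
        print(category, round(income / fte))
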
Applied measures
  • Patents
  • Commercialisation Income
  • Plant Breeder’s Rights
  • NHMRC Endorsed Guidelines
  • Registered Designs
  • Institutions can select up to 3 four‐digit FoRs (apportioned) for each applied measure submitted, except for commercialisation income, where there is no limit on the number of four‐digit FoR codes submitted.
  • Applied measures are linked to the institution, not the individual.

Questions


