14 results for Anderson, David L.

  • Ranking economics departments in terms of residual productivity: New Zealand economics departments, 2000‐2006

    Anderson, David L.; Tressler, John (2009-03)

    Working or discussion paper
    University of Waikato

    This paper considers a new approach for ranking the research productivity of academic departments. Our approach provides rankings in terms of residual research output after controlling for the key characteristics of each department’s academic staff. More specifically, we estimate residual research output rankings for all of New Zealand’s economics departments based on their publication performance over the 2000 to 2006 period. We do so after taking into account the following characteristics of each department’s academic staff: gender, experience, seniority, academic credentials, and academic rank. The paper concludes with a comparison of rankings generated by the residual research approach with those generated by traditional approaches to research rankings.

  • The relevance of the ‘h’ and ‘g’ index to economics in the context of a nation-wide research evaluation scheme: The New Zealand case

    Anderson, David L.; Tressler, John (2012-04)

    Working or discussion paper
    University of Waikato

    The purpose of this paper is to explore the relevance of the citation-based ‘h’ and ‘g’ indexes as a means of measuring research output in economics. This study is unique in that it is the first to utilize the ‘h’ and ‘g’ indexes in the context of a time-limited evaluation period and to provide comprehensive coverage of all academic economists in all university-based economics departments within a nation state. For illustration purposes we have selected New Zealand’s Performance-Based Research Fund (PBRF) as our evaluation scheme. In order to provide a frame of reference for ‘h’ and ‘g’ index output measures, we have also estimated research output using a number of journal-based weighting schemes. In general, our findings suggest that ‘h’ and ‘g’ index scores are strongly associated with low-powered journal ranking schemes and weakly associated with high-powered journal weighting schemes. More specifically, we found the ‘h’ and ‘g’ indexes to suffer from a lack of differentiation: for example, 52 percent of all participants received a score of zero under both measures, and 92 and 89 percent received scores of two or less under ‘h’ and ‘g’, respectively. Overall, our findings suggest that ‘h’ and ‘g’ indexes should not be incorporated into a PBRF-like framework.

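For reference, the ‘h’ and ‘g’ indexes discussed in the abstract above have standard definitions computable directly from a researcher’s citation counts. A minimal sketch under those standard definitions; the sample citation data below are illustrative, not taken from the paper:

```python
def h_index(citations):
    """h = largest h such that at least h papers have >= h citations each."""
    cited = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cited, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def g_index(citations):
    """g = largest g such that the g most-cited papers have >= g^2 citations in total."""
    cited = sorted(citations, reverse=True)
    total, g = 0, 0
    for i, c in enumerate(cited, start=1):
        total += c
        if total >= i * i:
            g = i
    return g

papers = [10, 8, 5, 4, 3, 0, 0]   # hypothetical citation counts for one researcher
print(h_index(papers))  # 4 (four papers with at least 4 citations each)
print(g_index(papers))  # 5 (top five papers total 30 >= 25 citations)
```

Because g counts cumulative citations of the top papers, g is always at least h, which is one reason the paper finds the two indexes to be closely associated at the low scores typical of a short evaluation window.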
  • Evaluating research - Peer review team assessment and journal-based bibliographic measures: New Zealand PBRF research output scores in 2006

    Anderson, David L.; Smart, Warren; Tressler, John (2012-04)

    Working or discussion paper
    University of Waikato

    This paper concerns the relationship between the assessment of the research of individual academics by peer or expert review teams and a variety of bibliometric schemes based on journal quality weights. Specifically, for a common group of economists from New Zealand departments of economics, Performance-Based Research Fund (PBRF) Research Output measures for those submitting new research portfolios in 2006 are compared with evaluations of journal-based research over the 2000–2005 assessment period. This comparison identifies the journal weighting schemes that appear most similar to PBRF peer evaluations. The paper provides an indication of the ‘power or aggressiveness’ of PBRF evaluations in terms of the weighting given to quality. The implied views of PBRF peer review teams are also useful in assessing common assumptions made in evaluating journal-based research.

  • The New Zealand performance based research fund and its impact on publication activity in economics

    Tressler, John; Anderson, David L. (2013-02)

    Working or discussion paper
    University of Waikato

    New Zealand’s academic research assessment scheme, the Performance-Based Research Fund (PBRF), was launched in 2002 with the stated objective of increasing research quality in the nation’s universities. Evaluation rounds were conducted in 2003, 2006 and 2012. In this paper, we employ 22 different journal weighting schemes to generate output estimates of refereed journal paper and page production over three six-year periods (1994–1999, 2000–2005 and 2006–2011). These time periods reflect a pre-PBRF environment, a mixed assessment period and a pure PBRF research environment, respectively. Our findings indicate that, on average, research productivity, defined in either paper or page terms, has increased since the introduction of the PBRF. However, this outcome is due to a major increase in the quantity of papers and pages produced per capita that has more than offset a decline in the quality of published outputs. In other words, our findings suggest that the PBRF has failed to achieve its stated goal of increasing average research quality, but it has resulted in substantial productivity gains achieved via large increases in the quantity of refereed journal articles.

  • Research output in New Zealand Economics Departments 2000-2006

    Anderson, David L.; Tressler, John (2008-04)

    Working or discussion paper
    University of Waikato

    This paper considers the research productivity of New Zealand-based economics departments over the period 2000 to 2006. It examines journal-based research output across departments and individuals using six output measures. We show that Otago and Canterbury performed consistently well over the period, with Otago generally the highest ranked department. The measures used place different emphasis on ‘quality’ versus ‘quantity’, and which measure is used has a significant influence on the rankings of Auckland, Victoria and Waikato. The controversy surrounding the inclusion of ‘visitors’ and the influence of research stars are considered. Rankings of the leading individual researchers are provided.

  • Do research assessment exercises raise the returns to publication quality? Evidence from the New Zealand market for academic economists

    Gibson, John; Tressler, John; Anderson, David L. (2008-08)

    Working or discussion paper
    University of Waikato

    Many countries have introduced research assessment exercises to help measure and raise the quality of research in their university sector. But there is little empirical evidence on how these exercises, such as the Quality Evaluation of the Performance Based Research Fund (PBRF) in New Zealand and the recently aborted Research Quality Framework (RQF) in Australia, affect the signals that researchers observe in the academic labour market. Since these assessments aim to raise research quality, individual academics should perceive rising returns to publication quality at the expense of the returns to quantity. Data we collected on the rank and publication records of New Zealand academic economists prior to the introduction of the PBRF and just after the second assessment round are used to estimate the changing returns to the quantity and quality of journal articles.

  • The merits of using citations to measure research output in economics departments: The New Zealand case

    Anderson, David L.; Tressler, John (2011-05)

    Working or discussion paper
    University of Waikato

    In this paper we explore the merits of utilizing citation counts to measure research output in economics in the context of a nation-wide research evaluation scheme. We selected one such system for study: the New Zealand government’s Performance-Based Research Fund (PBRF). Citations were collected for all refereed papers produced by New Zealand’s academic economists over the period 2000 to 2008 using the databases of the ISI/Web of Science and, to a limited extent, Google Scholar. These data allowed us to estimate the time lags in economics between publication of an article and the flow of citations; to demonstrate the impact of alternative definitions of ‘economics-relevant’ journals on citation counts; and to assess the impact of direct citation measures and alternative schemes on departmental and individual performance. Our findings suggest that the time lags between publication and citing are such that it would be difficult to rely on citation counts to produce a meaningful measure of output in a PBRF-like research evaluation framework, especially one based explicitly on individual assessment.

  • The excellence in research for Australia Scheme: An evaluation of the draft journal weights for economics

    Anderson, David L.; Tressler, John (2009-07)

    Working or discussion paper
    University of Waikato

    In February 2008, the Australian government announced its intention to develop a new quality and evaluation system for research conducted at the nation’s universities. Although the Excellence in Research for Australia (ERA) scheme will utilize several measures to evaluate institutional performance, we have chosen to focus on one element only: the assessment of refereed journal article output based on ERA’s own journal weighting scheme. The ERA weighting scheme will undoubtedly shape the reward structure facing university administrators and individual academics. Our objective is to explore the nature of the ERA weighting scheme for economics, and to demonstrate how it impacts on departmental and individual researcher rankings relative to rankings generated by alternative schemes employed in the economics literature. In order to do so, we utilize data from New Zealand’s economics departments and the draft set of journal weights (DERA) released in August 2008 by ERA officials. Given the similarities between Australia and New Zealand, our findings should have relevance to the Australian scene. As a result, we hope to provide the reader with a better understanding of the type of research activity that influences DERA rankings at both the departmental and individual level.

  • The ‘Excellence in Research for Australia’ scheme: A test drive of draft journal weights with New Zealand data

    Anderson, David L.; Tressler, John (2009)

    Journal article
    University of Waikato

    The paper assesses the draft weighting system used by the Excellence in Research for Australia (ERA) scheme for measuring the output of refereed economics journal articles. It does so by using data from New Zealand’s economics departments to demonstrate how the rankings of departmental and individual researchers are affected by the use of ERA weights rather than alternative weights employed in the economics literature. It concludes that the draft version of the ERA scheme, as released in August 2008, rewards research quantity over research quality, as traditionally defined.

  • The Relevance of the “h-” and “g-” Index to Economics in the Context of A Nation-Wide Research Evaluation Scheme: The New Zealand Case

    Anderson, David L.; Tressler, John (2013)

    Journal article
    University of Waikato

    The purpose of this paper is to explore the relevance of the citation-based “h-” and “g-” indexes as a means of measuring research output in economics. This study is unique in that it is the first to utilise the “h-” and “g-” indexes in the context of a time-limited evaluation period and to provide comprehensive coverage of all academic economists in all university-based economics departments within a nation state. For illustration purposes, we have selected New Zealand’s Performance-Based Research Fund (PBRF) as our evaluation scheme. To provide a frame of reference for “h-” and “g-” index output measures, we have also estimated research output using a number of journal-based weighting schemes. In general, our findings suggest that “h-” and “g-” index scores are strongly associated with low-powered journal ranking schemes and weakly associated with high-powered journal weighting schemes. More specifically, we found the “h-” and “g-” indexes to suffer from a lack of differentiation: for example, 52 per cent of all participants received a score of zero under both measures, and 92 and 89 per cent received scores of two or less under “h-” and “g-”, respectively. Overall, our findings suggest that “h-” and “g-” indexes should not be incorporated into a PBRF-like framework.

  • The merits of using citation‐based journal weighting schemes to measure research performance in economics: The case of New Zealand

    Anderson, David L.; Tressler, John (2010-05)

    Journal article
    University of Waikato

    In this study we test various citation‐based journal weighting schemes, especially those based on the Liebowitz and Palmer methodology, as to their suitability for use in a nationwide research funding model. Using data generated by New Zealand’s academic economists, we compare the performance of departments, and individuals, under each of our selected schemes; and we then proceed to contrast these results with those generated by direct citation counts. Our findings suggest that if all citations are deemed to be of equal value, then schemes based on the Liebowitz and Palmer methodology yield problematic outcomes. We also demonstrate that even between weighting schemes based on a common methodology, major differences are found to exist in departmental and individual outcomes.

  • Ranking economics departments in terms of residual productivity: New Zealand economics departments, 2000–2006

    Anderson, David L.; Tressler, John (2011)

    Journal article
    University of Waikato

    This paper utilises a human-capital approach for ranking the research productivity of academic departments. Our approach provides rankings in terms of residual research output after controlling for the key characteristics of each department's academic staff. More specifically, we estimate residual research output rankings for all of New Zealand's economics departments based on their publication performance over the 2000 to 2006 period. We do so after taking into account the following characteristics of each department's academic staff: gender, experience, seniority, academic credentials and academic rank. The paper demonstrates that the rankings generated by the residual research approach and those generated by traditional approaches to research rankings may be significantly different for some departments. These differences are important in determining the likely efficiency impact of research assessment exercises.

  • Evaluating research – peer review team assessment and journal based bibliographic measures: New Zealand PBRF research output scores in 2006

    Anderson, David L.; Smart, Warren; Tressler, John (2013)

    Journal article
    University of Waikato

    This paper concerns the relationship between the assessment of the research of individual academics by peer or expert review teams and a variety of bibliometric schemes based on journal quality weights. Specifically, for a common group of economists from New Zealand departments of economics, Performance-Based Research Fund (PBRF) Research Output measures for those submitting new research portfolios in 2006 are compared with evaluations of journal-based research over the 2000–2005 assessment period. This comparison identifies the journal weighting schemes that appear most similar to PBRF peer evaluations. The paper provides an indication of the ‘power or aggressiveness’ of PBRF evaluations in terms of the weighting given to quality. The implied views of PBRF peer review teams are also useful in assessing common assumptions made in evaluating journal-based research.

  • Which journal rankings best explain academic salaries? Evidence from the University of California

    Gibson, John; Anderson, David L.; Tressler, John (2012-08)

    Working or discussion paper
    University of Waikato

    The ranking of an academic journal is important to authors, universities, journal publishers and research funders. Rankings are gaining prominence as countries adopt regular research assessment exercises that especially reward publication in high-impact journals. Yet even within a rankings-oriented discipline like economics there is no agreement on how aggressively lower-ranked journals should be down-weighted or on how wide a universe of journals should be considered. Moreover, since it is typically less costly for authors to cite superfluous references, whether of their own volition or prompted by editors, than it is to ignore relevant ones, rankings based on citations may be easily manipulated. In contrast, when the merits of publication in one journal or another are debated during hiring, promotion and salary decisions, the evaluators are choosing over actions with costly consequences. We therefore look to the academic labor market, using data on economists in the University of California system to relate their lifetime publications in 700 different academic journals to salary. We test amongst various sets of journal rankings, and publication discount rates, to see which are most congruent with the returns implied by the academic labor market.