Braun Tibor, Schubert András (eds.): Peer Review in Scientific Research: Selected Studies from the Literature of the Field (MTAK Informatics and Science Analysis Series 7, 1993)
RUSTUM ROY: Alternatives to Review by Peers: A Contribution to the Theory of Scientific Choice
…from the more important issue of what different systems are available for the distribution of research funds.

Defects of the Present System

The many government agencies which use the system of review of proposals by peers, and which have sponsored many attempts to validate that system,3 have not supported a single study comparing such review with other systems of allocating funds among scientists of similar qualifications: there has been no comparison with the "strong-manager" method used by the United States Department of Defense, or with the formula system. No effort has been made to examine the "efficiency" of the system in terms of the costs and time required for each grant, or its efficacy in supporting genuine innovation.4

Even without systematically analysed comparative data, the failures of review by peers as a way of deciding which projects and which scientists should receive grants seem evident. Yet virtually no senior official has commented on the glaring deficiencies of these procedures.

Let us examine a recent example. In 1983 the Department of Defense started a new programme making $30,000,000 available annually to provide large items of research equipment to universities. The Department of Defense deviated from the procedures used by many of its own subdivisions, which could presumably have selected the articles of equipment most needed by the universities already working with the Department. Instead, in an effort to gain public favour, it issued an invitation for proposals to all universities, whether or not they contained research groups with significant support from the Department of Defense. This resulted in a fiasco. Over 2,200 proposals were received, requesting a total of $625 million; the success ratio was less than 1:60. The time required for the preparation and submission of each proposal may be estimated at one month's work of one person. Thus 2,200 scientists each spent one month, and nearly 200 years of scientific work were diverted from research. Fortunately, scientific peers were not used to evaluate the process; but any estimate of the total expenditure of time must come to perhaps one more year of one scientist's work for each grant made. If we include an average figure of 100 per cent for overhead costs, the cost was therefore equal to 400 average salaries at $40,000 per scientist per year: the allocation of $30,000,000 cost about $16,000,000 (a worked restatement of this arithmetic follows the notes below). This example does not include the administrative costs of a typical review by post, or of panels…

3 Cole, Stephen and Cole, Jonathan, "The Ortega Hypothesis: Citation Analysis Suggests that Only a Few Scientists Contribute to Scientific Progress", Science, CLXXVIII (27 October, 1972), pp. 368-374; Cole, S., Cole, J. R. and Simon, G. A., "Chance and Consensus in Peer Review", Science, CCXIV (20 November, 1981), pp. 881-886.

4 Roy, Rustum, "Peer Review of Proposals: Rationale, Practice and Performance", Bulletin of Science, Technology and Society, II, 5 (1982), pp. 405-418; Roy, Rustum, Testimony at Hearings on "NSF Peer Review" before the US House of Representatives Subcommittee on Science and Technology, No. 32 (29 July, 1975), pp. 684-693.
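As a check on the figures above, Roy's estimate can be restated as a short worked calculation. The conversion of person-months to person-years and the rounding of 183 up to "nearly 200" are assumptions made here to reproduce his totals; the additional year of evaluation time per grant is left out, as it is in his round figure of 400 salaries:

\[
2{,}200 \text{ proposals} \times 1 \text{ person-month} = 2{,}200 \text{ person-months} \approx 183 \approx 200 \text{ person-years}
\]
\[
200 \text{ person-years} \times 2 \,(\text{100 per cent overhead}) \times \$40{,}000 \text{ per year} = \$16{,}000{,}000
\]

On these assumptions, distributing $30,000,000 in equipment grants consumed diverted scientific labour worth roughly half the sum distributed.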