Abstract
The paper presents an experimental method for the evaluation of scientific papers in the field of oncology and related disciplines, developed at the National Institute for Cancer Research (IST), Genoa, Italy. The method is based on partitioning the categories of the Science Citation Index-Journal Citation Reports (SCI-JCR) into deciles, thus normalizing the Impact Factor (IF) in order to gauge the quality of scientific productivity. A second parameter, related to the number of staff members of each department co-authoring a given paper, has been introduced for the allocation of Institute funding. Studies have been carried out to compare the assigned score with the average number of citations of papers published by a research group, and the identification of corrective factors is in progress. The method provides a basis for judging the quality of publications from within a research organization, and should be reproducible independently of the disciplines considered.
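The core idea of the method, ranking a journal by its position within its SCI-JCR category rather than by raw Impact Factor, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the function name, the 10-to-1 scoring scale, and the example IF values are assumptions introduced here for clarity.

```python
# Hypothetical sketch of decile-based Impact Factor normalization:
# journals in a JCR subject category are ranked by IF, the ranking is
# partitioned into deciles, and a paper is scored by the decile rank
# of its journal instead of by the raw (category-dependent) IF.

def decile_score(category_ifs, journal_if):
    """Return a 1-10 score: 10 for a journal in the top decile of its category."""
    ranked = sorted(category_ifs, reverse=True)   # highest IF first
    pos = ranked.index(journal_if)                # 0-based rank in the category
    decile = pos * 10 // len(ranked)              # 0 = top decile, 9 = bottom
    return 10 - decile                            # map to 10 (best) .. 1 (worst)

# Example: a category with ten journals (invented IF values)
category = [9.1, 7.4, 5.0, 4.2, 3.3, 2.8, 2.1, 1.5, 0.9, 0.4]
print(decile_score(category, 9.1))  # top decile -> 10
print(decile_score(category, 0.4))  # bottom decile -> 1
```

Because the score depends only on rank within the category, a middling IF in a low-IF field and a middling IF in a high-IF field receive the same score, which is the normalization the method relies on when comparing departments across disciplines.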
Cite this article
Ugolini, D., Parodi, S. & Santi, L. Analysis of publication quality in a Cancer Research Institute. Scientometrics 38, 265–274 (1997). https://doi.org/10.1007/BF02457413