VIEWPOINT
Year : 2021 | Volume : 67 | Issue : 2 | Page : 91-92
Ranking list for scientists: From heightening the rat-race to fraying the scientific temper
A Rammohan, M Rela
Institute of Liver Disease and Transplantation, Dr. Rela Institute and Medical Centre, Chennai, Tamil Nadu, India
Date of Submission: 03-Feb-2021
Date of Decision: 24-Feb-2021
Date of Acceptance: 15-Mar-2021
Date of Web Publication: 30-Apr-2021
Correspondence Address: A Rammohan, Institute of Liver Disease and Transplantation, Dr. Rela Institute and Medical Centre, Chennai, Tamil Nadu, India
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/jpgm.JPGM_112_21
Citations and validation of work play a crucial and integral role in a researcher's career. Ranking systems of scientists, on the other hand, potentially scratch at and expose the fallible, egoistic human face of science, leading to an unhealthy milieu of competition rather than an uplifting one of motivation. We have attempted to highlight and bring to the fore these factors in this brief viewpoint. We critically analyze the reasons why ranking systems of scientists, especially in the field of medicine, will shift the focus from the advancement of science to the advancement of “self.”
Keywords: Ethics, medical professionals, ranking lists
How to cite this article: Rammohan A, Rela M. Ranking list for scientists: From heightening the rat-race to fraying the scientific temper. J Postgrad Med 2021;67:91-2.
“Scientific temper” describes an attitude that involves the application of logic. Discussion, argument, and analysis are vital parts of scientific temper, and the elements of fairness, equality, and democracy are built into it.[1],[2] This is an ethos every scientist imbibes and applies on a daily basis. There is, however, a human and fallible side to every professional: one of competition, envy, and comparison. The ego needs of an individual and the quest for validation break through the objectivity of science. A recent publication, which ranks the top 100,000 scientists with at least five publications and those within the top 2% in their field (a total of 159,000 scientists), based on a database of 6.8 million authors (Scopus) across 176 scientific sub-fields, appears to feed into this frenzy.[3]
The authors of this ranking system from Stanford University are to be commended for their tremendous statistical exercise in defining and ranking, without bias, the impact and scientific excellence of the human brain. They based the ranking on a composite score which took into account six citation metrics (total citations; Hirsch h-index; coauthorship-adjusted Schreiber hm-index; number of citations to papers as single author; number of citations to papers as single or first author; and number of citations to papers as single, first, or last author).[3] The merits of this new ranking system include the objective assessment of the quality of individual scientists' contributions. Moreover, its positive impact of validating those on the list for their scientific contributions, and of motivating the younger generation to achieve excellence in their respective fields, remains undeniable. This is especially important in developing countries, particularly in the Asia-Pacific region, where a complex interrelation between logistical support and the academic community prevents the implementation of internationally recognized criteria for recognizing and appreciating scientific contributions. For example, it was noteworthy to see a large number of Indian faculty in the world's top-2% list of scientists; nearly 1,500 of them were placed highly in their respective areas of expertise. While there has been tremendous progress in the medical sciences, and consequently a large number of publications from the Indian subcontinent in recent years, it was indeed heartening to see nonmedical specialties such as physics, materials science, chemical engineering, plant biology, and energy being well represented on the list. Another interesting statistic was that a majority of the scientists in the nonmedical fields who found their names on the list were from public-sector institutes.
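As a rough, purely illustrative sketch (ours, not the authors' implementation), the Python snippet below shows how two of the six metrics named above, the Hirsch h-index and the coauthorship-adjusted Schreiber hm-index, can be computed from per-paper citation counts; the function names, example citation numbers, and author counts are hypothetical assumptions, not data from the cited database.

```python
# Illustrative sketch only; not the Scopus/Ioannidis et al. pipeline.

def h_index(citations):
    """Hirsch h-index: the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def hm_index(citations, n_authors):
    """Schreiber hm-index: like the h-index, but each paper contributes a fractional
    'effective rank' of 1/(number of authors), which adjusts for coauthorship."""
    papers = sorted(zip(citations, n_authors), key=lambda p: p[0], reverse=True)
    hm = 0.0
    effective_rank = 0.0
    for c, a in papers:
        effective_rank += 1.0 / a
        if c >= effective_rank:
            hm = effective_rank
        else:
            break
    return hm

# Hypothetical example: five papers with their citation counts and author counts
print(h_index([25, 12, 8, 3, 1]))                              # -> 3
print(round(hm_index([25, 12, 8, 3, 1], [3, 2, 4, 1, 2]), 2))  # -> 2.08
```

The actual composite indicator combines six such metrics into a single score; the sketch is only meant to convey the flavor of the underlying citation counting.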
Although the authors' intentions are indeed honorable (e.g., excluding self-citations), ranking systems, however objective and validated they might be, do raise many gray areas. Citations and validation of work play a crucial and integral role in a researcher's career. Citation analyses are used for various single-person or comparative assessments in the complex reward and incentive system of science, including the award of grants and hiring, promotion, or tenure decisions. It must also be acknowledged that the existing single citation metrics are not infallible, and their calculation is not standardized across fields.[4] Nevertheless, new ranking systems will worsen the already existing competition in the world of science and are only likely to add fuel to an already blazing fire.[4] The vicious rivalry and inevitable resentment on display are not unknown in science; is there a need to reinforce this trend? The modern school education system is doing away with formal examinations and rankings to avoid the very same academic and peer pressures that seem to be plaguing the adult scientific world.
Moreover, ranking systems are also likely to create unhealthy practices, and abuse of the system is inevitable. The least of these problems is “self-citation.” Stories abound of manipulations such as that of a highly cited biophysicist who was removed from the editorial board of a journal after manipulating the peer-review process to amass citations to his own work.[5] As editor, he would ask authors to cite a long list of his publications and to change their papers to mention an algorithm he had developed. Self-citations are not uncommon: as Ioannidis et al. highlight, while the median self-citation rate is 12.7%, more than 250 scientists have amassed over 50% of their citations from themselves or their co-authors.[4] Taking this to a whole new level are the “citation farms,” wherein clusters of scientists massively cite one another.[4]
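To make the self-citation figures above concrete, here is a minimal, hypothetical sketch of how a crude self-citation rate could be tallied: the share of a researcher's incoming citations that come from papers on which that researcher also appears as an author. The data and names are invented for illustration and do not reproduce the co-author-adjusted, Scopus-based methodology of Ioannidis et al.

```python
# Hypothetical sketch of a crude self-citation rate; not the actual Scopus-based method.

def self_citation_rate(cited_author, citing_author_lists):
    """Fraction of citing papers that include the cited author among their authors."""
    if not citing_author_lists:
        return 0.0
    self_cites = sum(1 for authors in citing_author_lists if cited_author in authors)
    return self_cites / len(citing_author_lists)

# Example: 8 citing papers, 3 of which include the cited author "A. Researcher"
citing_papers = [
    {"A. Researcher", "B. Colleague"},
    {"C. Someone"},
    {"A. Researcher"},
    {"D. Other", "A. Researcher"},
    {"E. Person"},
    {"F. Person"},
    {"G. Person"},
    {"H. Person"},
]
print(self_citation_rate("A. Researcher", citing_papers))  # -> 0.375
```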
When scientists get ranked by systems like the one from Stanford University, it confers a status; those excluded from the list might feel unfairly left out. It is worth mentioning that these databases are neither infallible nor exhaustive. The 98.1% average precision and 94.4% average recall achieved by the current system imply that roughly 8,900 scientists (the approximately 5.6% missed at 94.4% recall) will find their names excluded from “the exalted list,” a position in which the senior author (MR) finds himself.[3] It is also intuitive for professionals to quote their standing to promote themselves. Validating as this may be, it might not be the most ethical way to approach certain scientific streams like medicine, where objective healthcare outcomes may not be entirely reflected in a medical professional's scientific ranking. The ranking system in question comes from a prestigious institute and uses one of the most exhaustive scientific databases available (Scopus). The trend is, however, likely to spawn more such ranking systems, which may not be as scientifically objective or well vetted. Dubious ranking systems created for pecuniary benefit may further expose the dark underbelly of this egoistic rat-race.
A true pursuit of science should not be overly concerned with where one falls in the pecking order. Instead, it should focus on one's own accomplishments and their validation on merit: a utopian goal? Ranking systems are likely to compound the arsenic of competition rather than the strychnine of motivation, shifting the focus from the advancement of science to the advancement of “self.”
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
:: References
1.
2. Mahanti S. A perspective of scientific temper in India. J Sci Temper 2013;1:46-62.
3. Ioannidis JPA, Boyack KW, Baas J. Updated science-wide author databases of standardized citation indicators. PLoS Biol 2020;18:e3000918.
4. Van Noorden R, Singh Chawla D. Hundreds of extreme self-citing scientists revealed in new database. Nature 2019;572:578-9.
5. Van Noorden R. Highly cited researcher banned from journal board for citation abuse. Nature 2020;578:200-1.