The Fallacy of Rankings

Context: Applying a ranking methodology designed for an open and dynamic system to a closed one only ends up conflating realities.

Objective of rankings:

  • Rankings guide and assist prospective students in choosing among the best institutions and the courses they offer; the ranking system is an innovative attempt at quantifying excellence.

Significance of ranking:

  • With 30,000 institutions of different types and standards, the Indian Higher Education System (IHES) is the world’s third-largest system.
  • The need to map institutional unevenness and the inherent qualitative disparities in standards makes rankings significant.
  • National-level rankings instill a competitive spirit amongst institutions to perform better and secure higher positions in international rankings.

Ranking methodology:

  • According to the declared methodology (2019) in the college category, the NIRF allocates 40 per cent weightage to teaching-learning outcomes, derived from a mathematical calculation of student strength, student-teacher ratio, permanent/temporary appointments, number of PhD holders in faculty positions, and financial resource utilisation.
  • It accords 10 per cent weightage to outreach and diversity quotients, 15 and 25 per cent to research output and graduation outcomes respectively, and 10 per cent to perception amongst employers and academic peers.
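The weightages above amount to a simple weighted sum. A minimal sketch of how such a score could be combined is below; the component scores used are hypothetical placeholders, not actual NIRF data:

```python
# Sketch of a weighted-sum score using the declared 2019 college-category
# weightages. All component scores below are hypothetical, for illustration.

# Weightages (per cent) from the declared methodology
WEIGHTS = {
    "teaching_learning": 40,
    "research_output": 15,
    "graduation_outcomes": 25,
    "outreach_diversity": 10,
    "perception": 10,
}

def overall_score(component_scores):
    """Combine per-parameter scores (0-100 each) into an overall score."""
    return sum(WEIGHTS[k] * component_scores[k] for k in WEIGHTS) / 100

# A hypothetical college: strong academics, weak perception score
scores = {
    "teaching_learning": 80,
    "research_output": 50,
    "graduation_outcomes": 70,
    "outreach_diversity": 60,
    "perception": 40,
}
print(overall_score(scores))  # 67.0
```

Because teaching-learning carries 40 per cent of the weight, movement on that single parameter dominates the overall score in this scheme.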

Issues with ranking:

  • NIRF rankings fail to accomplish their mandate of differentiating colleges.
    • The so-called top colleges top the charts largely on account of archaic perceptions, which attract the best intakes and in turn make them score high on academic parameters.
    • Being government-funded, they are bound to admit all those who make it past their declared cut-off marks.
  • Diversity quotient: None of these colleges has any say in designing policy parameters that would encourage or discourage student recruitment from varied backgrounds.
  • Resources are rigidly fixed and even the slightest deviations attract penal action. 
  • The appointments and promotions of faculty members are controlled by their respective affiliating universities, and entirely independent of any merit-based distributive mechanism. 
  • Existing regulatory mechanisms of the IHES do not allow institutions any flexibility to develop innovative pedagogy or their own outreach and diversity formulae.

Applying a ranking methodology designed for an open and dynamic system to a closed one only ends up conflating and confounding realities.

About NIRF:

  • The National Institutional Ranking Framework (NIRF) was approved by the Ministry of Human Resource Development (MHRD) and launched on 29th September 2015.
  • This framework outlines a methodology to rank institutions across the country. 
  • The methodology draws from the overall recommendations and broad understanding arrived at by a Core Committee set up by the MHRD to identify the broad parameters for ranking various universities and institutions.
  • The parameters broadly cover: 
    • “Teaching, Learning and Resources” 
    • “Research and Professional Practices” 
    • “Graduation Outcomes” 
    • “Outreach and Inclusivity” and 
    • “Perception”
  • The ranking exercise has also created a habit of organising data among institutions, and most of them now attempt to become more competitive.
  • Aside from the overall rankings, NIRF also lists the best institutions across nine other categories — college, university, medical, engineering, management, pharmacy, law, architecture, and dental.


Source: https://indianexpress.com/article/opinion/columns/the-fallacy-of-rankings-6496700/