Why is no Indian institution in the top 200 of the QS global rankings?
Gautam Barua, IIT Guwahati
QS has recently
released its 2012 global university rankings. Since QS markets its product in
India, there has been quite a bit of press coverage of the rankings. The fact that no
Indian university has come within the first 200 ranks has made “news”. Many reasons have
been given by various people, ranging from a lack of research and inadequate
funding to a focus on undergraduate education alone and indifferent
academic administrators. Rankings are done on the basis of a number of
parameters which the ranking organisation decides upon. Let us look at the
ranks of the IITs in QS 2012 and examine the basis of their rankings. The
details of the ranks of the IITs are as follows (the scores of MIT are given
for comparison’s sake):
AR = Academic Reputation; ER = Employer Reputation; FS = Faculty:Student Ratio;
CF = Citations per Faculty; IF = International Faculty; IS = International Students.

Rank    | Institution | AR    | ER    | FS   | CF   | IF   | IS   | Total Score
Weight  |             | 40%   | 10%   | 20%  | 20%  | 5%   | 5%   | 100%
(max)   |             | 100   | 100   | 100  | 100  | 100  | 100  | 100.0
1       | MIT         | 100.0 | 100.0 | 99.9 | 99.3 | 86.4 | 96.5 | 100.0
212     | IITD        | 53.4  | 79.4  | 35.4 | 54.1 | 1.3  | 1.6  | 47.8
227     | IITB        | 59.5  | 82.7  | 28.9 | 38.2 | 3.2  | 1.2  | 46.2
278     | IITK        | 44.1  | 50.6  | 28.2 | 57.2 | 1.5  | 1.1  | 40.3
312     | IITM        | 40.0  | 73.0  | 30.9 | 40.3 | 2.1  | 1.2  | 38.1
349     | IITKGP      | 34.6  | 42.3  | 32.7 | 47.1 | 0.0  | 1.1  | 34.4
401-450 | IITR        | 37.7  | 40.7  | -    | -    | -    | -    | -
551-600 | IITG        | 16.7  | 24.7  | 35.5 | 26.6 | 3.0  | 1.2  | 22.0
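The published totals can in fact be reproduced from the parameter scores. The sketch below, under the assumption (mine, inferred from the numbers, not a documented QS formula) that the total is the weighted sum of the six parameter scores rescaled so that the top institution's weighted sum maps to 100, recovers the published totals:

```python
# Sketch: reproduce the QS total from the six parameter scores.
# Assumption (inferred, not documented by QS): the weighted sum is rescaled
# so that the top institution (MIT) maps to exactly 100.

WEIGHTS = {"AR": 0.40, "ER": 0.10, "FS": 0.20, "CF": 0.20, "IF": 0.05, "IS": 0.05}

SCORES = {
    "MIT":  {"AR": 100.0, "ER": 100.0, "FS": 99.9, "CF": 99.3, "IF": 86.4, "IS": 96.5},
    "IITD": {"AR": 53.4, "ER": 79.4, "FS": 35.4, "CF": 54.1, "IF": 1.3, "IS": 1.6},
    "IITG": {"AR": 16.7, "ER": 24.7, "FS": 35.5, "CF": 26.6, "IF": 3.0, "IS": 1.2},
}

def weighted_sum(name):
    return sum(WEIGHTS[p] * SCORES[name][p] for p in WEIGHTS)

def total(name):
    # Rescale so the top weighted sum (MIT's, about 99.0) becomes 100.
    return 100.0 * weighted_sum(name) / weighted_sum("MIT")

print(round(total("IITD"), 1))  # 47.8, matching the published total
print(round(total("IITG"), 1))  # 22.0
```

That the reconstruction matches the published totals for every IIT in the table suggests the formula above is essentially what QS uses, even though QS does not publish it in this form.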
All scores
are relative, with the number-1 ranked institution in each category getting a
score of 100. As can be seen, there are
six parameters, with a different weight assigned to each. The category
CF refers to the average number of times
a published paper is cited by other papers whose authors do not include the
original authors. The number of citations of all papers over five years (2007-2011)
was totalled and divided by the number of faculty in the institute in 2011. If
a paper is cited often, it is assumed to be of good quality. Unfortunately,
the “citation index” depends on the size of the research population of a
discipline, and so it is difficult to compare across disciplines. We note the following:
1.
All
IITs are at a disadvantage on the international faculty and international
student parameters. We are not allowed to take international students at the
B.Tech level other than through JEE. With so much demand within the country,
there is pressure not to have too many foreign students. There is scope for
increasing the number of foreign PhD students, but even here there is a
restriction, as Government assistantships can be given only to Indian citizens.
We are trying to get this restriction lifted; without aid, it is difficult to
attract good international PhD students. Hiring international faculty on a
regular basis is not allowed. They can be hired on contract for up to five
years, but only if the salary is at least $25,000 annually, which effectively
limits such hiring to full Professors. The moot question
remains: is the internationalization of campuses an important parameter for
excellence? The Western countries are clearly at an advantage here.
2.
50%
of the weightage is based on “reputation” (AR: 40% and ER: 10%). This helps QS
a lot: it now aggressively markets products through which institutions
can enhance their “reputation”. Thus we have been invited to advertise in the
“QS Top University Guide 2013” (with discounts if we opt to advertise in more
than one language) and in other publications, to attend seminars and
conferences (with registration fees, of course), and so on. Can we rely
primarily on reputation to decide ranks?
3.
AR and ER give a weightage of 85% to
international responses and 15% to domestic responses (from the country in
which the institute is located). Academics all over the
world are asked their opinion of the top institutions globally and in their
own country. The chances of an IIT being named by a US professor are
quite slim. Alumni in academics may help
the IITs, as they know about them, but the number of alumni in academics in the
US is a small fraction of the total number of alumni in the US; this is a
legacy of the past. Young institutions like IIT Guwahati, without a sufficient
alumni base, are at a particular disadvantage. The number of respondents from a
country is proportional to the number of institutions available for selection
in that country, so the responses are heavily weighted in favour of the
developed countries. Respondents are not asked to rate each of the listed
universities (it may be impractical to do so, as there are a large number of
them). Instead, each respondent is asked to list 5-10 universities he or she
thinks are well known globally and in their own country. This method
perpetuates the existing ranks.
4.
Let
us now look at the category FS. The IITs have not done well on the
faculty-to-student ratio, and it is
well known that there are many vacant faculty positions. But even here, the
methodology is not clear. MIT states on its website that it has a
faculty-to-student ratio of 1:8. It also states that the total number of
students is 10,894. The number of faculty is given as 1,018, whereas 1,362
would be required to meet the stated ratio. The number of “senior lecturers,
lecturers, and Professors Emeriti” is given as 540 (most of whom are on
contract for temporary periods); possibly 1,362 is reached after deducting the
Professors Emeriti from the combined figure. But how are scores calculated?
Consider this: MIT, with a 1:8 ratio, gets
a score of 99.9, while IIT Guwahati, with a ratio of 1:13.3 (as per QS), gets
only 35.5. I am not able to figure out how this score was arrived at. In any
case, is this a fair marking system? Shouldn’t an ideal ratio get full marks,
with other ratios given lesser marks in slabs?
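The slab-based alternative suggested above can be sketched as follows. The slab boundaries here are invented purely for illustration; QS does not publish its actual FS formula:

```python
# Hypothetical slab-based FS scoring: an ideal faculty-to-student ratio earns
# full marks, and worse ratios lose marks in discrete steps. The slab
# boundaries are invented for illustration, not taken from QS.

def fs_score(students_per_faculty):
    slabs = [(10, 100), (12, 90), (15, 75), (20, 55), (25, 35)]
    for limit, score in slabs:
        if students_per_faculty <= limit:
            return score
    return 20  # worse than 1:25

print(fs_score(8))     # MIT's stated 1:8 ratio -> 100
print(fs_score(13.3))  # IIT Guwahati's 1:13.3 (as per QS) -> 75
```

Under such a scheme, any institution at or better than the ideal ratio would score 100, and the gap between MIT and IIT Guwahati would be far smaller than the 99.9-versus-35.5 gap QS reports.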
5.
Finally,
we have the category CF. This gives the
average number of citations of the papers published in the last five years. The
assumption is that the number of faculty has remained more or less constant
(generally true for “old” institutions). But IIT Guwahati had 291 faculty in
2011 and only 191 in 2007, so its numbers clearly cannot be compared with those
of institutions like Cambridge and Oxford. Further, since a five-year average is
taken, one or two “star” papers can make a huge difference to the numbers. For
example, the review paper “The Hallmarks of Cancer”, authored by two professors
from UC San Francisco and MIT, has about 10,000 citations. This paper alone will
have boosted the CF figures of both these institutions significantly. Is this
the right way to judge whether an institution is doing well in research? Isn’t
the median number of citations per faculty member a better measure (although
even this has its pitfalls)?
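The skew from a single “star” paper is easy to see numerically. In this illustration (the citation counts are invented, not real data for any institution), one heavily cited paper drags the mean far above the median:

```python
# Illustration: one "star" paper inflates the mean citation count, while the
# median stays representative of a typical paper. Citation counts are invented.
from statistics import mean, median

citations = [3, 5, 2, 8, 4, 6, 1, 7, 10000]  # one review paper dominates

print(round(mean(citations), 1))  # 1115.1, dominated by the single star paper
print(median(citations))          # 5, a more typical paper
```

This is the pitfall of a mean-based CF score: it rewards one outlier paper as much as thousands of solidly cited ones, which is why a median-based measure, whatever its own limitations, gives a less distorted picture.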
So, what
can we conclude from all of the above? Surely it should be clear that ranking
universities is not a simple task. We have only scratched the surface, as has
QS. There are many other aspects of an educational institution that QS has
not even touched upon; the same can be said of the other global ranking systems
such as THE and ARWU. Many of these aspects are qualitative in
nature, and it is very difficult to quantify them. This is not to say that
Indian universities do not need to improve in many ways. They do, and we may
have to develop our own ranking system to get a proper comparison among
universities in India. But if society wants Indian institutions to get higher
QS rankings, then institutions must do the following: a) aggressively market
the institute among academia and corporations in the US and Europe (and QS is
there to help you! For a price, of course, and we will need to set aside a part
of our budget for them!); b) substantially increase the number of foreign
students (why not scrap JEE and admit only foreign students? Our income will
rise, and our ranks will soar!); c) hire a large number of temporary “teachers”
to boost the FS number (which counts the number of “academic staff”); d) create
a network among Indian institutions to encourage citations of each other’s
papers (scratch each other’s backs, as some other countries seem to be doing);
and e) of course, try to improve the quality of research, teaching, education,
etc.!
17/9/2012