This is an expanded version of an article that appeared in the Indian Express on October 5, 2013. As you can see, I have used some of the material from my earlier blog post on this subject.
Rankings: Why is India Nowhere?
Gautam Barua
QS and THE have recently released their 2013 global university rankings. In both rankings, institutions from India are nowhere in the picture; in fact, their ranks have fallen compared to last year. Are Indian institutions really that bad? Have they been "lazy" in providing the right data? To show that there are problems with the rankings themselves, I analyse the parameters of IIT Guwahati (since these are available to me; the observations can easily be generalized to other Indian institutes, and IIT Delhi and Panjab University data are also shown). The first table is for QS and the second for THE.
Table 1 (QS):

Rank | Institution | Academic Reputation (AR) | Employer Reputation (ER) | Faculty:Student Ratio (FS) | Citations/Faculty (CF) | Intl Faculty (IF) | Intl Students (IS) | Total Score
Weight | | 40% | 10% | 20% | 20% | 5% | 5% | 100%
Max marks | | 100 | 100 | 100 | 100 | 100 | 100 | 100
212 | IITD (2012) | 53.4 | 79.4 | 35.4 | 54.1 | 1.3 | 1.6 | 47.8
222 | IITD (2013) | 51.3 | 86.8 | 39.9 | 63.3 | 1.3 | 1.5 | 49.4
551-600 | IITG (2012) | 16.7 | 24.7 | 35.5 | 26.6 | 3.0 | 1.2 | 22.0
601-650 | IITG (2013) | 20.4 | 22.3 | 30.2 | 33.1 | 1.5 | 1.6 | 23.3
Table 2 (THE):

Rank | Institution | Teaching | Research | Citations | Industry Income | International Outlook
226-250 | Panjab | 25.8 | 14.0 | 84.7 | 28.4 | 29.3
351-400 | IITD | 33.8 | 23.0 | 38.5 | - | 15.3
? | IITG | 22.0 | 12.0 | 54.0 | 17.0 | 18.0
All
scores are relative, with the top ranked institution in each category getting a
score of 100.
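As a quick sanity check on the QS table, the component scores can be recombined using the published weights. Whether the Total Score is a plain weighted average of the six components is my assumption, not something QS documents; the sketch below (in Python) makes that assumption and lands close to, though not exactly on, the published totals, presumably because the component scores are rounded (or the aggregation differs slightly).

```python
# Hypothetical check: recompute the QS total as a weighted sum of the six
# published component scores. The weights come from Table 1 above; treating
# the total as a plain weighted average is an assumption for illustration.

weights = {"AR": 0.40, "ER": 0.10, "FS": 0.20, "CF": 0.20, "IF": 0.05, "IS": 0.05}

institutions = {
    "IITD (2012)": {"AR": 53.4, "ER": 79.4, "FS": 35.4, "CF": 54.1, "IF": 1.3, "IS": 1.6},
    "IITD (2013)": {"AR": 51.3, "ER": 86.8, "FS": 39.9, "CF": 63.3, "IF": 1.3, "IS": 1.5},
    "IITG (2012)": {"AR": 16.7, "ER": 24.7, "FS": 35.5, "CF": 26.6, "IF": 3.0, "IS": 1.2},
    "IITG (2013)": {"AR": 20.4, "ER": 22.3, "FS": 30.2, "CF": 33.1, "IF": 1.5, "IS": 1.6},
}

for name, scores in institutions.items():
    total = sum(weights[k] * scores[k] for k in weights)
    # Compare against the Total Score column in Table 1.
    print(f"{name}: recomputed total = {total:.1f}")
```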
1. The FS ratio of IITG is the best among the IITs (according to information tabled in the Lok Sabha and as reported by the media), yet in the QS table it shows a decline and is much worse than IITD's. Clearly there is an error here. The IITs have not done well on the faculty-to-student ratio, and it is well known that many faculty positions are vacant. But even here, the methodology is not clear. MIT states on its website that it has a faculty-to-student ratio of 1:8, and that it has a total of 10,894 students. The number of faculty is given as 1,018, whereas 1,362 would be required to meet the stated ratio. The number of "senior lecturers, lecturers, and Professors Emeriti" is given as 540 (most of whom are on contract for temporary periods); possibly 1,362 is reached by adding these and then deleting the Professors Emeriti. But how are scores calculated? Consider this: MIT with a 1:8 ratio gets a score of 99.9, while IIT Guwahati with a ratio of 1:13.3 (as per QS) gets only 35.5. I am not able to figure out how this score was arrived at. In any case, is this a fair marking system? Shouldn't an ideal ratio get full marks, with other ratios given lesser marks in slabs?
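To make that suggestion concrete, here is a minimal sketch of a slab-based scheme in Python. The slab boundaries are invented purely for illustration; this is not how QS actually scores the parameter.

```python
# Illustrative slab-based scoring for the faculty:student ratio (FS).
# The slab boundaries below are made up for this example; QS does not
# publish its scoring function at this level of detail.

def fs_score_slabs(students_per_faculty):
    """Return an FS score out of 100 using fixed slabs."""
    slabs = [
        (8.0, 100),   # 1:8 or better -> full marks
        (10.0, 90),
        (12.0, 80),
        (15.0, 65),
        (20.0, 50),
        (30.0, 30),
    ]
    for upper_bound, score in slabs:
        if students_per_faculty <= upper_bound:
            return score
    return 10  # worse than 1:30

# MIT (1:8) and IIT Guwahati (1:13.3, as per QS) under this scheme:
print(fs_score_slabs(8.0))    # 100
print(fs_score_slabs(13.3))   # 65
```

Under such a scheme the gap between a 1:8 and a 1:13.3 ratio would be a graded penalty rather than the 99.9-versus-35.5 spread seen in the published scores.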
2. IITs are not allowed to take international students at the B.Tech level unless they sit for the JEE, and very few do. There is scope for increasing the number of foreign PhD students, but even here there is a restriction, as Government assistantships can be given only to Indian citizens. Without aid, it is difficult to attract good international PhD students. Hiring international faculty on a regular basis is not allowed; they can be hired on contract for up to five years, but only if the salary is at least $25,000 annually (effectively, only Professors can be hired). The moot question remains: is the internationalization of campuses an important parameter for excellence? Western countries are clearly at an advantage here, with Indians, Chinese, and others going to their universities in droves.
3. 50% of the weightage is based on "reputation" (AR: 40% and ER: 10%) in QS, and 33% in THE (not shown above). IIT Guwahati got a score of 0 for Academic Reputation and a score of 1 for Research Reputation in THE. How did they arrive at these scores? It is a wonder that the top students of the country are competing to attend an institution with a ZERO academic reputation! These scores get reflected in the scores for Teaching and Research in the table above. These organisations are now aggressively marketing products through which institutions can enhance their "reputation". Thus we have been invited to advertise in their "QS Top University Guide 2013" (with discounts if we opt to advertise in more than one language) and in other publications, to attend seminars and conferences (with registration fees, of course), and so on. Can we rely primarily on reputation to decide ranks?
Academics all over the world are asked their opinion of the top institutions globally and in their own country. The chances of a US professor including an IIT's name are quite slim. Alumni in academics may help the IITs, since they know about them, but the number of alumni in academics in the US is a small fraction of the total number of alumni in the US. This is a legacy of the past. Young institutions like IIT Guwahati, without sufficient alumni, are at a particular disadvantage. The number of respondents from a country is proportional to the number of institutions available for selection in that country, so the responses are heavily weighted in favour of the developed countries. Respondents are not asked to rate each of the listed universities (it may be impractical to do so, as there are a large number of them). Instead, each respondent is asked to list 5-10 universities he or she thinks are globally well known, and well known in their country. This method perpetuates the existing ranks, as the toy simulation below illustrates.
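The simulation (Python) is a deliberately crude model, not QS's or THE's actual survey procedure: all numbers are invented, and respondents simply nominate whichever universities they have heard of. It shows how a "name the universities you know" survey keeps feeding current visibility back into next year's reputation.

```python
import random

# Toy model: respondents nominate universities they have *heard of*, so
# nominations track current visibility rather than current quality.
# All figures are invented for illustration.

random.seed(1)

universities = {
    # name: (quality, initial visibility); quality is deliberately unused,
    # because the survey never asks about it.
    "Old Famous U": (0.70, 0.90),
    "Improving U":  (0.95, 0.10),   # better quality, little visibility
    "Average U":    (0.60, 0.40),
}

visibility = {name: vis for name, (_, vis) in universities.items()}

for survey_round in range(5):
    nominations = {name: 0 for name in universities}
    for _ in range(1000):                      # 1000 respondents per round
        for name in universities:
            # A respondent can only nominate a university they know of.
            if random.random() < visibility[name]:
                nominations[name] += 1
    # The reputation score drives next year's visibility (self-reinforcing).
    top = max(nominations.values())
    for name in universities:
        visibility[name] = 0.5 * visibility[name] + 0.5 * (nominations[name] / top)
    print(survey_round, {n: round(v, 2) for n, v in visibility.items()})
```

In every round the already-famous university tops the nominations and the better but lesser-known one stays at the bottom, because quality never enters the loop at all.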
4. Now consider the categories CF and Citations. QS divides the total number of citations in the last five years by the number of faculty in the last year. IIT Guwahati had 323 faculty in 2013 but only 220 in 2009, so its numbers clearly cannot be compared with institutions like Cambridge and Oxford, where faculty numbers are almost constant. Further, since citations are accumulated over five years, one or two "star" papers can make a huge difference to the numbers. For example, the review paper "The Hallmarks of Cancer", authored by two professors from UC San Francisco and MIT, has about 10,000 citations; this paper alone will have boosted the CF figure of both institutions significantly. THE clearly uses some other method for citations, and it probably does not remove self-citations. The high Citations scores of Panjab and Guwahati vis-à-vis IIT Delhi could be explained by this. Panjab University's High Energy Physics group (and, to a lesser extent, IITG's) is part of global experiments at CERN, Belle, and Fermilab, and papers from those collaborations have very high citation counts. Thus a small group of international collaborations is providing a high score. Isn't the median number of citations per faculty a better measure than the average? (There are other issues too; for example, citation counts in the Sciences are usually far higher than in Engineering.)
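A quick numerical sketch (Python, with made-up citation counts) shows how sensitive the average is to a single star paper, and how dividing one fixed citation total by a growing faculty headcount shifts the figure; none of these numbers are real data for any institution except the 220 and 323 faculty-strength figures quoted above.

```python
import statistics

# Made-up example: a department of 50 faculty, each collecting 40 citations
# over five years, except one faculty member who co-authors a "star" review
# paper that adds roughly 10,000 citations.
citations = [40] * 49 + [40 + 10_000]

print("mean citations/faculty:  ", statistics.mean(citations))    # 240.0
print("median citations/faculty:", statistics.median(citations))  # 40.0

# Dividing one fixed five-year citation total by different faculty counts
# (IITG grew from 220 faculty in 2009 to 323 in 2013, as noted above)
# changes the per-faculty figure by nearly 50% with no change in output.
five_year_citations = 6_000     # made-up total
print(five_year_citations / 323)   # per-faculty figure using 2013 strength
print(five_year_citations / 220)   # per-faculty figure using 2009 strength
```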
So, what can we conclude from all of the above? Surely it should be clear that ranking universities is not a simple task. We have only scratched the surface, as have QS and THE; there are many other aspects of an educational institution that they have not even touched upon, and many of these are qualitative in nature and very difficult to quantify. This is not to say that Indian universities do not need to improve their rankings. They do, and to begin with, we will have to provide data to these organizations in the format they expect; interactions are already under way. But if society wants Indian institutions to get appreciably higher QS and THE rankings, then it must allow the institutions to do the following: a) spend heavily to aggressively market the institute among academia and corporations in the US and Europe; b) substantially increase the number of foreign students (the Government must allow UG admissions and Government assistantships for foreigners, and remove ceilings on incomes for foreign faculty); c) hire a large number of temporary "teachers" to boost the FS number (which counts the number of "academic staff", something apparently done by many US universities); d) create a network among Indian institutions to encourage citations of papers from other Indian institutions (scratch each other's backs). Finally, of course, all institutions must strive to improve the quality and quantity of their research, teaching, industry interaction, and so on.