THE-QS University Rankings 2009

The new rankings are out. All Irish universities are up. TCD at 43, UCD at 89.

To pre-empt some critique: No ranking is perfect, but each imperfect ranking correlates with every other imperfect ranking. Two universities in the top 100 is not at all bad for a country with far fewer than 2 in 100 of the people in the OECD, let alone the world.

27 replies on “THE-QS University Rankings 2009”

I agree – this looks about right – we really could not expect more than two when, say, Australia has eight in the top 100. Two things strike me. Firstly, it will get much harder to move up this list. Perhaps there is more room to run on the UCD ranking, and TCD can move a little, but two in the top 70 is perhaps all we could expect or hope for in time. In that case we are punching a little below our weight. So, secondly, looking at the scores, it is the research citations and staff/student ratios that are hurting both institutions. They do well on the softer stuff like peer review (particularly TCD) and employer views etc. Staff/student ratios will get worse. Citations will be hard to improve. This could be as good as it gets.

I don’t like soft arguments, but this is in part something Govt folks need to decide – is a ranking like the THES good for Ireland? If so, are Govt/HEA now prepared to accept Ireland having two, maybe three major institutions competing internationally and, perhaps, with resources skewed towards them?

For a country the size of Ireland, two good research universities is the maximum. I agree that the implication is that the rest should be transformed into teaching institutions. To soften the blow, this is probably best done per discipline. For example, there are 7 economics departments in Ireland. The two worst ones should be wound down in a decade, the next two should follow later. This will leave pockets of research excellence in each university, from which two great institutions can be molded — hopefully before Colm and I retire.

Your attempts at pre-emption are reasonable Richard, but one would hate for any reader of these rankings to come away with the impression, to take a particularly striking example, that TCD is a better or more prestigious institution than LSE! What is particularly amusing there is that LSE are being massively dragged down by their citation rates (!). This I guess tells you a lot about the hazards of comparing such numbers across disciplines, given what LSE specialises in.

I also note that it is the arts and humanities where TCD scores in the top 50; it is not in the top 50 in any other area. My guess is that the arts and humanities do relatively well in other Irish universities as well. So why are the arts and humanities so undervalued by Irish university administrators?

Incidentally, one set of criteria used here clearly favours smaller countries such as Ireland:

“To get a sense of a university’s international outlook, we measure the proportion of overseas staff a university has on its books (making up 5 per cent of the total score) and the proportion of international students it has attracted (making up another 5 per cent). This gives an impression of how attractive an institution is around the world, and suggests how much the institution has embraced the globalisation agenda.”

Of course, ceteris paribus, these proportions will be higher in smaller than in bigger countries, in the same way that a country’s trade share is higher, ceteris paribus, the smaller that country is.

Finally, note that

“the biggest part of the ranking score – worth 40 per cent – is based on the result of an academic peer review survey. We consult academics around the world, from lecturers to university presidents, and ask them to name up to 30 institutions they regard as being the best in the world in their field.”
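For what it’s worth, the scoring scheme described in the quote is just a weighted sum of normalised indicator scores. A minimal sketch follows; the 40% peer-review and 5% + 5% international weights come from the quoted methodology, while the component names and the employer (10%), citations (20%) and staff/student (20%) splits are taken from figures mentioned elsewhere in the thread and should be treated as assumptions:

```python
# Sketch of a THE-QS-style composite score as a weighted sum.
# Only the 40% and 5% + 5% weights appear in the quoted text; the
# remaining splits are assumptions based on the rest of the thread.
WEIGHTS = {
    "peer_review": 0.40,
    "employer_review": 0.10,
    "citations_per_faculty": 0.20,
    "staff_student_ratio": 0.20,
    "international_staff": 0.05,
    "international_students": 0.05,
}

def composite_score(indicators):
    """Weighted sum of indicator scores (each normalised to 0-100)."""
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

# Hypothetical indicator scores for an illustrative university:
# strong on reputation measures, weaker on citations and staffing.
example = {
    "peer_review": 90.0,
    "employer_review": 85.0,
    "citations_per_faculty": 60.0,
    "staff_student_ratio": 55.0,
    "international_staff": 95.0,
    "international_students": 88.0,
}
print(round(composite_score(example), 2))  # weighted sum of the six components
```

Note how a university that scores well on the two perception measures (50% of the total between them) can carry a weak citations score – which is exactly the pattern discussed below for the Irish universities.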

Correct me if I’m wrong, but as far as I know the THE ranking only uses English-language publications. This would bias the results tremendously towards US/UK/Irish institutions, especially in arts/humanities/law. For example, the highest-ranking Dutch universities are Amsterdam, Delft, and Utrecht, which all have large science faculties. Places that specialise in economics/law/medicine like Maastricht, Rotterdam and Tilburg are nowhere to be found, even though they consistently outperform Amsterdam and Utrecht in national rankings. I’d be interested to know how much of a “native English” bonus there is in the data. I suspect it is substantial and would take off some of the veneer of UCD and TCD’s excellent rankings.

It isn’t clear to me from the website where they get the citation measures (which account for 20% of the index). Richard? Scopus? ISI? The real problem I think you are alluding to is that if you publish in Dutch you won’t get cited as much as if you publish in English. But if you want to make a mark, you do actually want people to read you, so English is the way to go anyway.

In any event, it is not via citations that the Irish end up doing well, as Colm points out. Rather, our citations performance drags us down.

Incidentally, the citations scores used here are per capita, which is the right way to do it. So the low citations score of Irish universities can’t be blamed on our small size.

No matter which way the figures are interpreted, Ireland comes out well when the number in the top 100 is adjusted for population. I’ve done a very quick calculation of the number in the top 100 per million population, and compiled the following table:

[ 1] IRELAND 2 (per million pop: 0.44)
[ 2] Denmark 2 (per million pop: 0.38)
[ 3] U. Kingdom 18 (per million pop: 0.30)
[ 4] Neth’lands 4 (per million pop: 0.25)
[ 5] Sweden 2 (per million pop: 0.22)
[ 6] Belgium 1 (per million pop: 0.10)
[ 7] Germany 3 (per million pop: 0.04)
[ 8] France 2 (per million pop: 0.03)
[ 9] Austria 0 (per million pop: 0.00)
[10] Bulgaria 0 (per million pop: 0.00)
[11] Cyprus 0 (per million pop: 0.00)
[12] Czech Rep. 0 (per million pop: 0.00)
[13] Estonia 0 (per million pop: 0.00)
[14] Finland 0 (per million pop: 0.00)
[15] Greece 0 (per million pop: 0.00)
[16] Hungary 0 (per million pop: 0.00)
[17] Italy 0 (per million pop: 0.00)
[18] Latvia 0 (per million pop: 0.00)
[19] Lithuania 0 (per million pop: 0.00)
[20] Luxembourg 0 (per million pop: 0.00)
[21] Malta 0 (per million pop: 0.00)
[22] Poland 0 (per million pop: 0.00)
[23] Portugal 0 (per million pop: 0.00)
[24] Romania 0 (per million pop: 0.00)
[25] Slovakia 0 (per million pop: 0.00)
[26] Slovenia 0 (per million pop: 0.00)
[27] Spain 0 (per million pop: 0.00)

I can’t see from this table that Ireland’s good performance can be explained by its small population.

Some posters have suggested that there might be a bias towards English-speaking countries in the results. Leaving aside the fact that, in the Netherlands and Nordic countries, they speak better English than in the English-speaking countries, there might be something in that. I simply don’t know enough about the survey to form an opinion. But, if we restrict the comparison to English-speaking countries only, Ireland still comes out well, as the following table shows:

[ 1] IRELAND 2 (per million pop: 0.44)
[ 2] Australia 8 (per million pop: 0.38)
[ 3] U. Kingdom 18 (per million pop: 0.30)
[ 4] N. Zealand 1 (per million pop: 0.25)
[ 5] Canada 4 (per million pop: 0.13)
[ 6] U. States 32 (per million pop: 0.11)

The US ranks between Sweden and Belgium in the first list.
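The per-million figures in the two tables above are simple division-and-sort. A minimal sketch for the EU table; the top-100 counts are taken from the table itself, while the populations (in millions) are approximate 2009 figures and should be read as assumptions:

```python
# Sketch of the per-million calculation behind the tables above.
# Top-100 counts are from the table; populations (millions) are
# approximate 2009 figures, used here for illustration only.
top100 = {
    "Ireland": 2, "Denmark": 2, "U. Kingdom": 18, "Neth'lands": 4,
    "Sweden": 2, "Belgium": 1, "Germany": 3, "France": 2,
}
population_m = {
    "Ireland": 4.5, "Denmark": 5.3, "U. Kingdom": 61.0,
    "Neth'lands": 16.0, "Sweden": 9.2, "Belgium": 10.4,
    "Germany": 82.0, "France": 64.0,
}

# Universities in the top 100 per million inhabitants, sorted descending.
per_million = {c: top100[c] / population_m[c] for c in top100}
for rank, (country, score) in enumerate(
        sorted(per_million.items(), key=lambda kv: -kv[1]), start=1):
    print(f"[{rank:2d}] {country:11s} {top100[country]:2d} "
          f"(per million pop: {score:.2f})")
```

With these assumed populations the ordering reproduces the one in the table, Ireland first.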

Citations are from Scopus.

There is a language bias, but it surely is small in the Netherlands, where MSc theses in Dutch are frowned upon.

“Academic peer review” is 40% of the weight. That probably explains why the natural sciences dominate, and the universities that specialise in the social sciences come out badly. I was part of their survey, and I did it in a rush, as I guess most did. When you ask me “what is the Vrije U known for?” I would say Tanenbaum (operating systems) and Smit (the dinosaur-meteorite hypothesis) – and only later think that there are also people like Nijkamp (regional economics) and Boomsma (twin psychology) at the Vrije. We favour things over thoughts, the natural over the social.

Indeed, no ranking is perfect, but global rankings of universities are so imperfect, with Rupert Murdoch’s THES rankings perhaps taking star position, that one wonders why anyone pays them any attention. The THES rankings are dominated by what are in fact regional popularity measures (perception by a variety of groups), which are then melded with often dubious “objective” data to generate a “global ranking”. This methodology is very much open to question on several grounds and, interestingly, is not followed by the major national rankings conducted in the US and Canada (but is followed by national rankings conducted in Australia by, you guessed it, one of Rupert’s national newspapers).

In general, the game of ranking universities is one that is fraught with problems of measurement and interpretation (the measures used are, inevitably, proxies that often do not stand up well to close scrutiny, and are open to different interpretation in different places – e.g., at least one English-speaking EU university is known to have added in graduate students to its total number of faculty, thereby improving its staff/student ratio). A cursory glance at how the THES scores fall shows that an anomalous result in a single category (e.g., a high % of “foreign” faculty, or a high % of “foreign” students) can shift the rank of the university by many tens of places (and, indeed, a large % of non-national students, as in many Australian universities, may merely reflect relatively low academic admission standards for full-fee-paying international students, in an attempt by universities to make up for chronic government underfunding: watch for this particular game in Ireland in the near future).

A better job is done by the Shanghai Jiao Tong University’s global ranking exercise which, outside the UK – and perhaps Ireland and Australia – appears to have more credibility than the THES exercise. A methodological difference between the two is the use of perception. Measures of perception are a mainstay of the THES exercise only and, for an exercise designed to sell newspapers, have the beneficial effect that big differences in an individual university’s rank, especially outside the top positions, can and do occur from one year to the next (perceptual measures of this type are volatile). Where “objective” measures only are used (or where they overwhelmingly dominate), annual shifts are small, as with the annual “US News” rankings in the US and the “Maclean’s” rankings in Canada. Another difference is that the Shanghai measure is considerably less anglo-centric.

It may be the case that the Shanghai Jiao Tong rankings gain less coverage in Ireland than the THES rankings because of the proximity of the UK (Hugh Brady, for example, has said that THES rankings are important, despite what one may think of them), or it may be because Irish universities fare much less well in the Shanghai Jiao Tong rankings, with Trinity ranked globally in the range 201-302 and UCD placed in the range 303-401 (not much “imperfect rank correlating with imperfect rank” there).

In my own view, as someone who has worked in universities in North America, the UK, Asia, Australia, and Ireland, the Shanghai Jiao Tong rankings concur more with what I see than do the THES rankings, but of course mine is both a limited and biased view. Ultimately, however, it is clear to me that university rankings of the Shanghai type are close to meaningless (apples and oranges, proxies, measurement error, regional-to-global transformation, noise, etc.), with the THES rankings even closer. This is especially so in the context of seeing a meaningful difference between a ranking of, say, 89, versus a ranking of, say, >100, or even >200.

Note that I am not associated with any university in this country, and that the people above who express skepticism are.

I do not like the Shanghai ranking one bit. It puts a high weight on publications in Science and Nature, but little to no weight on top disciplinary journals. It puts a high weight on Nobel Laureates, but ignores other prizes. The disciplinary bias is therefore worse than in THE-QS.


Not sure about the relevance but Murdoch/News International does not own the education titles (TES, THES).

‘Unwarranted self-congratulation’ – moi? No, I agree that this is nothing to get too excited about in terms of my daily life, BUT it does make a difference whether we like it or not – this has been headline news here in the radio/print media at a time when the debate on funding is hot.

@Michael, @Richard
Shanghai seems just as silly as the THES ones. Giving TCD weight for Walton is plain daft.

The fact remains that the top two in Ireland score poorly on citations and on staff/student ratios, and that hurts them both – two elements that could get worse over the coming few years.

Anyway – we all look at RePEc, don’t we?!

@Colm: yes, *we* look at RePEc. How many disciplines have such metrics, I wonder?

And a simple question which this suggests is: in how many disciplines can either TCD or UCD say that they are in the top 100 worldwide? Even once the rankings have been corrected for size?

The answer to that question, whatever it is, will surely back your general point. These high rankings don’t reflect research, but something fuzzier. And, since part of what universities are supposed to do is to make their undergraduates employable, for whatever reason, I guess that is OK as far as it goes. One just shouldn’t base public policy on such numbers without understanding exactly what they mean.

@Colm Harmon
“Murdoch/News International does not own the education titles (TES, THES).”

Colm, you’re right. Wiki snippet below indicates that THES (now THE) left the Murdoch stable in late 2005. What can I say: I’m getting old and I make mistakes … Anyhow, I’ve been looking at the THES (now THE-QS) rankings off and on since 2004, and I think I have a fairly good understanding of their profound weaknesses. Spurious quantification would be my summary view.

In Australia, where they also are taken very seriously by V/Cs, year-on-year improved rankings in the THE-QS rankings had even become part of at least one V/C’s annual salary-bonus structure (Stevie Schwartz at Macquarie in Sydney). Interestingly, Schwartz recently publicly swapped his allegiance from THE-QS to Shanghai Jiao Tong. That Macquarie recently went down a large number of places in the former, but has gone up in the latter, would, of course, have nothing to do with this realignment.

From Wikipedia:
From its first edition, in 1971, to 2008, the THES was published in newspaper format. On 10 January 2008 the publication was relaunched as a magazine which continues currently. It is published by TSL Education Ltd., which was, until October 2005, a division of News International.

@Richard, @Kevin

RePEc – get rid of Markusen, and Honohan should do the decent thing and take himself off it to give us a chance. But seriously, RePEc is an amazing public good for the economics community and very useful. I like how economics took it upon itself to create this open and very public forum, while other social science disciplines choke at the idea, in my experience. And extending a point, Kevin: one thing that RePEc says, in conjunction with the THES and others, is that economics as a discipline is hitting about where the institutions are collectively, but at a much lower cost per student/paper/citation. Overall, looking at the THES and other rankings like RePEc, I felt we were in the company I would expect us to be in. I would assume that UCD is a bit like Nottingham, for example. So these are settling into what one might think or feel from working here. That is not necessarily a compliment or ‘self-congratulation’, by the way!

I am struck by how focused the Oz Higher Ed folks were on the rankings. V/Cs chase them like mad. But in fairness they are showing up well as institutions, and the Government are responding with enormous ‘stimulus’ investment. I think Australian institutions are perceived as sleepy by others, but they will catch the Asian wave, and if they get good faculty hires there they will take off. The citation scores of institutions like Adelaide are way over those of the Irish two, for example.


There is a flaw at the heart of the THE-QS rankings that very likely explains Ireland’s apparently amazing performance of having two “top-100” universities with the population of a small American state and a fraction of its budget. In the THE-QS exercise, locally sourced perception ratings contribute 50% (employer reviews: 10%, academic peer reviews: 40%) to a university’s score, which is then used to rank the university globally. No amount of massaging of these local “review” scores, which inevitably focus on local universities, can result in a seriously meaningful global comparison, especially beyond the first dozen or so rankings. “Good” universities in small regions (population-wise) will be favoured; “good” universities in regions that have few “good” universities will be favoured; “good” universities in regions that have large numbers of “good” and “very good” universities will be disfavoured.

Alan Gilbert, then V/C of Melbourne, once said that no Australian university would rank in the top 100 in the world, and possibly not even in the top 200. This pronouncement was not considered particularly controversial by faculty with substantial experience in the EU and/or North America, though most thought that Melbourne might get into the top 200. Then came the THES rankings, and suddenly almost every – and now every – “Group of Eight” research university in Australia is in the world’s top 100. This is a nonsense that comes about for the same reasons as Ireland apparently having two such universities. Most states and provinces in North America alone have one university that by any measure you’d care to make outranks Australia’s best, and many states have more than one, some more than two. Many European countries also have more than one, and a fair number have more than that.

My argument with rankings that place no fewer than eight Australian universities in the “world’s top 100”, and two Irish universities in there as well, can be summarized thus: the methodology is flawed – local measures cannot be extrapolated to provide global comparison – and simple experience of universities in the US, Canada, the EU, Japan and the rest of Asia, which many of us have, cannot but lead to the conclusion that these figures are idiotic. This is not to say, by the way, that Ireland and Australia do not each have many good universities.

I hope your analysis is correct re the future for Australian universities (“perceived as sleepy” – I like that!), though I hear from Sydney this week that the Australian Federal Government is using the impact of the “global financial crisis” on Australia as an excuse to talk about cuts to universities. As for the ‘stimulus’ money – and yes, Australia’s lack of a banking crisis and its budget surplus meant a stimulus was relatively easy to implement – a fair amount of it did go to education, but not that much to universities. The reality for Australian universities is that cuts in funding, especially on a per-student basis, have been ongoing since the late 1970s, with a few brief interregna, and the current Labor Federal Government is not in much of a mind to redress the historical deficit.

@M Moore
“Irish universities emerge as what they are: irredeemably provincial”
Not you nor me, Michael, nor indeed any active researcher I know of in any university – they pretty universally have an international perspective.

Scientists, even social scientists, put a lot of thought and effort into measurement in their research. And then there are these rankings, which show little evidence of either. In the psychometric literature, there are a bunch of criteria by which different instruments are assessed, such as external validity and test-retest reliability. I would like to see these rankings discussed in a more sober, systematic way.
For example, what does it mean for a university to rise or fall, say, 40 places in a year? In my experience universities don’t change much from year to year, so such leaps make these rankings dubious.
Overall, I find these discussions rather juvenile (“mine is bigger than yours”) and the institutional crowing over places to be embarrassing.

I would think that THE-QS passes face validity. TCD and UCD are the two top universities in Ireland, and they are part of a broad international subtop. Do you think this is not true?

For your information, there are two specialist journals (Scientometrics, J Informetrics) devoted to this topic, and a number of others that regularly publish papers in this area. Iulia highlights the two most academic (and least used) rankings:

Regardless of the quality of these rankings, they are important as they shape the reputation of universities in the minds of politicians, investors, and students.

I can hardly claim to be objective about UCD. But yeah, UCD and TCD are probably the best here. But one cannot infer anything from n=2. There may well be other comparisons which are totally absurd. I don’t know what it means to say that UCL is better than Oxford (two institutions I know somewhat), since they are fundamentally different types of places. On the other hand, the statement that Oxford is better than Coventry, while true, is not terribly interesting: it’s bleedin’ obvious, innit?

These rankings may be important. While they get a lot of attention, how much they actually affect decisions I don’t know. Since they certainly wouldn’t affect my decisions, I am somewhat skeptical. My point is that I am not convinced they should be important.

It’s good to know that there is some serious thought behind these measurements, but does it have much impact? Most academics have probably never heard of these journals.


Don’t we teach our students that, in the social sciences, introspection is not a valid method? Just because you don’t know these journals, does the majority not know them? Just because your decisions are not influenced, are no one’s decisions influenced?

Yesterday, the server that hosts THE-QS was creaking under the weight of the traffic. Every Chinese parent thinking of sending their beloved child to an English-speaking university knows that TCD outranks UCD.

“So why are the arts and humanities so undervalued by Irish university administrators?”

I’m not convinced that they are undervalued, considering the preponderance of Arts & Humanities students at these institutions. I’m in Engineering (3Y) at UCD, which feels like a desert island of competence in a sea of “kids who hate Maths”. While I’m generally supportive of free fees, I can’t help but think that it’s partly responsible for a generation of students who attend university as an extension of school, without a thought for what they are going to do with a BA under the belt.

So, if TCD outperforms UCD in Arts & Humanities, I can only hope that that means UCD outperforms TCD in the Sciences, eh?

I am not sure what you are trying to say. That the RAE is no good? (This is irrelevant to the current discussion.) That excellence is unidimensionally measured in Nobel Prizes? (There are four worthy disciplines only?)

It is a great achievement to be ranked in the world top 100. The rankings take into account a combination of factors including academic peer review, employer survey, research citations and staff/student ratios. QS is the most popular ranking table.


Patronage from the incompetent, no less.

Your achievement is far greater than that of any university, and it is this: you are not dismissed out of hand by everyone as snake-oil salesmen (and women). Indeed, your rankings are popular … with a sizable minority of university presidents in Ireland, Australia, and the UK. Odd, is it not, that so many universities from Ireland, Australia, and the UK do remarkably well in your rankings.

Comments are closed.