The impact of the PRTLI

The HEA has published its impact analysis of the Programme for Research in Third Level Institutions (h/t Colm Harmon). It is good that government agencies are increasingly open to such evaluation.

From the executive summary, we learn that PRTLI centres and initiatives had a budget of 1.7 bln euro, with 1.2 bln from the Exchequer. 1,700 people were employed, at an exchequer cost of 700,000 euro per job. In 1998, 2,400 full-time academics were employed at the universities and ITs. In 2008, there were 6,200 FTEs.
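The per-job figures above are rounded; a quick back-of-the-envelope check (my own arithmetic, not taken from the report) recovers them from the headline numbers:

```python
# Back-of-the-envelope check of the rounded per-job figures in the summary.
exchequer_spend = 1.2e9   # euro: Exchequer share of the PRTLI budget
prtli_jobs = 1700         # people employed in PRTLI centres and initiatives

print(round(exchequer_spend / prtli_jobs))  # about 706,000 euro per job

commercial_impact = 750e6  # euro: turnover, investment, and cost savings
commercial_jobs = 1300
print(round(commercial_impact / commercial_jobs))  # about 577,000 euro per job
```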

The commercial impact (a mix of turnover, investment, and cost savings) was 750 million euro, with 1,300 jobs created (or 600,000 euro per job). For the next five years, a further impact of 1.1 bln euro is projected.

In the foreword, John Hennessy (the HEA chairperson) puts on a brave face and lists all the benefits that were not quantified.

Intrigued by the numbers (and their precision; the numbers above are rounded) in the executive summary, I read on expecting to find tables and tables with detailed data that would tell me who publishes and who gets cited, which disciplines create economic value, and which universities are motors of development. Unfortunately, such data are not available. The data, by the way, were gathered by questionnaire — that is, companies were asked how many people they additionally employed because of the PRTLI.

Some evaluation is better than no evaluation, but I think that a 1.2 bln euro investment warrants more analysis than what is offered by the HEA.

Do college rankings matter for student choice?

A guest post by Kevin Denny

The recent publication of the QS world rankings generated a lot of interest as well as criticism from various people, including me. A common response to such criticisms is to say “Like it or not, they matter to people so we need to pay attention to them”.  But do they matter?

This paper looks at how the publication of US college rankings influences the demand for places, and finds an effect only when colleges are explicitly ranked rather than listed alphabetically.

Salience in Quality Disclosure: Evidence from the U.S. News College Rankings

M. Luca & J. Smith

How do rankings affect demand? This paper investigates the impact of college rankings, and the visibility of those rankings, on students’ application decisions. Using natural experiments from U.S. News and World Report College Rankings, we present two main findings. First, we identify a causal impact of rankings on application decisions. When explicit rankings of colleges are published in U.S. News, a one-rank improvement leads to a 1-percentage-point increase in the number of applications to that college. Second, we show that the response to the information represented in rankings depends on the way in which that information is presented. Rankings have no effect on application decisions when colleges are listed alphabetically, even when readers are provided data on college quality and the methodology used to calculate rankings. This finding provides evidence that the salience of information is a central determinant of a firm’s demand function, even for purchases as large as college attendance.

[NOTE: The “rankings” tag leads to previous posts on this topic.]

UPDATE: Glenn Ellison has a cool paper that’s related:

Abstract:

A large literature following Hirsch (2005) has proposed citation-based indexes that could be used to rank academics. This paper examines how well several such indexes match labor market outcomes using data on the citation records of young tenured economists at 25 U.S. departments. Variants of Hirsch’s index that emphasize smaller numbers of highly-cited papers perform better than Hirsch’s original index and have substantial power to explain which economists are tenured at which departments. Adjustment factors for differences across fields and years of experience are presented.

Higher education and research

Higher education and research was again in the news today.

The latest batch of bad news on the labour market in Waterford seems to have triggered a decision to establish Waterford University. I am not convinced that universities are necessarily good for regional development. Some universities certainly have a positive impact, but I do not think this holds for every university. With the newly built highways, Waterford is closer to Cork and Dublin, taking away some of the would-be benefits of a local centre of learning and research.

Furthermore, Ireland has plenty of universities already. The largest, UCD, had 18,000 students in 2009, which would put it below average in the Netherlands, 60th in the UK, 38th in Germany, and 35th (just above average) in France. Ireland already has the 8th highest number of universities per capita in the world. (A new university would not change the latter rank, just push us closer to Norway.) This matters for two reasons. First, there is a fixed cost in running a university. Second, international rankings are not normalized for size, so small universities cannot do well.

The 2010 annual report of Science Foundation Ireland also made the news today. The press release emphasizes collaboration, which has increased with both researchers abroad and companies in Ireland. This is not a measure of success. It may just reflect the changing nature of SFI funding and its increase in size. The annual report itself has more indicators, but is annoyingly glossy for an academic organization. We learn that SFI-funded researchers published 22% more papers in 2010 than in 2009, but we are not told the number of researchers. We learn that Ireland has gone up 16 places in the citations-per-paper ranking (36th in 2003, 20th in 2010), but for all we know that may be because of the social sciences and humanities (which are not supported by SFI).

The SFI 2010 Census has more numbers. Two things stand out: few patents and few spin-outs. Emigration numbers are high: 47% overall, and 66% for non-Irish researchers (post-doc and below). SFI's mission is to bolster innovation in Irish manufacturing.

Ireland’s economists in the world (part 2)

I’m getting better at scraping the web and I’ve now been able to calculate some things that IDEAS/RePEc does not.

This graph shows the number of economists in Ireland registered at IDEAS/RePEc. It is not a whole number because joint appointments are counted fractionally. The number has been rising steadily over time. I expect that trend to reverse in the coming months.

This graph shows Ireland’s position in the total population of economists. We’re a small country. I highlight Massachusetts because it is ranked highest by IDEAS/RePEc.

This graph shows the number of unique publications per person. In recent times, Ireland has done reasonably well in terms of productivity.

However, visibility, or rather interest, is less impressive, as shown in this graph. It should be noted, though, that “abstract views” is the metric that can be most easily manipulated. That said, Ireland does not do so well either on the number of citations per publication, as shown in this graph, or on the number of citing authors per person, as shown in this graph.

As always, these results can be interpreted in a number of ways. In order to improve Ireland’s standing at IDEAS/RePEc, we’ll need to convince more people that our papers are worth citing.

The great thing about the Public Data Explorer is that you can make your own graphs. (You will need to go back two pages in your browser to return to this blog.)

I used these Matlab scripts to scrape the web.
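For readers curious about the mechanics: the actual scripts were Matlab, but the idea can be sketched in a few lines of Python. The page structure assumed here (author-profile links of the form `p<letters><digits>.html`) is a simplification of my own, not the real RePEc markup:

```python
import re

def count_registered_authors(html):
    """Count distinct author-profile links in a RePEc-style listing page.

    Assumes each registered economist appears as a link to a personal
    page named p<letters><digits>.html; deduplicates repeated links.
    """
    return len(set(re.findall(r'href="(p[a-z]+\d+\.html)"', html)))

# A toy page standing in for a downloaded ranking page.
sample = '''
<li><a href="pto123.html">Author One</a></li>
<li><a href="pde456.html">Author Two</a></li>
<li><a href="pto123.html">Author One (second link)</a></li>
'''
print(count_registered_authors(sample))  # 2
```

The real scripts also have to download each page, follow the ranking tables, and apportion jointly affiliated authors across institutions, but the parsing step is essentially pattern-matching of this kind.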

The best university in all the land

The new QS rankings are out: TCD tops the Irish poll at 65, followed by UCD at 134, UCC at 181, and NUIG at 298. Ireland’s other universities are not ranked in the top 300.

The Examiner (and RTE radio) made much of the fact that UCC got 5 stars. QS now has two rankings. The new one requires more data from the universities. To date, only U Limerick (4 stars) and UCC have provided that information. UCC is thus best of two.

There are disciplinary rankings too. In economics, TCD and UCD are both 51-100. Other universities do not make it into the top 200.

The Independent and Times note that Ireland’s universities have been sliding down the QS rankings. If I’m not mistaken, QS ranks can be compared over time whereas THE ranks cannot. The reasons offered by the various people interviewed are, of course, just speculation. The QS data do not allow for an in-depth analysis of the reasons behind the decline, and Ireland’s universities are not particularly good at keeping records.

20% of the QS ranking is citations per faculty. QS does not define this precisely, but the practical way is to allocate papers to the university at which the research was done (rather than where the researcher is now). Faculty numbers have fallen, so Ireland’s position should have improved on this score, partly offsetting the decline in the faculty-student ratio (another 20%). 50% of the QS ranking is based on “reputation”, and that is a stock variable that should survive a downturn if properly measured. However, I would think that the drop in ranking is at least partly explained by the brand Ireland turning sour in general.
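To make the weighting argument concrete, here is a minimal sketch of how a composite score of this kind is assembled, using the weights mentioned above. The indicator names and values are invented for illustration; QS splits the residual 10% further (international faculty and students) and normalizes each indicator before weighting:

```python
# Illustrative QS-style composite score; weights as discussed above,
# indicator values (0-100, already normalized) invented.
weights = {
    "reputation": 0.50,             # academic + employer surveys combined
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "international_mix": 0.10,      # remainder of the weighting
}

def composite_score(indicators):
    """Combine normalized indicator scores into one weighted score."""
    return sum(weights[k] * indicators[k] for k in weights)

example = {
    "reputation": 70.0,
    "citations_per_faculty": 55.0,
    "faculty_student_ratio": 60.0,
    "international_mix": 80.0,
}
print(composite_score(example))  # 0.5*70 + 0.2*55 + 0.2*60 + 0.1*80 = 66.0
```

The point of the sketch: with half the weight on a slow-moving reputation survey, a sharp year-on-year slide is hard to produce from the measured indicators alone.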

UPDATE: Brian Lucey offers further thoughts and data.

UPDATE2: Kevin Denny is not impressed.