The ESRC Annual Report for 2013-14 has been out for quite a while now, and a quick summary and analysis from me is long overdue.
Although I was tempted to skip straight through all of the good news stories about ESRC successes and investments and dive straight in looking for success rates, I’m glad I took the time to at least skim read some of the earlier stuff. When you’re involved in the minutiae of supporting research, it’s sometimes easy to miss the big picture of all the great stuff that’s being produced by social science researchers and supported by the ESRC. Chapeau, everyone.
In terms of interesting policy stuff, it’s great to read that the “Urgency Grants” mechanism for rapid responses to “rare or unforeseen events” which I’ve blogged about before is being used, and has funded work “on the Philippines typhoon, UK floods, and the Syrian crisis”. While I’ve not been involved in supporting an Urgency Grant application, it’s great to know that the mechanism is there, that it works, and that at least some projects have been funded.
The “demand management” agenda
This is what the report has to say on “demand management” – the concerted effort to reduce the number of applications submitted, so as to increase the success rates and (more importantly) reduce the wasted effort of writing and reviewing applications with little realistic chance of success.
Progress remains positive with an overall reduction in application numbers of 41 per cent, close to our target of 50 per cent. Success rates have also increased to 31 per cent, comparable with our RCUK partners. The overall quality of applications is up, whilst peer review requirements are down.
There are, however, signs that this positive momentum may be under threat as in certain schemes application volume is beginning to rise once again. For example, in the Research Grants scheme the proposal count has recently exceeded pre-demand management levels. It is critical that all HEIs continue to build upon early successes, maintaining the downward pressure on the submission of applications across all schemes.
It was always likely that “demand management” might become the victim of its own success – as success rates creep up again, getting a grant appears more likely, and so researchers and research managers encourage and submit more applications. Other factors might also be involved – the stage of the REF cycle, for example. Or perhaps, now that talk of researcher or institutional sanctions has faded away, there’s less incentive for restraint.
Another possibility is that some universities haven’t yet got the message or don’t think it applies to them. It’s also not hard to imagine that the kinds of internal review mechanisms that some of us have had for years and that we’re all now supposed to have are focusing on improving the quality of applications, rather than filtering out uncompetitive ideas. But is anyone disgracing themselves?
Looking down the list of successes by institution (p. 41) it’s hard to pick out any obvious bad behaviour. Most of those who’ve submitted more than 10 applications have an above-average success rate. You’d only really pick out Leeds (10 applications, none funded), Edinburgh (8/1) and Southampton (14/2), and a clutch of institutions on 5/0 (including top-funded Essex, surprisingly), but in all those cases one or two more successes would change the picture. Similarly for the top performers – King’s College (7/3), King Leicester III (9/4), Oxford (14/6) – it’s hard to make much of a case for the excellence or inadequacy of internal peer review systems from these figures alone. What might be more interesting is a list of applications by institution which failed to reach the required minimum standard, but to the best of my knowledge that’s not been made public. And of course, all these figures refer only to the response mode Standard Grant applications in the financial year (not academic year) 2013-14.
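The claim that “one or two more successes would change the picture” can be made concrete. With application counts this small, the uncertainty around any one institution’s success rate is enormous – a quick sketch using the figures cited above and a Wilson score interval (a standard small-sample confidence interval for a proportion; this is my own back-of-envelope check, not anything from the report):

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    halfwidth = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - halfwidth, centre + halfwidth

# Awards/applications as cited above. Every one of these intervals contains
# the overall response-mode success rate of roughly 25%, so none of the
# apparent outliers is clearly better or worse than average.
for name, funded, apps in [("Leeds", 0, 10), ("Edinburgh", 1, 8),
                           ("Southampton", 2, 14), ("King's College", 3, 7),
                           ("Leicester", 4, 9), ("Oxford", 6, 14)]:
    lo, hi = wilson_interval(funded, apps)
    print(f"{name}: {funded}/{apps} -> 95% CI ({lo:.2f}, {hi:.2f})")
```

Even Leeds’s 0 from 10 gives an interval stretching up to about 28% – entirely compatible with an average year that happened to go badly.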
Concentration of Funding
Another interesting stat (well, true for some values of “interesting”) concerns the level of concentration of funding. The report records the expenditure levels for the top eleven (why 11, no idea…) institutions by research expenditure and by training expenditure. Interesting question for you… what percentage of the total expenditure do the top 11 institutions get? I could tell you, but if I tell you without making you guess first, it’ll just confirm what you already think about concentration of funding. So I’m only going to tell you that (unsurprisingly) training expenditure is more concentrated than research funding. The figures you can look up for yourself. Go on, have a guess, go and check (p. 44) and see how close you are.
Research Funding by Discipline
On page 40, and usually the most interesting/contentious. Overall success rate was 25% – a little down from last year, but a huge improvement on 14% two years ago.
Big winners? History (4 from 6), Linguistics (5 from 9), Social Anthropology (4 from 9), Political and International Studies (9 from 22), and Psychology (26 from 88 – just under 30% of all grants funded were in psychology). Big losers? Education (1 from 27), Human Geography (1 from 19), Management and Business Studies (2 from 22).
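To put those figures on a single scale, here’s a quick sketch computing and ranking the success rates from the numbers cited above (cited figures only – the full breakdown is on p. 40 of the report):

```python
# Grants funded / applications made, as cited above.
figures = {
    "History": (4, 6),
    "Linguistics": (5, 9),
    "Social Anthropology": (4, 9),
    "Political and International Studies": (9, 22),
    "Psychology": (26, 88),
    "Education": (1, 27),
    "Human Geography": (1, 19),
    "Management and Business Studies": (2, 22),
}

rates = {d: funded / apps for d, (funded, apps) in figures.items()}
for discipline, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{discipline:38s} {rate:5.1%}")
```

The spread is stark: roughly 67% at the top (History) down to under 4% at the bottom (Education), either side of the overall 25%.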
Has this changed much from previous years? Well, you can read what I said last year and the year before on this, but overall it’s hard to say because we’re talking about relatively small numbers for most subjects, and because some discipline classifications have changed over the last few years. But, once again, for the third year in a row, Business and Management and Education do very, very poorly.
Human Geography has also had a below-average success rate for the last few years, but going from 3 from 14 to 1 from 19 probably isn’t that dramatic a collapse – though it’s certainly a bad year. I always make a point of trying to be nice about Human Geography, because I suspect they know where I live. Where all of us live. Oh, and Psychology gets a huge slice of the overall funding, albeit not a disproportionate one given the number of applications.
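For what it’s worth, the “bad year, not a collapse” reading checks out statistically. A Fisher exact test on 1 funded from 19 this year versus 3 from 14 the year before (again, my own back-of-envelope sketch using only the two cited figures) gives a two-sided p-value of about 0.29 – well within what small-sample noise can produce:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    total = comb(n, col1)  # hypergeometric denominator

    def prob(k):
        return comb(row1, k) * comb(n - row1, col1 - k) / total

    p_obs = prob(a)
    # Sum the probabilities of all tables at least as extreme as the observed one.
    lo_k, hi_k = max(0, col1 - (n - row1)), min(row1, col1)
    return sum(prob(k) for k in range(lo_k, hi_k + 1) if prob(k) <= p_obs + 1e-12)

# This year: 1 funded of 19 applications; last year: 3 funded of 14.
p = fisher_exact_two_sided(1, 18, 3, 11)
print(f"p = {p:.3f}")  # well above conventional significance thresholds
```

In other words, with numbers this small, one bad year tells us very little on its own – it’s the repeated pattern across years that’s worth watching.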
Which kind of brings us back to the same questions I asked in my most-read-ever piece – what on earth is going on with Education and with Business and Management research, and why do they do so badly with the ESRC? I still don’t have an entirely satisfactory answer.
I’ve put together a table showing changes to disciplinary success rates over the last few years which I’m happy to share, but you’ll have to email me for a copy. I’ve not uploaded it here because I need to check it again with fresh eyes before it’s used – fiddly, all those tables and numbers.