ESRC success rates by discipline for 2012-13

Update: 2013/14 figures here.

[Image: a pot of gold at the end of a rainbow]

With all of the fanfare of a cat-burglar slipping in through a first-floor window into the back office of a diamond museum, the ESRC has published its Vital Statistics for 2012-13, including the success rates by academic discipline.  I’ve been looking forward to seeing these figures to see if there’s been any change since last year’s figures, which showed huge variations in success rates between different disciplines – from 1 in 68 for Business and Management and 2 in 62 for Education to 7 of 18 for Socio-Legal Studies.

The headline news, as trumpeted in the Times Higher, is that success rates are indeed up, and that “demand management” appears to be working.  Their table shows how applications, the amount of money distributed, and success rates have varied over the last few years, and has figures for all of the research councils.  For the ESRC, the numbers in their Vital Statistics document (315 applications, 27% success rate) are slightly different from those in the Times Higher table (310, 26%), possibly because some non-university recipients have been excluded.  The overall picture is hugely encouraging and a great improvement on last year’s 14% success rate.  And it’s also worth repeating that these figures don’t seem to include the Knowledge Exchange scheme, which now has a 52% success rate.  That success rate is apparently too high, as the scheme is going to end in March next year, to be replaced with a scheme of passing funding directly to institutions based on their ESRC funding record – similar to the EPSRC scheme, which also delegates responsibility for running impact/knowledge exchange schemes to universities.

For the ESRC, “demand management” measures so far have largely consisted of:
(i) Telling universities to stop submitting crap applications (I paraphrase, obviously…..)
(ii) Telling universities that they have to have some kind of internal peer review process
(iii) Threatening some kind of researcher sanctions if (i) and (ii) don’t do the trick.

And the message appears to have been getting through.  Though I do wonder how much of this gain comes from eliminating “small” research grants – up to £100k – which I think in recent times had a worse success rate than Standard Grants, though that wasn’t always the case historically.  Although it’s more work to process and review applications for four pots of £100k than for one of £400k, the loss of Small Grants is to be regretted, as it’s now very difficult indeed to get funding for social science projects with a natural size of £20k-£199k.

But what you’re probably wondering is how your academic discipline got on this time round.  Well, you can find this year’s and last year’s Vital Statistics documents hidden away in a part of the ESRC’s website that even I struggle to find, and I’ve collated them for easy comparison purposes here.  But the figures aren’t comparing like with like – the 2011/12 figures included the last six months of the old Small Grants Scheme, which distorts things.  It’s also difficult (obviously) to make judgements based on small numbers which probably aren’t statistically significant. Also, in the 2011-12 figures there were 43 applications (about 6% of the total) which were flagged as “no lead discipline”, which isn’t a category this year.  But some overall trends have emerged:

  • Socio-legal Studies (7 from 18, 3 from 8), Linguistics (6 from 27, 5 from 15) and Social Anthropology (5 from 18, 4 from 5) have done significantly better than the average for the last two years
  • Business and Management (1 from 68, 2 from 17) and Education (2 from 62, 2 from 19) continue to do very poorly.
  • Economics, and Economic and Social History did very well the year before last, but much less well this year.
  • Psychology got one-third of all the successes last year, and over a quarter the year before, though the success rate is only very slightly above average in both years.
  • No projects have been funded in the last two years in Environmental Planning or Science and Technology Studies.
  • Demography (2 from 2) and Social Work (3 from 6) had their first projects funded since 2009/10.

Last year I speculated briefly about what the causes of these differences might be and looked at success rates in previous years, and much of that is still relevant.  Although we should welcome the overall rise in success rates, it’s still the case that some academic subjects do consistently better than others with the ESRC.  While we shouldn’t expect to see exactly even success rates, when some consistently outperform the average, and some under-perform, we ought to wonder why that is.

Meanwhile, over at the ESRC…

There have been a few noteworthy developments at the ESRC over the summer months which I think are probably worth drawing together into a single blog post for those (like me) who’ve made the tactical error of choosing to have some time off over the summer.

1.  The annual report

I’ve been looking forward to this (I know, I know….) to see whether there’s been any substantial change to the huge differences in success rates between different academic disciplines.  I wrote a post about this back in October and it’s by some distance the most read article on my blog.  Have there been any improvements since 2011/12, when Business and Management had 1 of 68 applications funded and Education 2 of 62, compared to Socio-Legal Studies (39%, 7 of 18) and Social Anthropology (28%, 5 from 18)?

Sadly, we still don’t know, because this information is nowhere to be found in the annual report. We know the expenditure by region and the top 11 (sic) recipients of research expenditure, research and training expenditure, and the two combined.  But we don’t know how this breaks down by subject.  To be fair, that information wasn’t published until October last year, and so presumably it will be forthcoming.  And presumably the picture will be better this year.

That’s not to say that there’s no useful information in the annual report. We learn that the ESRC Knowledge Exchange Scheme has a very healthy success rate of 52%, though I think I’m right in saying that the scheme will have been through a number of variations in the period in question. Historically it’s not been an easy scheme to apply for, partly because of the need for co-funding from research partners, and partly because of a number of very grey areas around costing rules.

For the main Research Grants Scheme, success rates are also up, though by how much is unclear.  The text of the report (p. 18) states that

After a period where rates plummeted to as low as 11 per cent, they have now risen to 35 per cent, in part because we have committed additional funding to the scheme [presumably through reallocation, rather than new money] but also because application volume has decreased. This shows the effects of our demand management strategy, with HEIs now systematically quality assuring their applications and filtering out those which are not ready for submission. We would encourage HEIs to continue to develop their demand management strategies as this means academics and administrators in both HEIs and the ESRC have been able to focus efforts on processing and peer-reviewing a smaller number of good quality applications, rather than spending time on poor quality proposals which have no chance of being funded.

Oddly, the accompanying table gives a 27% success rate, and unfortunately (at the time of writing) the document with success rates for individual panel meetings hasn’t been updated since April 2012, and the individual panel meeting documents list only funded projects, not success rates.  But whatever the success rate is, it does appear to be a sign that “demand management” is working and that institutions are practising restraint in their application habits.  Success rates of between a quarter and a third sound about right to me – enough applications to allow choice, but not so many as to be a criminal waste of time and effort.

The report also contains statistics about the attendance of members at Council and Audit Committee Meetings, but you’ll have to look them up for yourself as I have a strict “no spoilers” policy on this blog.

I very much look forward – and I think the research community does too – to seeing the success rates by academic discipline at a later date.

2. A new Urgency Grants Mechanism

More good news…. a means by which research funding decisions can be taken quickly in response to the unexpected and significant.  The example given is the riots of summer 2011, and I remember thinking that someone would get a grant out of all this as I watched TV pictures of my former stomping ground of Croydon burning.  But presumably less… explosive unexpected opportunities might arise too.  All this seems only sensible, and provides a way for urgent requests to be considered in a timely and transparent manner.

3. ESRC Future Research Leaders call

But “sensible” isn’t a word I’d apply to the timing of this latest call.  First you’ve heard of it?  Well, better get your skates on because the deadline is the 24th September. Outline applications?  Expressions of interest?  Nope, a full application.  And in all likelihood, you should probably take your skates off again because chances are that your institution’s internal deadlines for internal peer review have already been and gone.

The call came out on or about the 23rd July, with a deadline of 24th September. Notwithstanding what I’ve said previously about no time of the academic year being a good time to get anything done, it’s very hard to understand why this happened.  Surely the ESRC know that August/September is when a lot of academic staff (and therefore research support) are away from the university on a mixture of annual leave and undertaking research.  Somehow, institutions are expected to cobble together a process of internal review and institutional support, and individuals are expected to find time to write the application.  It’s hard enough for the academics to write the applications, but if we take the demand management agenda seriously, we should be looking at both the track record and the proposed project of potential applicants, thinking seriously about mentoring and support, and having difficult conversations with people we don’t think are ready.  That needs a lot of senior time, and a lot of research management time.

This scheme is a substantial investment – effectively 70 projects worth up to £250k (at 80% fEC).  Given that the Small Grants scheme has gone and British Academy Fellowship success rates are tiny, this is really the major opportunity to be PI on a substantial project.  This scheme is overtly picking research leaders of the future, but the timetable means that it’s picking those leaders from those who didn’t have holiday booked in the wrong couple of weeks, or who could clear their diaries to write the application, or who don’t have a ton of teaching to prepare for – which rules out most early career academics, I would imagine.

Now it might be objected that we should have known that the call was coming.  Well…. yes and no.  The timing was similar last year, and it was tight then, but it’s worse this year – the call was announced on about the same date, but with a deadline of 4th October, almost two working weeks later.  Two working weeks that turn it from a tall order into something nigh-on impossible, and which can only favour those with lighter workloads in the run-up to the new academic year.  And even knowing that it’s probably coming doesn’t help.  Do we really expect people to make holiday plans around when a particular call might come out?  Really?  If we must have a September deadline, can we know about it in January?  Or even earlier?  To be fair, the ESRC has got much better with pre-call announcements of late, at least for very narrow schemes, but this really isn’t good enough.

I also have a recollection (backed up by a quick search through old emails, but not by documentary evidence) that last year the ESRC were talking about changing the scheme for this year, possibly with multiple deadlines or even going open call.  Surely, I remember thinking, this start-of-year madness can only be a one-off.

Apparently not.

Is there a danger that research funding calls are getting too narrow?

The ESRC have recently added a little more detail to a previous announcement about a pending call for European-Chinese joint research projects on Green Economy and Population Change.  Specifically, they’re after projects which address the following themes:

Green Economy

  • The ‘greenness and dynamics of economies’
  • Institutions, policies and planning for a green economy
  • The green economy in cities and metropolitan areas
  • Consumer behaviour and lifestyles in a green economy

Understanding Population Change

  • changing life course
  • urbanisation and migration
  • labour markets and social security dynamics
  • methodology, modelling and forecasting
  • care provision
  • comparative policy learning

Projects will need to involve institutions from at least two of the participating European countries (UK, France (involvement TBC), Germany, Netherlands) and two institutions in China.  On top of this is an expectation that there will be sustainability/capacity building around the research collaborations, plus the usual further plus points of involving stakeholders and interdisciplinary research.

Before I start being negative, or potentially negative, I have one blatant plug and some positive things to say. The blatant plug is that the University of Nottingham has a campus in Ningbo in China which is eligible for NSFC funding and therefore would presumably count as one Chinese partner. I wouldn’t claim to know all about all aspects of our Ningbo research expertise, but I know people who do.  Please feel free to contact me with ideas/research agendas and I’ll see if I can put you in touch with people who know people.

The positive things.  The topics seem to me to be important, and we’ve been given advance notice of the call and a fair amount of time to put something together.  There’s a reference to Open Research Area procedures and mechanisms, which refers to agreements between the UK, France, Netherlands and Germany on a common decision making process for joint projects in which each partner is funded by their national funder under their own national funding rules.  This is excellent, as it doesn’t require anyone to become an expert in another country’s national funder’s rules, and doesn’t have the double or treble jeopardy problem of previous calls where decisions were taken by individual funders.  It’s also good that national funders are working together on common challenges – this adds fresh insight, invites interesting comparative work and pools intellectual and financial resources.

However, what concerns me about calls like this is that the area at the centre of the particular Venn diagram of this call is really quite small.  It’s open to researchers with research interests in the right areas, with collaborators in the right European countries, with collaborators in China.   That’s two – arguably three – circles in the diagram.  Of course, there’s a fourth – proposals that are outstanding.  Will there be enough strong competition on the hallowed ground at the centre of all these circles? It’s hard to say, as we don’t know yet how much money is available.

I’m all for calls that encourage, incentivise, and facilitate international research.  I’m in favour of calls on specific topics which are under-researched, which are judged of particular national or international importance, or where co-funding from partners can be found to address areas of common interest.

But I’m less sure about having both in one call – very specific requirements both in terms of the nationality of the partner institutions and in terms of the call themes.  Probably the scope of this call is wide enough – presumably the funders think so – but I can’t help thinking that less onerous eligibility requirements in terms of partners could lead to greater numbers of high quality applications.

ESRC “demand management” measures working….. and why rises and falls in institutions’ levels of research funding are not news

There was an interesting snippet of information in an article in this week’s Times Higher about the latest research council success rates.

 [A] spokeswoman for the ESRC said that since the research council had begun requiring institutions from June 2011 to internally sift applications before submitting them, it had recorded an overall success rate of 24 per cent, rising to 33 per cent for its most recent round of responsive mode grants.  She said that application volumes had also dropped by 37 per cent, “which is an encouraging start towards our demand management target of a 50 per cent reduction” by the end of 2014-15.

Back in October last year I noticed what I thought was a change in tone from the ESRC, which gave the impression that they were more confident that institutions had taken note of the shot across the bows of the “demand management” measures consultation exercise(s), and that perhaps asking for greater restraint in putting forward applications would be sufficient.  I hope it is, because the current formal demand management proposals – which will be implemented if required – unfairly and unreasonably include co-applicants in any sanction.

I’ve written before (and others have added very interesting comments) about how I think we arrived at the situation where social science research units were flinging in as many applications as possible in the hope that some of them would stick.  And I hope the recent improvements in success rates to around 1-in-4 or 1-in-3 don’t serve to re-encourage this kind of behaviour.  We need long-term, sustainable, careful restraint in terms of what applications are submitted by institutions to the ESRC (and other major funders, for that matter) and the state in which they’re submitted.

Everyone will want to improve the quality of applications, and internal mentoring and peer review and the kind of lay review that I do will assist with that, but we also need to make sure that the underlying research idea is what I call ‘ESRC-able’.  At Nottingham University Business School, I secured agreement a while ago now to introduce a ‘proof of concept’ review phase for ESRC applications, where we review a two page outline first, before deciding whether to give the green light for the development of a full application.  I think this allows time for changes to be made at the earliest stage, and makes it much easier for us to say that the idea isn’t right and shouldn’t be developed than if a full application was in front of us.

And what isn’t ‘ESRC-able’?  I think a look at the assessment schema gives some useful clues – if you can’t honestly say that your application would fit in the top two categories on the final page, you probably shouldn’t bother.  ‘Dull but worthy’ stuff won’t get funded, and I’ve seen the phrase “incremental progress” used in referees’ comments to damn with faint praise.  There’s now a whole category of research that is of good quality and would doubtless score respectably in any REF exercise, but which simply won’t be competitive with the ESRC.  This, of course, raises the question about how non-groundbreaking stuff gets funded – the stuff that’s more than a series of footnotes to Plato, but which builds on and advances the findings of ground-breaking research by others.  And to that I have no answer – we have a system which craves the theoretically and methodologically innovative, but after a paradigm has been shifted, there’s no money available to explore the consequences.

*     *     *     *     *

Also in the Times Higher this week is the kind of story that appears every year – some universities have done better this year at getting research funding/with their success rates than in previous years, and some have done worse.  Some of those who have done better and worse are the traditional big players, and some are in the chasing pack.  Those who have done well credit their brilliant internal systems and those who have done badly will contest the figures or point to extenuating circumstances, such as the ending of large grants.

While one always wants to see one’s own institution doing well and doing better, and everyone always enjoys a good bit of schadenfreude at the expense of their rivals – sorry, benchmark institutions – and any apparent difficulties that a big beast finds itself in, are any of these short term variations of actual, real, statistical significance?  Apparent big gains can be down to a combination of a few big wins, grants transferring in with new staff, and just… well… the kind of natural variation you’d expect to see.  Big losses could be big grants ending, staff moving on, and – again – natural variance.  Yes, you could ascribe your big gains to your shiny new review processes, but would you also conclude that there’s a problem with those same processes and people the year after, when performance is apparently less good?
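
To put a rough number on that intuition, here’s a toy simulation of my own – the parameters (award counts, grant sizes) are entirely made up for illustration, not drawn from any real council data.  If an institution’s annual total is driven by a handful of awards of very uneven size, sizeable year-on-year swings appear even when nothing real has changed:

```python
import random

random.seed(42)  # fixed seed so the toy example is reproducible

def one_year_total(mean_awards=6, mean_value=350_000):
    """Toy model of one year's grant income: a small, variable number of
    awards, with skewed sizes (mostly modest, occasionally large)."""
    # Approximate a Poisson award count by summing 100 Bernoulli trials.
    n_awards = sum(random.random() < mean_awards / 100 for _ in range(100))
    # Exponentially distributed grant sizes give the long right tail.
    return sum(random.expovariate(1 / mean_value) for _ in range(n_awards))

totals = [one_year_total() for _ in range(10)]
swings = [abs(totals[i] - totals[i - 1]) / totals[i - 1]
          for i in range(1, len(totals)) if totals[i - 1] > 0]
if swings:
    print("Typical year-on-year swing: {:.0%}".format(sum(swings) / len(swings)))
```

Even with a constant underlying “quality”, the simulated totals bounce around substantially from year to year – which is the point: a single good or bad year tells you very little.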

Why these short term (and mostly meaningless) variations are more newsworthy than the radical variation in ESRC success rates for different social science disciplines, I have no idea….

ESRC success rates by discipline: what on earth is going on?

Update – read this post for the 2012/13 stats for success rates by discipline

The ESRC have recently published a set of ‘vital statistics’ which are “a detailed breakdown of research funding for the 2011/12 financial year” (see page 22).  While differences in success rates between academic disciplines are nothing new, this year’s figures show some really quite dramatic disparities which – in my view at least – require an explanation and action.

The overall success rate was 14% (779 applications, 108 funded) for the last tranche of responsive mode Small Grants and responsive mode Standard Grants (now Research Grants).  However, Business and Management researchers submitted 68 applications, of which 1 was funded.  One.  One single funded application.  In the whole year.  For the whole discipline.  Education fared little better, with 2 successes out of 62.

Just pause for a moment to let that sink in.  Business and Management.  1 of 68.  Education.  2 of 62.

Others did worse still.  Nothing for Demography (4 applications), Environmental Planning (8), Science and Technology Studies (4), Social Stats, Computing and Methods (11), or Social Work (10).  However, with a 14% success rate working out at about 1 in 7, low volumes of applications may explain this.  It’s rather harder to explain a total of 3 applications funded from 130.
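
As a back-of-the-envelope check of my own (not anything the ESRC has published), you can ask how likely 3 or fewer successes from 130 applications would be if those two disciplines had been funded at the overall 14% rate:

```python
from math import comb

def binom_tail_leq(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Business and Management plus Education: 3 funded from 130 applications,
# tested against the overall 14% success rate.
p_low = binom_tail_leq(3, 130, 0.14)
print(f"P(3 or fewer successes from 130 at 14%) = {p_low:.1e}")
```

The probability comes out far below one in a thousand, so pure bad luck is not a plausible explanation – though, since projects are categorised by lead discipline only, the raw counts may not tell the whole story.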

Next least successful were ‘no lead discipline’ (4 of 43) and Human Geography (3 from 32).  No other subjects had success rates in single figures.  At the top end were Socio-Legal Studies (a stonking 39%, 7 of 18) and Social Anthropology (28%, 5 from 18), with Linguistics, Economics, and Economic and Social History also having hit rates over 20%.  Special mention for Psychology (185 applications, 30 funded, 16% success rate), which scored the highest number of projects – almost as many as Sociology and Economics (the second and third most funded) combined.

Is this year unusual, or is there a worrying and peculiar trend developing?  Well, you can judge for yourself from this table on page 49 of last year’s annual report, which has success rates going back to the heady days of 06/07.  Three caveats, though, before you go haring off to see your own discipline’s stats.  One is that the reports refer to financial years, not academic years, which may (but probably doesn’t) make a difference.  The second is that the figures refer to Small and Standard Grants only (not Future Leaders/First Grants, Seminar Series, or specific targeted calls).  The third is that funded projects are categorised by lead discipline only, so the figures may not tell the full story as regards involvement in interdisciplinary research.

You can pick out your own highlights, but it looks to me as if this year is only a more extreme version of trends that have been going on for a while.  Last year’s Education success rate?  5%.  The years before?  8% and 14%.  Business and Management?  A heady 11%, compared to 10% and 7% for the preceding years.  And you’ve got to go all the way back to 09/10 to find the last time any projects were funded in Demography, Environmental Planning, or Social Work.  And Psychology has always been the most funded, and has always got about twice as many projects as the second and third subjects, albeit from a proportionately large number of applications.

When I have more time I’ll try to pull all the figures together in a single spreadsheet, but at first glance many of the trends seem similar.

So what’s going on here?  Well, there are a number of possibilities.  One is that Socio-Legal Studies research in this country is tip-top, and B&M and Education research is comparatively very weak.  Certainly I’ve heard it said that B&M research tends to suffer from poor research methodologies.  Another possibility is that some academic disciplines are very collegiate and supportive in nature, and scratch each other’s backs when it comes to funding, while other disciplines are more back-stabby than back-scratchy.

But are any or all of these possibilities sufficient to explain the difference in funding rates?  I really don’t think so.  So what’s going on?  Unconscious bias?  Snobbery?  Institutional bias?  Politics?  Hidden agendas?  All of the above?  Anyone know?

More pertinently, what do we do about it?  Personally, I’d like to see the appropriate disciplinary bodies putting a bit of pressure on the ESRC for some answers, some assurances, and the production of some kind of plan for addressing the imbalance.  While no-one would expect to see equal success rates for every subject, this year’s figures – in my view – are very troubling.

And something needs to be done about it, whether that’s a re-thinking of priorities, putting the knives away, addressing real disciplinary weaknesses where they exist, ring-fenced funding, or some combination of all of the above.  Over to greater minds than mine…..