ESRC success rates 2013/2014

The ESRC Annual Report for 2013-14 has been out for quite a while now, and a quick summary and analysis from me is long overdue.

Although I was tempted to skip straight through all of the good news stories about ESRC successes and investments and dive straight in looking for success rates, I’m glad I took the time to at least skim read some of the earlier stuff.  When you’re involved in the minutiae of supporting research, it’s sometimes easy to miss the big picture of all the great stuff that’s being produced by social science researchers and supported by the ESRC.  Chapeau, everyone.

In terms of interesting policy stuff, it’s great to read that the “Urgency Grants” mechanism for rapid responses to “rare or unforeseen events” which I’ve blogged about before is being used, and has funded work “on the Philippines typhoon, UK floods, and the Syrian crisis”.  While I’ve not been involved in supporting an Urgency Grant application, it’s great to know that the mechanism is there, that it works, and that at least some projects have been funded.

The “demand management” agenda

This is what the report has to say on “demand management” – the concerted effort to reduce the number of applications submitted, so as to increase the success rates and (more importantly) reduce the wasted effort of writing and reviewing applications with little realistic chance of success.

Progress remains positive with an overall reduction in application numbers of 41 per cent, close to our target of 50 per cent. Success rates have also increased to 31 per cent, comparable with our RCUK partners. The overall quality of applications is up, whilst peer review requirements are down.

There are, however, signs that this positive momentum may be under threat as in certain schemes application volume is beginning to rise once again. For example, in the Research Grants scheme the proposal count has recently exceeded pre-demand management levels. It is critical that all HEIs continue to build upon early successes, maintaining the downward pressure on the submission of applications across all schemes.

It was always likely that “demand management” might be the victim of its own success – as success rates creep up again, getting a grant appears more likely, and so researchers and research managers encourage and submit more applications.  Other factors might also be involved – the stage of the REF cycle, for example.  Or perhaps, now that talk of researcher or institutional sanctions has faded away, there’s less incentive for restraint.

Another possibility is that some universities haven’t yet got the message or don’t think it applies to them.  It’s also not hard to imagine that the kinds of internal review mechanisms that some of us have had for years and that we’re all now supposed to have are focusing on improving the quality of applications, rather than filtering out uncompetitive ideas.  But is anyone disgracing themselves?

Looking down the list of successes by institution (p. 41) it’s hard to pick out any obvious bad behaviour.  Most of those who’ve submitted more than 10 applications have an above-average success rate.  You’d only really pick out Leeds (10 applications, none funded), Edinburgh (1 from 8) and Southampton (2 from 14), and a clutch of institutions on 0 from 5 (including top-funded Essex, surprisingly), but in all those cases one or two more successes would change the picture.  Similarly for the top performers – King’s College (3 from 7), King Leicester III (4 from 9), Oxford (6 from 14) – it’s hard to make much of a case for the excellence or inadequacy of internal peer review systems from these figures alone.  What might be more interesting is a list of applications by institution which failed to reach the required minimum standard, but that’s not been made public to the best of my knowledge.  And of course, all these figures only refer to the response mode Standard Grant applications in the financial year (not academic year) 2013-14.
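If you want a feel for just how jumpy these small numbers are, here’s a quick and dirty sketch – my arithmetic, not the ESRC’s, using only the figures quoted above – of what a single extra award would do to each institution’s success rate.

```python
# Success rates from the figures quoted above, as (applications, awards).
# A back-of-envelope sketch, not official analysis.
institutions = {
    "Leeds": (10, 0),
    "Edinburgh": (8, 1),
    "Southampton": (14, 2),
    "King's College": (7, 3),
    "Leicester": (9, 4),
    "Oxford": (14, 6),
}

for name, (apps, awards) in institutions.items():
    rate = 100 * awards / apps
    bumped = 100 * (awards + 1) / apps  # same volume, one more success
    print(f"{name:14s} {awards}/{apps} = {rate:4.1f}%  (one more award: {bumped:4.1f}%)")
```

Leeds goes from 0% to 10% on the strength of a single decision going the other way, which is really the whole point about not reading too much into one year’s figures.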

Concentration of Funding

Another interesting stat (well, true for some values of “interesting”) concerns the level of concentration of funding.  The report records the expenditure levels for the top eleven (why 11, no idea…) institutions by research expenditure and by training expenditure.  Interesting question for you… what percentage of the total expenditure do the top 11 institutions get?  I could tell you, but if I tell you without making you guess first, it’ll just confirm what you already think about concentration of funding.  So I’m only going to tell you that (unsurprisingly) training expenditure is more concentrated than research funding.  The figures you can look up for yourself.  Go on, have a guess, go and check (p. 44) and see how close you are.

Research Funding by Discipline

This is on page 40, and it’s usually the most interesting/contentious section.  The overall success rate was 25% – a little down from last year, but a huge improvement on the 14% of two years ago.

Big winners?  History (4 from 6), Linguistics (5 from 9), Social Anthropology (4 from 9), Political and International Studies (9 from 22), and Psychology (26 from 88 – just under 30% of all grants funded were in psychology).  Big losers?  Education (1 from 27), Human Geography (1 from 19), Management and Business Studies (2 from 22).
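For anyone who wants to check my percentages, here’s the same sort of back-of-envelope sketch for the discipline figures quoted above – again, just the numbers in this paragraph, not data pulled from the report itself.

```python
# Per-discipline success rates, as (awards, applications) quoted above.
disciplines = {
    "History": (4, 6),
    "Linguistics": (5, 9),
    "Social Anthropology": (4, 9),
    "Political and International Studies": (9, 22),
    "Psychology": (26, 88),
    "Education": (1, 27),
    "Human Geography": (1, 19),
    "Management and Business Studies": (2, 22),
}

for name, (awards, apps) in disciplines.items():
    print(f"{name:36s} {awards:2d}/{apps:2d} = {100 * awards / apps:4.1f}%")
```

History comes out at 67% and Education at under 4%, which rather makes the point.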

Has this changed much from previous years?  Well, you can read what I said last year and the year before on this, but overall it’s hard to say because we’re talking about relatively small numbers for most subjects, and because some discipline classifications have changed over the last few years.  But, once again, for the third year in a row, Business and Management and Education do very, very poorly.

Human Geography has also had a below average success rate for the last few years, but going from 3 from 14 to 1 from 19 probably isn’t that dramatic a collapse – though it’s certainly a bad year.  I always make a point of trying to be nice about Human Geography, because I suspect they know where I live.  Where all of us live.  Oh, and Psychology gets a huge slice of the overall funding, albeit not a disproportionate one given the number of applications.

Which kind of brings us back to the same questions I asked in my most-read-ever piece – what on earth is going on with Education and with Business and Management research, and why do they do so badly with the ESRC?  I still don’t have an entirely satisfactory answer.

I’ve put together a table showing changes to disciplinary success rates over the last few years which I’m happy to share, but you’ll have to email me for a copy.  I’ve not uploaded it here because I need to check it again with fresh eyes before it’s used – fiddly, all those tables and numbers.

Pre-mortems: Tell me why your current grant application or research project will fail

I came across a really interesting idea the other week via the Freakonomics podcast – the idea of a project “pre-mortem”, or “prospective hindsight”.  They interviewed Gary Klein, who described it as follows:

KLEIN:  I need you to be in a relaxed state of mind.  So lean back in your chair. Get yourself calm and just a little bit dreamy. I don’t want any daydreaming but I just want you to be ready to be thinking about things. And I’m looking in a crystal ball. And uh, oh, gosh…the image in the crystal ball is a really ugly image. And this is a six-month effort. We are now three months into the effort and it’s clear that this project has failed. There’s no doubt about it. There’s no way that it’s going to succeed. Oh, and I’m looking at another scene a few months later, the project is over and we don’t even want to talk about it. And when we pass each other in the hall, we don’t even make eye contact. It’s that painful. OK. So this project has failed, no doubt about it [….] I want each of you to write down all the reasons why this project has failed. We know it failed. No doubts. Write down why it failed.

The thinking here is that such an approach to projects reduces overconfidence, and elsewhere the podcast discusses the problems of overconfidence, “go fever”, the Challenger shuttle disaster, and how cultural/organisational issues can make it difficult to bring up potential problems and obstacles.  The pre-mortem exercise might free people from that, encouraging them (as a team) to find reasons for failure and then respond to them.  I don’t do full justice to the arguments here, but you can listen to it for yourself (or read the transcript) at the link above.  It reminds me of some of the material covered in a MOOC I took, which showed how very small changes in the way that questions are posed and framed can make surprisingly large differences to the decisions that people make – so perhaps this very subtle shift in mindset might be useful.

How might we use the idea of a pre-mortem in research development?  My first thought was about grant applications.  Would it help to get the applicants to undertake the pre-mortem exercise?  I’m not sure that overconfidence is often a huge problem among research teams (a kind of grumpy, passive-aggressive form of entitled pessimism is probably more common), so perhaps groupthink-style overconfidence and excessive positivity are less of an issue than in larger project teams where nobody wants to be the one to be negative.  But perhaps there’s value in asking the question anyway, and re-focusing applicants on the fact that they’re writing an application for reviewers and for a funding body, not for themselves.  A reminder that the views, priorities, and (mis)interpretations of others are crucial to their chances of success or failure.

Would it help to say to internal reviewers “assume this project wasn’t funded – tell me why”?  Possibly.  It might flush out issues that reviewers may be too polite or insufficiently assertive to raise otherwise, and again, focuses minds on the nature of the process as a competition.  It could also help reviewers identify where the biggest danger for the application lies.

Another use would be in helping applicants risk-assess their own project.  Saying to them “you got funded, but didn’t achieve the objectives you set for yourself.  Why not?” might be a good way of identifying project risks to minimise in the management plan, or risks to alleviate through better advance planning.  It might prompt researchers to think more cautiously about the project timescale, especially around issues that are largely out of their control.

So… has anyone used anything like this before in research development?  Might it be a useful way of thinking?  Why will your current application fail?

MOOCing about: My experience of a massive open online course

I’ve just completed my first Massive Open Online Course (or MOOC), entitled ‘The mind is flat: the shocking shallowness of human psychology’, run via the Futurelearn platform.  It was led by Professor Nick Chater and PhD student Jess Whittlestone of Warwick Business School, and this was the second iteration of the course, which I understand will be running again at some point. Although teaching and learning in general (and MOOCs in particular) are off topic for this blog, I thought it might be interesting to jot down a few thoughts about my very limited experience of being on the receiving end of a MOOCing.  There’s been a lot of discussion of MOOCs which I’ve been following in a kind of half-hearted way, but I’ve not seen much (if anything) written from the student perspective.

“Alright dudes… I’m the future of higher education, apparently. Could be worse… could be HAL 9000”

I was going to explain my motivations for signing up for the course to add a bit of context, but one of the key themes of the MOOC has been the shallowness and instability of human reasons and motivations.  We can’t just reach back into our minds, it seems, and retrieve our thinking and decision making processes from a previous point in time.  Rather, the mind is an improviser, and can cobble together – on demand – all kinds of retrospective justifications and explanations for our actions which fit the known facts including our previous decisions and the things we like to think motivate us.

So my post-hoc rationalisation of my decision to sign up is probably three-fold. Firstly, I think a desire for lifelong learning and in particular an interest in (popular) psychology are things I ascribe to myself.  Hence an undergraduate subsidiary module in psychology and having read Stuart Sutherland’s wonderful book ‘Irrationality‘.  A second plausible explanation is that I work with behavioural economists in my current role, and this MOOC would help me understand them and their work better.  A third possibility is that I wanted to find out what MOOCs were all about and what it was like to do one, not least because of their alleged disruptive potential for higher education.

So…. what does the course consist of?  Well, it’s a six week course requiring an estimated five hours of time per week.  Each week-long chunk has a broad overarching theme, and consists of a round-up of themes arising from questions from the previous week, and then a series of short videos (generally between 4 and 20 minutes) either in a lecture/talking head format, or in an interview format.  Interviewees have included other academics and industry figures.  There are a few very short written sections to read, a few experiments to do to demonstrate some of the theories, a talking point, and finally a multiple choice test.  Students are free to participate whenever they like, but there’s a definite steer towards trying to finish each week’s activities within that week, rather than falling behind or ploughing ahead. Each video or page provides the opportunity to add comments, and it’s possible for students to “like” each other’s comments and respond to them.  In particular there’s usually one ‘question of the week’ where comment is particularly encouraged.

The structure means that it’s very easy to fit alongside work and other commitments – so far I’ve found myself watching course videos during half time in Champions League matches (though the half time analysis could have told its own story about the shallowness of human psychology and the desire to create narratives), last thing at night in lieu of bedtime reading, and when killing time between finishing work and heading off to meet friends.  The fact that the videos are short means that it’s not a case of finding an hour or more at a time for uninterrupted study. Having said that, this is a course which assumes “no special knowledge or previous experience of studying”, and I can well imagine that other MOOCs require a much greater commitment in terms of time and attention.

I’ve really enjoyed the course, and I’ve found myself actively looking forward to the start of a new week, and to carving out a free half hour to make some progress into the new material.  As a commitment-light, convenient way of learning, it’s brilliant.  The fact that it’s free helps.  Whether I’d pay for it or not I’m not sure, not least because I’ve learnt that we’re terrible at working out absolute value, as our brains are programmed to compare.  Once a market develops and gives me some options to compare, I’d be able to think about it.  Once I had a few MOOCs under my belt, I’d certainly consider paying actual money for the right course on the right topic at the right level with the right structure. At the moment it’s possible to pay for exams (about £120, or £24 for a “statement of participation”) on some courses, but as they’re not credit bearing it’s hard to imagine there would be much uptake. A better option might be a smaller fee for a self-printable PDF record of courses completed, especially once people start racking up course completions.

One drawback is the multiple choice method of examining/testing, which doesn’t allow much sophistication or nuance in answers.  A couple of the questions on the MOOC I completed were ambiguous or poorly phrased, and one in particular made very confusing use of “I” and “you” in a scenario question – I’d still argue (sour grapes alert) that the official “correct” answer was wrong. I can see that multiple choice is the only really viable way of having tests at the moment (though one podcast I was listening to the other day mooted the possibility of machine text analysis marking short essays, based on marks given to a sample), but I think a lot more work needs to go into developing best (and better) practice around question setting.  It’s difficult – as a research student I remember being asked to come up with some multiple choice questions about the philosophy of John Rawls for an undergraduate exam paper, and I struggled with that.  Though I did remove the one from the previous paper which asked how many principles of justice there were (answer: it depends how you count them).

But could it replace an undergraduate degree programme?  Could I imagine doing a mega-MOOC as my de facto full time job, watching video lectures, reading course notes and core materials, taking multiple choice questions and (presumably) writing essays?  I think probably not.  I think the lack of human interaction would probably drive me mad – and I say this as a confirmed introvert.  Granted, a degree level MOOC would probably have more opportunities for social interaction – Skype tutorials, better comments systems, more interaction with course tutors, local networks to meet fellow students who live nearby – but I think the feeling of disconnection, isolation, and alienation would just be too strong.  Having said that, perhaps to digital natives this won’t be the case, and perhaps compared (as our brains are good at comparing) to the full university experience a significantly lighter price tag might be attractive.  And of course, for those in developing countries or unable or unwilling to relocate to a university campus (for whatever reason), it could be a serious alternative.

But I can certainly see a future that blends MOOC-style delivery with more traditional university approaches to teaching and learning.  Why not restructure lectures into shorter chunks and make them available online, at the students’ convenience?  There are real opportunities to bring in extra content with expert guest speakers, especially industry figures, world leading academic experts, and particularly gifted and engaging communicators.  It’s not hard to imagine current student portals (Moodle, Blackboard etc) becoming more and more MOOC-like in terms of content and interactivity.  In particular, I can imagine a future where MOOCs offer opportunities for extra credit, or for non-credit bearing courses for students to take alongside their main programme of study.  These could be career-related courses, courses that complement their ‘major’, or entirely hobby or interest based.

One thought that struck me was whether it was FE rather than HE that might be threatened by MOOCs.  Or at least the Adult Ed/evening classes aspect of FE.  But I think even there the desire to – say – learn Spanish is only one motivation – another is often to meet new people and to learn together, and I don’t think that’s an itch that MOOCs are entirely ready to scratch. But I can definitely see a future for MOOCs as the standard method of continuing professional development in any number of professional fields, whether these are university-led or not. This has already started to happen, with a course called ‘Discovering Business in Society‘ counting as an exemption towards one paper of an accounting qualification.  I also understand that Futurelearn are interested in pilot schemes for the use of MOOCs by 16-19 year olds to support learning outcomes in schools.

It’s also a great opportunity for hobbyists and dabblers like me to try something new and pursue other intellectual interests.  I can certainly imagine a future in which huge numbers of people are undertaking a MOOC of one kind or another, with many going from MOOC to MOOC and building up quite a CV of virtual courses, whether for career reasons, personal interest, or a combination of both.  Should we see MOOCs as the next logical and interactive step from watching documentaries? Those who today watch Horizon and Timewatch and, well, most of BBC4, might in future carry that interest forward to MOOCs.

So perhaps rather than seeing MOOCs in terms of what they’re going to disrupt or displace or replace, we’re better off seeing them as something entirely new.

And I’m starting my next MOOC on Monday – Cooperation in the contemporary world: Unlocking International Politics led by Jamie Johnson of the University of Birmingham.  And there are several more that look tempting… How to read your boss from colleagues at the University of Nottingham, and England in the time of Richard III from – where else – the University of Leicester.

Adam Golberg announces new post about Ministers inserting themselves into research grant announcements

“You might very well think that as your hypothesis, but I couldn’t possibly comment”

Here’s something I’ve been wondering recently.  Is it just me, or have major research council funding announcements started to be made by government ministers, rather than by the, er, research councils?

Here’s a couple of examples that caught my eye from the last week or so. First, David Willetts MP “announces £29 million of funding for ESRC Centres and Large Grants“.  Thanks Dave!  To be fair, he is Minister of State for Universities and Science.  Rather more puzzling is George Osborne announcing “22 new Centres for Doctoral Training“, though apparently he found the money as Chancellor of the Exchequer.  Seems a bit tenuous to me.

So I had a quick look back through the ESRC and EPSRC press release archives to see whether the prominence of government ministers in research council funding announcements was a new thing or not.  Because I hadn’t noticed it before.  With the ESRC, it is new.  Here’s the equivalent announcement from last year, in which no government minister is mentioned.  With the EPSRC, it’s been going on for longer.  This year’s archive and the 2013 archive show government ministers (mainly Willetts, sometimes Cable or Osborne) front and centre in major announcements.  In 2012 they get a name check, but normally in the second or third paragraph, not in the headline, and they don’t get a picture of themselves attached to the story.

Does any of this matter? Perhaps not, but here’s why I think it’s worth mentioning.  The Haldane Principle is generally defined as “decisions about what to spend research funds on should be made by researchers rather than politicians”.  And one of my worries is that in closely associating political figures with funding decisions, the wrong impression is given.  Read the recent ESRC announcement again, and it’s only when you get down to the ‘Notes for Editors’ section that there’s any indication that there was a competition, and you have to infer quite heavily from those notes that decisions were taken independently of government.

Why is this happening? It might be for quite benign reasons – perhaps research council PR people think (probably not unreasonably) that name-checking a government minister gives them a greater chance of media coverage. But I worry that it might be for less benign reasons related to political spin – seeking credit and basking in the reflected glory of all these new investments, which to the non-expert eye look to be something novel, rather than research council business as usual.  To be fair, there are good arguments for thinking that the current government does deserve some credit for protecting research budgets – a flat cash settlement (i.e. cut only by the rate of inflation each year) is less good than many wanted, but better than many feared. But it would be deeply misleading if the general public were to think that these announcements represented anything above and beyond the normal day-to-day work of the research councils.
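If “flat cash” is a new term for you, the erosion is easy to see with a toy calculation – note that the 2.5% inflation rate below is purely illustrative, my assumption rather than a figure from the report or the settlement.

```python
# What "flat cash" means in real terms. The inflation rate is an
# illustrative assumption, not a figure from the report or the settlement.
budget = 100.0     # index value of the budget in year 0
inflation = 0.025  # assumed annual inflation

for year in range(1, 6):
    real_value = budget / (1 + inflation) ** year
    print(f"Year {year}: the same cash is worth {real_value:.1f} in year-0 terms")
# After five years, flat cash buys roughly 12% less.
```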

Jo VanEvery tells me via Twitter that ministerial announcements are normal practice in Canada, but something doesn’t quite sit right with me about this, and it’s not a party political worry.  I feel there’s a real risk of appearing to politicise research.  If government claims credit, it’s reasonable for the opposition to criticise… now that might be the level of investment, but might it extend to the investments chosen?  Or do politicians know better than to go there for cheap political points?

Or should we stop worrying and just embrace it? It’s not clear that many people outside of the research ‘industry’ notice anyway (though the graphene announcement was very high profile), and so perhaps the chances of the electorate being misled (about this, at least) are fairly small.

But we could go further.  MEPs to announce Horizon 2020 funding? Perhaps Nick Clegg should announce the results of the British Academy/Leverhulme Small Grants Scheme, although given the Victorian origins of the investments and wealth supporting the work of the Leverhulme Trust, perhaps the honour should go to the ghosts of Gladstone or Disraeli.

Six writing habits I reckon you ought to avoid in grant applications…

There are lots of mistakes to avoid in writing grant applications, and I’ve written a bit about some of them in previous posts (see the “advice on grant applications” link above).  This one is more about writing habits.  I read a lot of draft grant applications, and as a result I’ve got an increasingly long list of writing quirks, tics, habits, styles and affectations that Get On My Nerves.

Imagine I’m a reviewer… Okay, I’ll start again… imagine I’m a proper reviewer with some kind of power and influence… imagine further that I’ve got a pile of applications to review that’s as high as a high pile of applications.  Imagine how well disposed I’d feel towards anyone who makes reading their writing easier, clearer, or in the least bit more pleasant.  Remember how the really well-written essays make your own personal marking hell a little bit less sulphurous for a short time.  That.  Whatever that tiny burst of goodwill – or antibadwill – is worth, you want it.

The passive voice is excessively used

I didn’t know the difference between active and passive voice until relatively recently, and if you’re also from a generation where grammar wasn’t really teached in schools then you might not either.  Google is your friend for a proper explanation by people who actually know what they’re talking about, and you should probably read that first, but my favourite explanation is from Rebecca Johnson – if you can add “by zombies”, then it’s passive voice. I’ve also got the beginnings of a theory that the Borg from Star Trek use the passive voice, and that’s one of the things that makes them creepy (“resistance is futile” and “you will be assimilated”), but I don’t know enough about grammar or Star Trek to make a case for this.   Sometimes the use of the passive voice (by zombies) is appropriate, but often it makes for distant and slightly tepid writing.  Consider:

A one day workshop will be held (by zombies) at which the research findings will be disseminated (by zombies).  A recording of the event will be made (bz) and posted on our blog (bz).  Relevant professional bodies will be approached (bz)…

This will be done, that will be done.  Yawn.  Although, to be fair, a workshop with that many zombies probably won’t be a tepid affair.  But much better, I think, to take ownership… we will do these things, co-Is A and B will lead on X.  Academic writing seems to encourage depersonalisation and formality and distancing (which is why politicians love it – “mistakes were made [perhaps by zombies, but not by me]”).

I think there are three reasons why I don’t like it.  One is that it’s just dull.  A second is that I think it can read like a way of avoiding detail or specifics or responsibility for precisely the reasons that politicians use it, so it can subconsciously undermine the credibility of what’s being proposed.  The third reason is that I think for at least some kinds of projects, who the research team are – and in particular who the PI is – really matters.  I can understand the temptation to be distant and objective and sciency as if the research speaks entirely for itself.  But this is your grant application, it’s something that you ought to be excited and enthused by, and that should come across. If you’re not, don’t even bother applying.
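If you wanted to automate the zombie test, a very rough version might look something like the sketch below – a crude heuristic that flags “to be” plus something that looks like a past participle, and nothing remotely like a real grammar checker.

```python
import re

# A toy version of the "by zombies" test: flag "to be" + past participle,
# which is where "by zombies" can usually be inserted. Crude heuristic only --
# real passive-voice detection needs proper parsing, not a regex.
PASSIVE = re.compile(
    r"\b(am|is|are|was|were|been|being|be|will be)"
    r"\s+(\w+ed|done|made|held|given|taken|written)\b",
    re.IGNORECASE,
)

def zombie_test(text):
    """Return the phrases where 'by zombies' could plausibly be appended."""
    return [" ".join(match.groups()) for match in PASSIVE.finditer(text)]

print(zombie_test(
    "A one day workshop will be held at which the research findings "
    "will be disseminated. A recording of the event will be made."
))
# ['will be held', 'will be disseminated', 'will be made']
```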

First Person singular, First Person plural, Third Person

Pat Thomson’s blog Patter has a much fuller and better discussion about the use of  “we” and “I” in academic writing that I can’t really add much to. But I think the key thing is to be consistent – don’t be calling yourself Dr Referstoherselfinthethirdperson in one part of the application, “I” in another, “the applicant” somewhere else, and “your humble servant”/ “our man in Havana” elsewhere.  Whatever you choose will feel awkward, but choose a consistent method of awkwardness and have done with it. Oh, and don’t use “we” if you’re the sole applicant.  Unless you’re Windsor (ii), E.

And don’t use first names for female team members and surnames for male team members.  Or, worse, first names for women, titles and surnames for men. I’ve not seen this myself, but I read about it in a tweet with the hashtag #everydaysexism.

Furthermore and Moreover…

Is anyone willing to mount a defence for the utility of either of these words, other than (1) general diversity of language and (2) padding out undergraduate essays to the required word count? I’m just not sure what either of these words actually means or adds, other than perhaps as an attempted rhetorical flourish, or, more likely, a way of bridging non-sequiturs or propping up poor structuring.

“However” and “Yet”…. I’ll grudgingly allow to live.  For now.

Massive (Right Justified) Wall-o-Text

Few things make my heart sink more than having to read a draft application that regards the use of paragraphs and other formatting devices as illustrative of a lack of seriousness and rigour. There is a distinction between densely argued and just dense.  Please make it easier to read… and that means not using right hand justification.  Yes, it has a kind of superficial neatness, but it makes the text much less readable.

Superabundance of Polysyllabic Terminology

Too many long words. It’s not academic language and (entirely necessary) technical terms and jargon that I particularly object to – apart from in the lay summary, of course.  It’s a general inflation of linguistic complexity – using a dozen words where one will do, never using a simple word where a complex one will do, never making your point twice when a rhetorically-pleasing triple is on offer.

I guess this is all done in an attempt to make the application or the text seem as scholarly and intellectually rigorous as possible, and I think students may make similar mistakes.  As an undergraduate I think I went through a deeply regrettable phase of trying to ape the style of academic papers in my essay writing, and probably made myself sound like one of the most pompous nineteen-year-olds on the planet.

If you find yourself using words like “effectuate”, you might want to think about whether you might be guilty of this.

Sta. Cca. To. Sen. Ten. Ces.

Varying and manipulating sentence length can be done deliberately to produce certain effects.  Language has a natural rhythm and pace.  Most people probably have some awareness of what that is.  They are aware that sentences which are one paced can be very dull.  They are aware that there is something tepid about this paragraph.  But not everyone can feel the music in language.  I think it is a lack of commas that is killing this paragraph.  Probably there is a technical term for this.
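I don’t know the technical term either, but the one-paced effect is easy enough to make visible – here’s a throwaway sketch counting words per sentence in the middle of that paragraph.

```python
# Word counts per sentence for the deliberately one-paced passage above.
# A throwaway sketch; the naive split on ". " is fine for this one example.
paragraph = (
    "Most people probably have some awareness of what that is. "
    "They are aware that sentences which are one paced can be very dull. "
    "They are aware that there is something tepid about this paragraph. "
    "But not everyone can feel the music in language."
)

lengths = [len(sentence.split()) for sentence in paragraph.split(". ")]
print(lengths)  # [10, 13, 11, 9] -- every sentence in the same narrow band
```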

So… anyone willing to defend “moreover” or “furthermore”? Any particularly irritating habits I’ve missed?  Anyone who actually knows some grammar or linguistics care to provide technical terms for any of these habits?