How useful is reading examples of successful grant applications?

This article is prompted by a couple of Twitter conversations around a Times Higher Education article which quotes Ross Mounce, founding editor of Research Ideas and Outcomes, who argues for open publication at every stage of the research process, including (successful and unsuccessful) grant applications. The article acknowledges that this is likely to be controversial, but it got a few of us thinking about the value of reading other people’s grant applications to improve one’s own.

I’m asked about this a lot by prospective grant applicants – “do you have any examples of successful applications that you can share?” – and while I’ll generally supply them if I have access to them, I also add substantial caveats and health warnings about their use.

The first and perhaps most obvious worry is that most schemes change and evolve over time, and what works for one call might not work in another. Even if the application form hasn’t changed substantially, funder priorities – both hard priorities and softer steers – may have changed. And even if neither has changed, competitive pressures and improved grant writing skills may well be raising the bar, and an application that got funded – say – three or four years ago might not get funding today. Not necessarily because the project is weaker, but because the exposition and argument would now need to be stronger. This is particularly the case for impact – it’s hard to imagine that many of the impact sections on RCUK applications written in the early days of impact would pass muster now.

The second, and more serious, worry is that potential applicants take the successful grant application far too seriously and far too literally. I’ve seen smart, sensible, sophisticated people become obsessed with a successful grant application and try to copy everything about it, whether relevant or not, as if there were some mystical secret encoded into the text, and any subtle deviation would prevent the magic from working. Things like… the exact balance of the application, the tables/diagrams used or not used (“but the successful application didn’t have diagrams!”), the referencing system, the font choice, the level of technical detail, the choice and exposition of methods, whether there are critical friends and/or a steering group, the number of Profs on the bid, the amount of RA time, the balance between academic and stakeholder impact.

It’s a bit like a locksmith borrowing someone else’s front door key, making as exact a replica as she can, and then expecting it to open her front door too. Or a bit like taking a recipe that you’ve successfully followed and using it to make a completely different dish by changing the ingredients while keeping the cooking processes the same. Is it a bit like cargo cult thinking? Attempting to replicate an observed success or desired outcome by copying everything around it as closely as possible, without sufficient reflection on cause and effect? It’s certainly generalising inappropriately from a very small sample size (often n=1).

But I think – subject to caveats and health warnings – it can be useful to look at previously successful applications from the same scheme. I think it can sometimes even be useful to look at unsuccessful applications. I’ve changed my thinking on this quite a bit over the last few years; I used to steer people away from them much more strongly. I think they can be useful in the following ways:

  1. Getting a sense of what’s required. It’s one thing seeing a blank application form and list of required annexes and additional documents, it’s another seeing the full beast. This will help potential applicants get a sense of the time and commitment that’s required, and make sensible, informed decisions about their workload and priorities and whether to apply or not.
  2. It also highlights all of the required sections, so no requirement of the application should come as a shock. Increasingly with the impact agenda it’s a case of getting your ducks in a row before you even think about applying, and it’s good to find that out early.
  3. It makes success feel real, and possible, especially if the grant winner is someone the applicant knows, or who works at the same institution. Low success rates can be demoralising, but it helps to know not only that someone, somewhere is successful, but that someone here and close by has been successful.
  4. It does set a benchmark in terms of the state of readiness, detail, thoroughness, and ducks-in-a-row-ness that the attentive potential applicant should aspire to at least equal, if not exceed. Early-draft and early-stage research applications often have larger or smaller pockets of vagueness and are often held together with a generous helping of fudge. Successful applications should show what’s needed in terms of clarity and detail, especially around methods.
  5. Writing skills. Writing grant applications is a very different skill to writing academic papers, which may go some way towards explaining why the Star Wars error in grant writing is so common. So it’s going to be useful to see examples of that skill used successfully… but having said that, I have a few examples in my library of successes which were clearly great ideas, but which were pretty mediocre as examples of how to craft a grant application.
  6. Concrete ideas and inspiration. Perhaps about how to use social media, or ways to engage stakeholders, or about data management, or other kinds of issues, questions and challenges – if (and only if) they’re also relevant for the new proposal.

So on balance, I think reading relevant (same funder and scheme), recent, and highly rated (even if not successful) funding applications can help prospective applicants… provided that they remember that what they’re reading and drawing inspiration from is a different application from a different team to do different things for different reasons at a different time.

And not a mystical, magical, alchemical formula for funding success.