So… yeah… this post is a bit more personal and a lot more off topic than usual. And yes, it is mostly a build-up to a request for sponsorship. Sit tight, though… I’ll have a load more post-embargo content that originally appeared in Research Professional to post over the coming weeks and months.
“So, how are you doing, Adam?” “Good news and bad news, really.” “Which do you want to tell me about first?” “Both at the same time… I don’t have cancer… any more.”
In the middle of a house move, I found a lump where there probably shouldn’t have been a lump. The following day, I see a GP who agrees… that’s a lump where there shouldn’t be a lump. The day after I’m in for a blood test… the following week I’m seeing a specialist. Blood test negative (a good sign), lump is smooth and spherical (good sign), but it’s inside the testicle (bad sign). I know I’m in good hands when my specialist answers my question about what she thinks it probably is… she says she doesn’t know. Because she doesn’t. It takes confidence to admit that. Time for more tests.
The coolest person in the hospital – the Ultrasound Guy – does know. That it’s almost certainly a tumour… lots of indications that it is, nothing to indicate that it isn’t. The ultrasound guy is so ultra-sound that he’s holding clinics on Saturdays to catch up with a backlog. Phone call with the specialist a few days later confirms it… my right testicle and I will have to undergo a conscious uncoupling. Just over two weeks later – to allow for self-isolation and a negative COVID test – I’m in for surgery.
It’s a day procedure… it’s not going to be fun, but I’ve had a more serious operation before. If I can get through that, I can get through this. This too shall pass. Every decade or so, part of my body rebels against me and an example needs to be set pour encourager les autres. So it goes. It is known. At least I’ll not forget where I was when I heard the news about the death of the Duke of Edinburgh. Recovery is… slow and complicated by a post-op infection, eventually antibiotick-ed.
Week and a half later, and I’m back in for a CT scan. This time a backlog-clearing early evening appointment, held in a clinic on the sprawling, construction-scarred and largely deserted Nottingham City Hospital campus. The clinic is behind the archetypal door marked “Beware of the Leopard”, but eventually I find someone to take pity on me and give me directions. I’m late, flustered, and embarrassed, but fortunately my lateness is in perfect synchronization with their overrunningness. Also, they’re used to people being late, flustered, and embarrassed. They’re all very lovely to me, and the scanning isn’t nearly as bad as they’d led me to believe it might be.
Two weeks later, and I’m back for the results. And breathe. It’s good news. CT scan normal, biopsy shows that the tumour was small (22mm) and hadn’t spread. I had another blood test on the day, and that came back normal too. No chemo, as the marginal benefit isn’t worth the risk. There’s a very good chance the cancer won’t return, and if it does, there’s a very good chance it’s treatable. I’ll be under observation for five years or so. To paraphrase, if I absolutely insist on getting cancer, testicular cancer is the one to get. And if I absolutely insist on getting testicular cancer, get the type of testicular cancer I had, and seek medical attention immediately.
This post has been a bit flippant and contained some black humour, which is one way of coping and of making sense of things. Truth is, this was a very worrying time. There were always back-up options – if the cancer had spread, it would very probably have been very treatable. But then again, the lump had turned out to be a tumour rather than a cyst, so the odds had already gone against me once. They could do so again.
Why am I telling you all this?
Point One. Get your weird lumps and bumps checked out. Doing so will make them real, force you to drag that background ignore-able worry into the foreground where it’s harder to ignore. But think of it as consolidating your worries into a single manageable payment. It’s probably not cancer… it’s quite unlikely to be cancer. But if you check it out, you can forget about it.
Perhaps I should have started this blog post with the story of the time when I went to see my GP about a little lump on my back. Textbook cyst, said the GP. She was right. But she said I was right to come and see her about it, and I always should. True, I’ve got (nearly) all the privilege there is going, but I’m told that my experience is pretty common. Check everything out early. It’s better for you and it’s better for the NHS. The best time to get anything checked out is early. The second best time is now.
Point Two. All hail the NHS. Eight weeks from finding a lump to a result and an action plan. Eight weeks. In the middle of a global pandemic, folks. Cost to me: nil, bar some prescription charges. Everyone is utterly lovely to me… Dr G, the GP who got me seen quickly at the beginning and antibiotic-ed me at the end. The Consultant willing to say she didn’t know. The ultrasound guy, who broke difficult news to me when he could have left it to the consultant. The whole surgical team. Everyone on oncology. In a US-style healthcare system, I dread to think about what this would have cost. I dread to think about how the cost of future cover would have restricted my professional and personal options.
Point Three. We’ve made huge progress in cancer research. For my particular flavour of cancer the research for treatments seems to have been largely done. But we’ve all lost people to cancer, many far, far too young. I lost a member of my extended family earlier this year. Shortly after I came round from surgery, I heard that a friend had died. His funeral took place on the morning of the afternoon when I received my results. He was a genuinely superb human being on every level and by every metric and I wish I’d known him better.
So we’re not yet where we need to be with cancer research. Not close. And the pandemic has been a real kick on the… teeth… for medical research in general. Medical research charities have had their vital fundraising very seriously hit. Charity shops? Shut. Mass participation events, like the London Marathon? Cancelled. Not just the London Marathon, but your local city marathon or half marathon or 10k… all those sponsored walks or Races for Life? All gone. Charities are struggling to honour existing research commitments, never mind fund vital new research.
If you’re wondering whether this is a build up to me asking for sponsorship for my latest act of folly, then yes, yes it is. Although… if there’s a charity that means more to you, and if you can afford it, support them instead. Or as well as. I don’t mind – the whole charity sector is struggling.
Long before my diagnosis, I’d arranged to take on the full, continuous Peak District Challenge, along with friends from my undergraduate and postgraduate days. It’s a 100km (62 miles) walk through the Peak District. Organisers estimate a finish time of 20–36 hours. Sensible people attempt this over two days. We’re not sensible.
I’ve run a marathon before. Seven. But this is a very different kind of challenge. Weirdly, I’d feel happier if I were running rather than walking. All my marathons have been over inside four hours… 100km is an endurance challenge in a different league to anything I’ve tried before. Also, marathons don’t tend to have the Peak District in the way.
COVID has restricted our opportunities to train and prepare, especially on the right kind of terrain. Plus, you know… I’ve not been very well. It’s going to be an uphill struggle, and that includes the parts that are downhill. Go on, chuck us a few quid. Please? The link is to my friend John’s page, as we’re pooling our fundraising efforts. If anything I’ve written/tweeted has ever been any use to you, go on.
Hello and welcome to a reflective piece written by someone in a position of relative privilege in academia during a time of collapse and crisis. Written by someone who knows no more than you do about how best to cope with or understand it, and quite possibly substantially less.
So why am I writing? Out of an attempt to take stock of where we’re at as honestly as I can, without succumbing to the twin temptations of false hope of some brighter new dawn or the consolations of cynicism.
After a little reflection, this is what I want to tell you…
Kindness is everything
I don’t know who you are but listen, you’re doing really well. You probably know the saying by now: “you’re not working from home, you’re working at home in a pandemic”. And it’s true—you’re being tested in all kinds of ways, I’m sure. It’s so easy to focus on what we feel we’re not doing well that we completely take for granted the things we are doing well. This is the basis of imposter syndrome where we think of our own talents and achievements as mundane but regard those of others as vastly superior. See also: the Dunning-Kruger Effect.
We’ve got to be kinder to ourselves, as well as to others. I’ve been carrying a bit of residual guilt around because it feels like very little of the burden of the current crisis has fallen upon my shoulders. I have been able to continue working in physical safety and—in spite of some spicy days and weeks—with a manageable workload that doesn’t pose a serious risk to my mental wellbeing. As I don’t have children, I’ve not had pressures of home schooling.
It’s good to be aware that others have it tougher, to be willing to help, to show kindness and concern and empathy and consideration. To have a sense of proportion. But it’s a mistake to minimise or even discount the things that we’re finding difficult. The things we’re missing.
The fact that other people are in much, much more pain than I am doesn’t mean it hurts any less when I stub my toe. And I won’t make my toe feel any better by berating myself for being in pain and for not wanting to be in pain.
It’s easy to focus on those from whom extraordinary efforts are required during these extraordinary times, to compare ourselves to that extraordinary standard, and judge ourselves harshly. But if you’re anything like me, by this stage you’ve probably normalised a lot of the restrictions that all of us are asked to live under. Not seeing family and friends, severely curtailed leisure activities, having to adapt to remote working and so on. It’s all so [makes screaming sound] and this is the new normal.
But we should not forget that we’re all contributing. If you’re following whatever the guidelines are today, you’re contributing. I have always been ill-suited for a healthcare career due to my squeamishness and clumsiness, so perhaps I should not compare my contribution to theirs. A lot is being asked of each and every one of us even if you feel – as I do – your burden is lighter. From each according to their ability, and so on.
Won’t get fooled again
Kindness—for others, for ourselves—should be the order of the day but what is stopping it becoming the order of the everyday?
There’s a temptation to think that things must be different, will have to be better after this crisis. We should be aware that powerful forces will want to put things back more or less where they were before it ever happened (see the last financial crisis) or in even crueller positions (ibid). I’ve listened to a few podcasts discussing the post-1945 political settlement in the UK and the birth of the welfare state, and it’s clear that none of that happened by accident or overnight. A lot of work went into preparing the ground and preparing the arguments and policy solutions.
If we want to “build back better” (sorry) in academia, we need to think creatively, we need to share ideas, we need to prepare the ground for radical ideas. We need to shift the Overton Window.
For one thing, we can’t do better in academia without confronting our structural inequalities. And I am sorry. Yes, this is another white, middle-aged, middle-class, heterosexual, cisgender man telling everyone what he thinks about equality issues. I understand the scepticism. But in my defence there’s only one thing worse than all that: someone who is all those things and yet doesn’t think about equality issues.
Over the summer I listened to a Hidden Brain podcast on ‘Playing Favourites’ which includes a story about a Yale academic who received markedly better treatment for a hand injury once the doctors discovered that she worked at Yale. And a story about an academic who agreed to an interview she would usually decline just because the journalist had been at the same university at the same time. Both the doctor and the academic could come away from their respective interactions feeling a warm glow as they’d both done something nice for someone else that they didn’t have to.
But as the academic in that second story—Mahzarin Banaji—said: “I think that kind of act of helping towards people with whom we have some shared group identity is really the modern way in which discrimination likely happens.”
Which leaves me to ask: who gets my standard service, and who gets my above-and-beyond, my extra mile? Who gets one last extra read of their proposal? Who gets a meeting rather than an email? Who gets a longer meeting? Whose request gets the quickest response? This year my challenge to myself is (a) to keep an eye on who I find I want to do favours for; and (b) to look to do more favours for members of disadvantaged/under-represented groups who may not have had their share of favours in the past. I invite you to join me. My preliminary conclusion is that I tend to privilege the pushy because I’m a people pleaser. I should do better.
My one piece of advice
I’ve only got one bit of proper, real advice for researchers and research professionals and it has got nothing to do with research or academia and it is, I am sorry, only relevant to those privileged enough not to be shielding. Go for a walk outside. Or a run, or a cycle. If you can, you should. You won’t regret it. I seldom regret going for a run, and I never regret going for a walk. Around the park, around the block, whatever. Listen to nature or the streetscape, or put in your headphones, listen to your happy tunes at top volume or your favourite podcast, and stride purposefully like you’re five minutes late for a meeting on the other side of campus.
You may or may not feel better afterwards. But at least you’ll have been for a walk.
You’re applying for UK research council funding and suddenly you’re confronted with massive overhead costs. Adam Golberg tries to explain what you need to know.
Trying to explain Full Economic Costing is not straightforward. For
current purposes, I’ll be assuming that you’re an academic applying for UK
Research Council funding; that you want to know enough to understand your
budget; and that you don’t really want to know much more than that.
If you do already know a lot about costing or research finances, be warned – this article contains simplifications, generalisations, and omissions, and you may not like it.
What are Full Economic Costs, and why are they taking up so much of my budget?
Full Economic Costs (fEC) are paid as part of UK Research and Innovation grants to cover a fair share of the wider costs of running the university – the infrastructure that supports your research. There are a few different cost categories, but you don’t need to worry about the distinctions.
Every UK university calculates its own overhead rates using a common methodology. I’m not going to try to explain how this works, because (a) I don’t know; and (b) you don’t need to know. Most other research funders (charities, EU funders, industry) do not pay fEC for most of their schemes. However, qualifying peer-reviewed charity funding does attract a hidden overhead of around 19% through QR funding (the same source as REF funding). But it’s so well hidden that a lot of people don’t know about it. And that’s not important right now.
How does fEC work?
In effect, this methodology produces a flat daily overhead rate to be charged
relative to academic time on your project. This rate is the same for the time
of the most senior professor and the earliest of early career researchers.
One effect of this is to make postdoc researchers seem proportionally more expensive. Senior academics are more expensive because of higher employment costs (salary etc.), but the overheads generated by both will be the same. Don’t be surprised if the overheads generated by a full-time researcher are greater than her employment costs.
All fEC costs are calculated at today’s rates. Inflation and increments
will be added later to the final award value.
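To make the flat-rate point concrete, here’s a minimal sketch in Python. Every figure is invented purely for illustration – it is not any university’s real rate, and real rates vary by institution and year:

```python
# Flat daily overhead rate: the same for a senior professor and the most
# junior postdoc. All numbers below are invented for illustration only.
DAILY_OVERHEAD_RATE = 150  # pounds per academic day on the project

def overheads(academic_days: int) -> int:
    """Overheads are driven purely by academic time, not seniority."""
    return academic_days * DAILY_OVERHEAD_RATE

# A professor at 20% FTE for a year vs a full-time postdoc (220 working days)
prof_overhead = overheads(44)       # 6,600
postdoc_overhead = overheads(220)   # 33,000

# With an illustrative postdoc employment cost of 32,000, the overheads
# she generates exceed her employment costs - the effect described above.
postdoc_employment_cost = 32_000
assert postdoc_overhead > postdoc_employment_cost
```

The point of the sketch is just that the overhead line scales with days of academic time and nothing else, which is why a full-time researcher can generate more in overheads than she costs in salary.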
Do we have to charge fEC overheads?
Yes. This is a methodology that all universities use to make sure that
research is funded properly, and there are good arguments for not undercutting
each other. Rest assured that everyone – including your competitors – is
playing by the same rules and ends up with broadly comparable rates. Reviewers
are not going to be shocked by your overhead costs compared to rival bids. Your
university is not shooting itself (or you) in the foot.
There are fairness reasons not to waive overheads. The point of Research
Councils is to fund the best individual research proposals regardless of the
university they come from, while the REF (through QR) funds broad,
sustained research excellence based on historical performance. If we start
waiving overheads, wealthier universities will have an unfair advantage as they
can waive while others drown.
Further, the budget allocations set by funders are decided with fEC overheads in mind. They’re expecting overhead costs. If your project is too expensive for the call, the problem is with your proposal, not with overheads. Either it contains activities that shouldn’t be there, or there’s a problem with the scope and scale of what you propose.
However, there are (major) funding calls where “evidence of institutional
commitment” is expected. This could include a waiver of some overheads, but
more likely it will be contributions in kind – some free academic staff time, a
PhD studentship, new facilities, a separate funding stream for related work.
Different universities have different policies on co-funding and it probably
won’t hurt to ask. But ask early (because approval is likely to be complex) and
have an idea of what you want.
What’s this 80% business?
This is where things get unnecessarily complicated. Costs are calculated
at 100% fEC but paid by the research councils at 80%. This leaves the remaining
20% of costs to be covered by the university. Fortunately, there’s usually enough money from overheads to cover the missing 20% of direct costs. However, if you have a lot of non-pay costs and relatively little academic staff time, check with your costings team that the project is still affordable.
Why 80%? In around 2005 it was deemed ‘affordable’ – a compromise figure intended to make a significant contribution to university costs but without breaking the bank. Again, you don’t need to worry about any of this.
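Here’s a hypothetical sketch of how that 80% plays out – invented figures again, and a deliberate simplification; the real balance depends on your institution’s rates and the project’s cost mix:

```python
# How the 80% fEC rule plays out - purely illustrative figures.
def university_shortfall(direct_costs: float, overheads: float) -> float:
    """Return the university's net shortfall on a project.

    The research council pays 80% of everything. The overhead income the
    university receives (80% of overheads) is what's notionally available
    to plug the unpaid 20% of direct costs. Positive result means the
    university is out of pocket.
    """
    unpaid_direct_costs = 0.20 * direct_costs
    overhead_income = 0.80 * overheads
    return unpaid_direct_costs - overhead_income

# Staff-heavy project: plenty of academic time generates plenty of overhead,
# so the unpaid 20% of direct costs is comfortably covered (negative result).
staff_heavy = university_shortfall(direct_costs=80_000, overheads=60_000)

# Equipment-heavy project: big non-pay costs but little academic time means
# little overhead income - a positive shortfall, and exactly the situation
# where you'd check with your costings team.
equipment_heavy = university_shortfall(direct_costs=200_000, overheads=15_000)

assert staff_heavy < 0 < equipment_heavy
```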
Can I game the fEC system, and if so, how?
Academic time is what drives overheads, so reducing academic time reduces
overheads. One way to do this is to think about whether you really need as much
researcher time on the project. If you really need to save money, could
contracts finish earlier or start later in the project?
Note that non-academic time (project administrators, managers,
technicians) does not attract overheads, and so is good value for money under
this system. If some of the tasks you’d like your research associate to do are
project management/administration tasks, your budget will go further if you
cost in administrative time instead.
However, if your final application has unrealistically low amounts of academic time and/or costs in administrators to do researcher roles, the panel will conclude that either (a) you don’t understand the resource implications of your own proposal; or (b) a lack of resources means the project risks being unable to achieve its stated aims. Either way, it won’t be funded. Funding panels are especially alert for ‘salami projects’ which include lots of individual co-investigators for thin slivers of time in which the programme of research cannot possibly be completed. Or for undercooked projects which put too much of a burden on not enough postdoc researcher time. As mentioned earlier, if the project is too big for the call budget, the problem is with your project.
The best way to game fEC is not to worry about it. If you have support with your research costings, you’ll be working with someone who can cost your application and advise you on where and how it can be tweaked and what costs are eligible. That’s their job – leave it to them, trust what they tell you, and use the time saved to write the rest of the application.
Thanks to Nathaniel Golden (Nottingham Trent) and Jonathan Hollands (University of
Nottingham) for invaluable comments on earlier versions of this article. Any
errors that remain are my own.
The relentless drive for research excellence has created a culture in modern science that cares exclusively about what is achieved and not about how it is achieved.
As I speak to people at every stage of a scientific career, although I hear stories of wonderful support and mentorship, I’m also hearing more and more about the troubling impact of prevailing culture.
People tell me about instances of destructive hyper-competition, toxic power dynamics and poor leadership behaviour – leading to a corresponding deterioration in researchers’ wellbeing. We need to cultivate, reward, and encourage the best while challenging what is wrong.
We know that Wellcome has helped to create this focus on excellence. Our aim has rightly been to support research with the potential to benefit society. But I believe that we now also have an important role to play in changing and improving the prevailing research culture. A culture in which, however unintentionally, it can be hard to be kind.
If we want science to be firing on all cylinders, we need everyone in the research system – individuals, institutions and funders – working in step to foster a positive working culture.
Which leads me to wonder what the role of research development and other research support professionals should be in moving towards a more positive research culture. I don’t know the answer, and this post is an open invitation to share your thoughts. I’ll pull these together into a crowd-sourced post with credit for those who want it and anonymity for those who don’t. This approach seemed to work well for a previous post around supporting a new academic discipline, so perhaps it will work here too.
I don’t want to say too much in this post, but as I’m asking others I should at least share a few indicative thoughts about areas to think about.
We should look at our own profession, our own culture, and how we treat each other. In my time in research development I’ve generally found it to be a supportive profession, both internally within the universities where I’ve worked, and (especially) externally through ARMA. However, I’m white, male, heterosexual, middle-aged, middle class, so I’m very much playing on ‘easy mode‘. I don’t get mistaken for an administrator, and either I’m super diplomatic and great at influencing and persuading, or I get taken more seriously by some people because of my jackpot of categories of privilege. As I’ve alluded to on this blog before, I do have a slight stammer and have written about the challenges that can cause me, but it has seldom held me back and I don’t think it’s affected how I’m perceived.
In terms of our own profession and our own behaviour, the phrase “be the change you want to see in the world” came to mind. Although… when I went to google to find out who said it, I found an interesting blog post arguing that Mahatma Gandhi (to whom it is usually attributed) said and meant something rather different and much more challenging. It’s not simply about living our values, but reflecting on them and changing ourselves where necessary. As a philosopher by training I also thought about Aristotle and his writings on the importance of character and virtues – if you nurture the right character and the right virtues, the chances that you’ll respond in the right way when tested or under pressure will be higher. But how do we do that? Practice, reflection, courage, and learning from the example of others, both positive and negative.
Less esoterically, a second category of issues is around our role in supporting research and researchers, especially around grant-getting and grant-writing activity. Competition for funding, low success rates, increasingly long and complicated application forms, and pressure from university management all form part of research culture. While we rarely have formal power or authority over academic staff, we do have a measure of influence on research culture. So how do we use that influence and our roles for good? What’s our role in preventing research excellence coming at the expense of those who make it happen – which includes us, in our small way? I’ll kick things off with three issues I’ve been thinking about recently…
Firstly, forwarding funding opportunities and supporting applications. When I send funding opportunities onto academics, am I guilty of unconscious bias? Am I committing the availability error and just emailing the first people who come to mind? Does that mean some people with certain characteristics are more likely to receive those emails than others? Does unconscious bias affect how I respond to tentative enquiries about opportunities, or about how I divide my time between proposals?
Honest answer is that I don’t know. But I’ve been influenced by the pushback against ‘manels’ (all male panels at conferences)… and if my funding opportunity distribution list looks like a manel, especially a white manel (because intersectionality is key) I’m taking time to stop and think about who I might have missed. Sometimes structural inequalities or call specifics mean that I got it right first time, but it’s worth a check.
Secondly, what’s our role around workload and work life balance? Could we do more to minimise the burden on researchers at all levels of seniority? Partly this is around efficiency and systems and processes, but partly I think there are cultural issues to consider too. I recently had a discussion with organisers of a research network which ran funding calls about the appropriateness of having a deadline of (something like) 23:59 on Sunday evening. The argument was that academics preferred this because it gave them more time than, say a Friday 4:00pm deadline. But it’s time over a weekend, and arguably this increases the expectation that academics work weekends. When do we set our internal deadlines for various tasks, from REF reviews to internal peer reviews to internal deadlines for draft applications? Do we assume that academic colleagues will be working weekends?
Thirdly, when we advise on the staffing of research projects, are we creating good jobs with fair salaries and training and career development opportunities? The issue of ‘good jobs’ on research projects (for academics and managers/administrators) was something that Wellcome brought up at a visit I attended a few weeks ago. I have to admit that under cost pressures on UKRI applications, there’s a strong incentive to try to cut researcher time as much as possible to reduce both employment costs and overheads. Of course, we should never over-cost for any post for any funder, but it’s likely that I’ve had a role in creating (potential) jobs that are lower quality than they might otherwise be.
That’s probably enough for now – this was supposed to be a short post. But this is an open invitation to email me with any thoughts you have about challenges we face, or steps we might take, in responding to the Wellcome Trust’s challenge to reimagine how we do research. I’ll be sharing this invitation via the ARMA Research Development email list and via Twitter for greater international reach.
I’ve recently moved from a role supporting the Business School and the School of Economics to a central role at the University of Nottingham, looking after our engagement with research charities. I’m going from a role where I know a few corners of the university very well to a role where I’m going to have to get to know more about much more of it.
My academic background (such as it is) is in political
philosophy and for most of my research development career I’ve been supporting
(broadly) social sciences, with a few outliers. I’m now trying to develop my
understanding of academic disciplines that I have little background or
experience in – medical research, life sciences, physics, biochemistry etc. I
suspect the answer is just time, practice, familiarity, confidence (and
Wikipedia), but I found myself wondering if there are any short cuts or
particularly good resources to speed things up.
Fortunately, if you’re a member of ARMA, you’re never on your own, and I sent an email around the Research Development Special Interest Group email list, with a promise (a) to write up contributions as a blog post and (b) to add some hints and tips of my own, especially for the social sciences.
So here goes… the collated and collected wisdom of the SIG… bookmark this post and revisit it if your remit changes…
Don’t panic… and focus on what you can do
In my original email, the first requirement I suggested was ‘time’, and that’s been echoed in a lot of the responses. “Time, practice, familiarity, confidence (and Wikipedia)” as Chris Hewson puts it. It’s easy to be overwhelmed by a sea of new faces and names and an alphabet soup of new acronyms – and to regard other people’s hard-won institutional/school/faculty knowledge as some kind of magical superpower.
Lorna Wilson suggests that disciplinary differences are overrated and “sometimes the narrative of ‘difference’ is what makes things harder. The skills and expertise we have as research development professionals are transferable across the board, and I think that the silos of disciplines led to a silo-ing of roles (especially in larger universities). With the changes in the external landscape and push with more challenge-led interdisciplinary projects, the silos of disciplines AND of roles I think is eroding.”
But there are differences in practices and norms – there are differences in terminology, outlook, career structures, internal politics, norms, and budget sizes – and I’m working hard trying not to carry social science assumptions with me. Though perhaps I’m equally likely to be too hesitant to generalise from social science experience where it would be entirely appropriate to do so.
Rommany Jenkins has “moved from Arts and Humanities to Life Sciences” and thinks that while “the perception might be that it’s the harder direction to go in because of the complexity of the subject matter […] it’s probably easier because the culture is quite straightforward […] although there are differences between translational / clinical and basic, the principles of the PI lab and team are basically the same”. She thinks that perhaps “it’s more of a culture shock moving into Arts and Humanities, because people are all so independently minded and come at things from so many different directions and don’t fit neatly into the funding boxes. […] I know a lot of people just find it totally bizarre that you can ask a Prof in Arts what they need in terms of costings and they genuinely don’t know.”
Charlotte Johnson moved in the opposite direction, from science to arts. “The shortcut was trying to find commonalities in how the different disciplines think and prepare their research. Once you realise that an artist and a chemist would go about planning their research project very similarly, and they only start to diverge in the experimental/interpretation stage, it does actually make it all quite easy to understand“
Muriel Swijghuisen Reigersberg says that her contribution “tends to be not so much on the science front, but on the social and economic or policy and political implications of the work STEMM colleagues are doing and recommendations around impact and engagement or even interdisciplinary angles to enquiries for larger projects.”
My colleague Liz Humphreys makes a similar (and very reassuring) point about using the same “skills to assess any bid by not focusing on the technical things but focus on all the other usual things that a bid writer can strengthen”. A lay summary that doesn’t make any lay sense is an issue regardless of discipline, as is a summary that doesn’t summarise but reads more like an introduction. Getting good at reviewing research grants can transcend academic disciplines. “If someone can’t explain to me what they’re doing,” says Claire Edwards, “then it’s unlikely to convince reviewers or a panel.”
Kate Clift makes a similar point: “When I am working in a discipline which is alien to me I tend to try and ground the proposed research in something which I do understand so I can appreciate the bigger picture, context etc. I will ask lots of ‘W’ questions – Why is it important? What do you want to do? Who is going to do it? Less illuminating to me in these situations is HOW they are going to do it”.
Roger Singleton Escofet makes the very sensible point that some subjects are very theoretical, “where you will always struggle to understand what is being proposed”. I certainly found this with Economics – I could hope to understand what a proposed project did, but how it worked would always be beyond me. It reminds me a bit of this Armstrong and Miller sketch, in which they demonstrate how not to do public engagement in theoretical physics.
Ann Onymous-Contributor says that “multidisciplinary projects are the best way to ease yourself into other disciplines and their own specific languages. My background is in social sciences but because of the projects I have worked on I have experience of, and familiarity with a range of arts and hard science disciplines and the languages they use. Broad, shallow knowledge accumulated on this basis can be very useful; sometimes specific disciplinary knowledge is less important than understanding connections between different disciplines, or the application of knowledge, which typically also tend to be the things which specialists miss.” I think this is a really good point – if we allow ourselves to include the other disciplines that we’ve supported as part of interdisciplinary bids, we may find we’ve more experience than we thought.
Finding the Shallow End, Producing your Cheat Sheet
Lorna Wilson suggests “[h]aving a basic understanding” of methodologies in different disciplines, “helps to demonstrate how [research questions] are answered and hypotheses evidenced, and I think breaks through some of the ‘difference’. What makes things slightly more difficult is also accessibility, in terms of language of disciplines, we could almost do with a cheat sheet in terms of terms!”
Richard Smith suggests identifying academics in the field who are effective and willing communicators “who appreciate the benefits and know the means of conveying approaches and fields to non-experts… and do it with enthusiasm”. Harry Moriarty’s experience has been that ECRs and PhD students are often a particularly good source – many are more willing to engage, and perhaps have more to gain from our advice and support.
Muriel Swijghuisen Reigersberg suggests attending public lectures (rather than expert seminars), which will be aimed at the generalist, and notes that expert-novice conversations will benefit the academic expert in terms of practising explanations of complex topics to a generalist audience. I think we can all recognise academics who enjoy talking about their work to non-specialists and have a gift for explanations, and those who don’t, haven’t, or both.
Other non-academic colleagues can help too, Richard argues – especially impact and public or business engagement staff working in that area, but also admin staff and School managers. Sanja Vlaisavljevic wanted to “understand how our various departments operate, not just in terms of subject-matter but the internal politics”. This is surely right – I’m sure we’re all aware of historical disagreements or clashes between powerful individuals or whole research groups/Schools that stand in the way of certain kinds of collaboration or joint working. Whether we work to try to erode these obstructions or navigate deftly around them, we need to know that they’re there.
Caroline Moss-Gibbons adds librarians to the list, citing their resource guides and access/role with the university repository. Claire Edwards observes that many research development staff have particular academic backgrounds that might be useful.
Don’t try to fake it till you make it
“Be open that you’re new to the area, but if they’re looking for funding they need to be able to explain their research to a non-specialist” says Jeremy Barraud.
I’ve always found that a full, frank, and even cheerful confession of a lack of knowledge is very effective. I often include a blank slide in presentations to illustrate what I don’t know. My experience is that admitting what I don’t know earns me a better hearing on matters that I do know about (as long as I do both together), but I’m aware that as a straight, white, middle aged, middle class male perhaps that’s easier for me to do. I’ve suspected for some time now that being male (and therefore less likely to be mistaken for an “administrator”) means I’m probably playing research development on easy mode. There’s an interesting project around EDI and research development that I’m probably not best placed to do.
While no-one is arguing for outright deception, I’ve heard it argued that frank admissions of ignorance about a particular topic area may make it harder to engage academic colleagues and to find out more. If academic colleagues make certain assumptions about your background, perhaps try to live up to those with a bit of background reading. It’s easy to be written off and written out, which then makes it harder to learn later.
I always think half the battle is convincing academic colleagues that we’re on their side and the side of their research (rather than, say, motivated by university income targets or an easier life), and perhaps it’s easy to underestimate the importance of showing an interest and a willingness to learn. Asking intelligent, informed, interested lay questions of an expert – alongside demonstrating our own expertise in grant writing etc – is one way to build relationships. My own experience with my MPhil is that research can be a lonely business, and so an outsider showing interest and enthusiasm – rather than their eyes glazing over and disengaging – can be really heartening.
Kate Clift makes an important point about combining any admissions of relative ignorance with a stress on what she can do/does know/can contribute. “I’m always very upfront with people and say I don’t have an understanding of their research but I do understand how to craft a submission – that way everyone plays to their strengths. I can focus on structure and language and the academic can focus on scientific content.”
Find a niche, get involved, be visible
For Jeremy Barraud, that was being secretary for an ethics committee. In my early days with Economics, it was supporting the production of the newsletter and writing research summaries – even though it wasn’t technically part of my remit, it was a great way to get my name known, get to know people, and have a go at summarising Economics working papers.
Suzannah Laver is a research development manager in a Medical School, but has a background in project management and strategy rather than medicine or science. For her it was “just time” and getting involved “[a]ttending the PI meetings, away days, seminars, and arranging pitching events or networking events.” Mary Caspillo-Brewer adds project inception meetings and dissemination events to the list, and also suggests attending academic seminars and technical meetings (as does Roger Singleton Escofet), even if they’re aimed at academics. This is great in terms of visibility and in terms of evidence of commitment – sending a message that we’re interested and committed, even if we don’t always entirely understand.
Mark Smith suggests visiting research labs or clinics, however terrifying they may first appear. So far I’ve only met academics in their offices – I’m not sure I trust myself anywhere near a lab. I’m still half-convinced I’ll knock over the wrong rack of test tubes and trigger a zombie epidemic. But lab visits are perhaps something I could do more of in the future when I know people better. And as Mark says, taking an interest is key.
Do your homework
I’ve blogged before about the problems with the uses and abuses of successful applications, but Nat Golden is definitely onto something when he suggests reading successful applications to look at good practice and what the particular requirements of a funder are. Oh, and reading the guidance notes.
Roger Singleton Escofet (and others) have mentioned that the Royal Society and Royal Academy of Engineering produce useful reports that “may be technical but offer good overviews on topical issues across disciplines. Funders such as research councils or Wellcome may also be useful sources since funders tend to follow (or set) the emerging areas.” Hilary Noone also suggests looking to the funders for guidance – trying to “understand the funder’s real meaning (crucial for new programmes and calls where they themselves are not clear on what they are trying to achieve)”.
There’s a series of short ‘Bluffer’s Guide’ books which are somewhat dated, but potentially very useful. Bluff your way in Philosophy was on my undergraduate reading list. Bluff your way in Economics gave me an excellent grounding when my role changed, and explained (among many other things) the difference between exogenous and endogenous factors. When supporting a Geography application, I learned the difference between pluvial and fluvial flooding. These little things make a difference, and it’s probably the absence of that kind of basic grounding in many of the disciplines that I’m now supporting that’s making me feel uneasy. In a good way.
Harry Moriarty argues that it’s more complicated than just reading Wikipedia – the work he supported “was necessarily at the cutting edge and considerably beyond the level that I could get to in a sensible order – I had to take the work and climb back through the Wikipedia pages in layers, and then, once I had some underpinning knowledge, go back through the same pages in light of my new understanding”.
Specific things to do
“Become an NIHR Public Reviewer”, says Jeremy Barraud. “It’s easy to sign up and they’re keen to get more reviewers. Being on the other side of the funding fence gives a real insight into how decisions are reached (and bolsters your professional reputation when speaking with researchers).”
I absolutely second this – I’ve been reviewing for NIHR for some time and just finished a four-year term as a patient/public representative on an RfPB panel. I’d recommend doing this not just to gain experience of new research areas, but as a valuable public service that you as a research development professional can perform. If you’ve got experience of a health condition, using NHS services (as a patient or carer), and you’re not a healthcare professional or researcher, I’m sure they’d love to hear from you.
Being a research participant, argues Jeremy Barraud, is “professionally insightful and personally fulfilling. The more experience you have on research in all its different angles, the better your professional standing”. This is also something I’ve done – in many ways it’s hard not to get involved in research if you’re hanging around a university. I’m part of a study looking at running and knee problems, and I’ve recently been invited to participate in another study.
Bonhi Bhattacharya, a mathematician by training, registered for a MOOC (Massive Open Online Course) – an “Introduction to Ecology” – “and it was immensely helpful in getting a grounding in the subject, as well as a useful primer in terminology.” It can be a bit of a time commitment, but they’re also fascinating – and as above, it really shows willing. I wrote about my experience with a MOOC on behavioural economics in a post a few years ago. Bonhi also suggests reading academics’ papers – even if only the introduction and conclusion.
Subscribe to The Conversation, says Claire Edwards, it’s “a great source of academic content aimed at a non-specialist audience”. In a similar vein, Helen Walker recommends the Wellcome-funded website Mosaic which is “great for stories that give the bigger picture ‘around’ science/research – sometimes research journeys, sometimes stories showing the broader context of science-related research.” Both Mosaic and The Conversation have podcast companions. Recent Conversation podcast series have looked at the Indian elections and moon exploration.
I’m a huge fan of podcasts, and there are loads that can help with gaining a basic understanding of new academic areas – in addition to being interesting (and sometimes amusing).
A quick search of the BBC has identified three science podcasts I should think about listening to – The Science Hour, Discovery, and BBC Inside Science. I’m very open to other suggestions – please tweet me or let me know in the comments/via email.
A huge thank you to all contributors:
I’m very grateful to everyone for their comments. I’ve not been able to include everything everyone said, in the interests of avoiding repetition and keeping this post to a manageable length.
I don’t think there’s any great secret to success in supporting a new discipline or working in research development in a new institution – it’s really a case of remembering and repeating the steps that worked last time. And hopefully this blog post will serve as a reminder to others, as it is doing to me.
Jeremy Barraud is Deputy Director, Research Management and Administration, at the University of the Arts, London.
Bonhi Bhattacharya is Research Development Manager at the University of Reading
Mary Caspillo-Brewer is Research Coordinator at the Institute for Global Health, University College London
Kate Clift is Research Development Manager at Loughborough University
Ann Onymous-Contributor is something or other at the University of Redacted
Claire Edwards is Research Bid Development Manager at the University of Surrey.
Adam Forristal Golberg is Research Development Manager (Charities), at the University of Nottingham
Nathanial Golden is Research Development Manager (ADHSS) at Nottingham Trent University
Chris Hewson is Social Science Research Impact Manager at the University of York
Liz Humphreys is Research Development Manager for Life Sciences, University of Nottingham
Rommany Jenkins is Research Development Manager for Medical and Dental Sciences, University of Birmingham.
Charlotte Johnson is Senior Research Development Manager, University of Reading
Suzannah Laver is Research Development Manager at the University of Exeter Medical School
Harry Moriarty is Research Accelerator Project Manager at the University of Nottingham.
Caroline Moss-Gibbons is Parasol Librarian at the University of Gibraltar.
Hilary Noone is Project Officer (REF Environment and NUCoREs) at the University of Newcastle
Roger Singleton Escofet is Research Strategy and Development Manager for the Faculty of Science, University of Warwick.
Mark Smith is Programme Manager – The Bloomsbury SET, at the Royal Veterinary College
Richard Smith is Research and Innovation Funding Manager, Faculty of Arts, Humanities and Social Sciences, Anglia Ruskin University.
Muriel Swijghuisen Reigersberg is Researcher Development Manager (Strategy) at the University of Sydney.
Sanja Vlaisavljevic is Enterprise Officer at Goldsmiths, University of London
Helen Walker is Research and Innovation Officer at the University of Portsmouth
Lorna Wilson is Head of Research Development, Durham University
I’m writing this in the final week of my current role as Research Development Manager (Social Sciences) at the University of Nottingham before I move to my role as Research Development Manager (Research Charities) at the University of Nottingham. This may or may not change the focus of this blog, but I won’t abandon the social sciences entirely – not least because I’m stuck with the web address.
I’ve been thinking about strategies and approaches to research funding, and the place and prioritisation of applying for research grants in academic structures. It’s good for institutions to be ambitious in terms of their grant-getting activities. However, these ambitions need to have at least a nodding acquaintance with: (a) the actual amount of research funding historically available to any given discipline; and (b) the chances of any given unit or school or individual competing successfully for that funding, given the strength of the competition.
To use a football analogy, if I want my team to get promotion, I should moderate my expectations in the light of how many promotion places are available, and how strong the likely competition for those limited spots will be. In both cases, we want to set targets that are challenging, stretching, and ambitious, but which are also realistic and informed by the evidence.
How do we do that? Well, in a social science context, a good place to start is the ESRC success rates, and other disciplines could do worse than take a similar approach with their most relevant funding council. The ESRC produce quite a lot of data and analysis on funding and success rates, and Alex Hulkes of the ESRC Insights team writes semi-regular blog posts. Given the effort put into creating and curating this information, it seems only right that we use it to inform our strategies. This level of transparency is a huge (and very welcome) change from previous practices of very limited information being rather hidden away. Obvious caveats – the ESRC is by no means the only funder in town for the social sciences, but they’ve got the deepest pockets and offer the best financial terms. Another (and probably better) way would be to compare HESA research income stats, but let’s stick to the ESRC for now.
The table below shows the running three-year totals of awards and applications (2015/16 to 2017/18) for each discipline across all calls, and the totals for the period 2011/12 to 2017/18. You can access the data for yourself on the ESRC web page, where it is linked as ‘Application and success rate data (2011-12 to 2017-18)’ and was published in ODS format in May 2018. For ease of reading I’ve hidden the results from individual years.
Lots of caveats here. Unsuccessful outline proposals aren’t included (as no outline application leads directly to funding), but ‘office rejects’ (often for eligibility reasons) are. The ‘core discipline’ of each application is taken into account – secondary disciplines are not. The latest figures here are from 2017-2018 (financial year), so there’s a bit of a lag – in particular, the influence of the Global Challenges Research Fund (GCRF) or Industrial Strategy Challenge Fund (ISCF) will not be fully reflected in these figures. I think the ‘all data’ figures may include now-defunct schemes such as the ESRC Seminar Series, though I think Small Grants had largely gone by the start of the period covered by these figures.
Perhaps most importantly, because these are the results for all schemes, they include targeted calls which will rarely open to all disciplines equally. Fortunately, the ESRC also publishes similar figures for their open call (Standard) Research Grants scheme for the same time period. Note that (as far as I can tell) the data above includes the data below, just as the ‘all data’ column (which goes back to 2011/2) also includes the three year total.
This table is important because the Research Grants Scheme is bottom-up, open-call, and open to any application that’s at least 50% social sciences. Any social science researcher could apply to this scheme, whereas directed calls will inevitably appeal only to a subset. These are the chances/success rates for those whose work does not fit squarely into a directed scheme, and could arguably be regarded as a more accurate measure of disciplinary success rates. It’s worth noting that a specific call that’s very friendly to a particular discipline is likely to boost its successes, but may decrease the disciplinary success rate if it attracts a lot of bids. It’s also possible that major targeted calls that are friendly to a particular discipline may result in fewer bids to the open call.
To be fair, there are a few other regular ESRC schemes that are similarly open and should arguably be included if we wanted to look at the balance of disciplines and what a discipline target might look like. The New Investigator Scheme is open in terms of academic discipline, if not in time-since-PhD, and the Open Research Area call is open in terms of discipline if not in terms of collaborators. The Secondary Data Analysis Initiative is similarly open in terms of discipline, if not in terms of methods. Either way, we don’t have (or I can’t find) data which combines those schemes into a non-directed total.
Nevertheless, caveats and qualifications aside, I think these two tables give us a good sense of the size of the prize available for each discipline. There are approximately 29 funded projects per year (of which 5 are open call) for Economics, and 11 per year (of which 2 are open call) for Business and Management. Armed with that information and a knowledge of the relative strength of the discipline/school in our own institution, we ought to get a sense of what a realistic target might look like and of how well we’re already doing. Given what we know about our expertise, eminence, and environment, and the figures for funded projects, what ought our share of those projects to be?
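The back-of-the-envelope logic here can be sketched in a few lines. All the numbers below are illustrative assumptions for the sake of the example, not real ESRC figures or real institutional shares:

```python
# Sketch of turning funder award counts into a realistic institutional target.
# All figures are made up for illustration - not real ESRC data.

awards_per_year = {  # assumed funded projects per year nationally, all calls
    "Economics": 29,
    "Business and Management": 11,
}

# Assumed share of national disciplinary strength held by our institution,
# judged (say) from REF results, staff numbers, and past performance.
our_share = {
    "Economics": 0.08,
    "Business and Management": 0.05,
}

def realistic_target(discipline: str) -> float:
    """A crude expected number of awards per year for our institution."""
    return awards_per_year[discipline] * our_share[discipline]

for d in awards_per_year:
    print(f"{d}: ~{realistic_target(d):.1f} awards/year")
```

The point isn’t the precision of the arithmetic – it’s that a target of, say, five Economics awards a year would imply holding nearly a fifth of the national discipline, which is the kind of sanity check this data makes possible.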
We could ask a further question about how those successes are distributed between universities, and about any correlation between successes and (unofficial) subject league tables from the last REF, calculated on the basis of grade point average or research power. However, even if that data were available, we’d be looking at small numbers. We do know that the ESRC have done a lot of work on funding distribution and concentration, and their key findings are that:
ESRC peer review processes do not concentrate funding to a degree greater than that apparent in the proposals that request the funding.
ROs which apply infrequently appear to have lower success rates than do those which are more active applicants
In other words, most universities typically have comparable success rates, except that those that apply more often do a little better than average, and those who apply rarely do a little worse. This sounds intuitively right – those who apply more are likely more research-active, at least in the social sciences, and therefore more likely to generate stronger applications. But this is at an overall level, not discipline level.
I’d also note that we shouldn’t only measure success by the number of projects we lead. As grants get larger on average, there’s more research income available for co-investigators on bids led elsewhere. I think a strategy that focuses only on leading bids and being lead institution neglects the opportunities offered by being involved in strong bids led by world-class researchers based elsewhere. I’m sure it’s not unusual for co-I research income to exceed PI income for academic units.
I’ve not made any comment about the different success rates for different disciplines. I’ve written about this already for many of the years covered by the full data (though Alex Hulkes has done this far more effectively over the last few years, having the benefit of actual data skills) and I don’t really want to cover old ground again. The same disparities continue much as before. Perhaps GCRF will provide a much-needed boost for Education research (or at least the international aspects) and ISCF for management and business research.
All funding schemes have a summary section as an essential part of the application form. On the UK research councils’ Je-S form, the instruction is to “[d]escribe the proposed research in simple terms in a way that could be publicised to a general audience”.
But the summary is not just about publicising your research. The summary also:
primes your reader’s expectations and understanding of your project
helps the funder to identify suitable experts to review your application
gives the reviewer a clear, straightforward, and complete overview of your project
helps your nominated funding panel introducer to summarise your application for the rest of the decision-making panel
acts as a prompt for the other panel members to recall your application, which they may have only skim-read
can be used once your project is successful, for a variety of purposes including ethics review; participant recruitment; impact work.
There are three main ways to make a mess of your summary.
1. Concentrating on the context – writing an introduction, not a summary.
I’ve written before about using too much background or introductory material (what I describe as ‘the Star Wars error’) but it’s a particular problem for a summary. The reader needs some context and background, but if it’s more than a sentence or two, it’s probably too much. If you don’t reach the “this research will” tipping point by a third of the way through (or worse, even later), there’s too much background.
2. Writing to avoid spoilers – writing a blurb not a precis
I really admire film editors who produce movie trailers: they capture the essence of the film while minimising spoilers. However, while a film trailer for The Sixth Sense, The Usual Suspects, or The Crying Game should omit certain key elements, a project summary needs to include all of them. An unexpected fifth-act twist is great for film fans, but not for reviewers. Adding a hitherto unheralded extra research question or work package as a dramatic twist is more likely to earn you bafflement and a poor review than an Oscar.
3. Ignoring Plain English
The National Institute for Health Research’s Research for Patient Benefit form asks for a “Plain English summary of research”. As a former regional panel member, I have read many applications and some great examples of Plain English summaries of very complex projects. I have read applications from teams that have not tried at all, and from those whose commitment to Plain English lasts the first three paragraphs, before they lapse back.
Writing in Plain English is hard. It involves finding a way to forget how you usually communicate to colleagues, and putting yourself in the situation of someone who knows a fraction of what you do. Without dumbing down or patronising. If it wasn’t hard to write in Plain English, we wouldn’t need the expert shorthand of specialist language, which is usually created to simplify complicated concepts to facilitate clear and concise communication among colleagues.
Incidentally, with NIHR schemes, beware not only of using specialist language, but also of using higher-reading-level vocabulary and expressions when simpler ones will do. There’s no need for a superabundance of polysyllabic terminology. The Leverhulme Trust offers some useful guidance on writing for the ‘lay reader’.
When to write it
Should you write the summary when you start the application, or when you’ve finished it? Ideally both.
You should sketch out your summary when you start writing – if you can’t produce a bullet point summary of your project you’re probably not ready to write it up as an application. Save plenty of time at the very end to rework it in the light of your completed project. Above all, get as much outside input on your summary as you can. It is the most important part of the application, and well worth your time and trouble.
You are the academic expert, in the process of applying for funding to make a major advance in your field. I am not. I am a Research Development Manager – perhaps I have a PhD or MPhil in a cognate or entirely different field, or nothing postgraduate at all. How can I possibly help you?
The answer lies in this difference of experience and perspective. Sure, we may look at the same things, but different levels of knowledge and understanding – as well as different background assumptions – mean we find very different meanings in them. We all look at the world through lenses tinted by our own experiences and expectations, and if we didn’t, we couldn’t make sense of it.
Interpreting funding calls
When academics look at funding calls, they notice and emphasise the elements of the call that suit their agenda and often downplay or fail to notice other elements. Early in my career I was baffled as to why a very senior professor thought that a funding call was appropriate for a project. He’s smarter than me, more experienced…so obviously I assumed I’d got it wrong. I went back to the call expecting to find my mistake and find that his interpretation was correct. But no…my instinct was right.
Since then I’ve regularly had these conversations, pointing out that an idea would need crowbarring to death to fit a particular call, and even then would be uncompetitive. I’ve had to point out basic eligibility problems that have escaped the finely-honed research skills of frighteningly bright people.
When research development professionals like me look at a funding call, we see it through tinted glasses too, but these are tinted by comparable calls that we’ve seen before. We see what has unusual or disproportionate emphasis or lack of emphasis, or even the significance of what’s missing. We know what we’re looking for and whereabouts in the call we expect to see it. Our reading of calls is enhanced by a deep knowledge of the funder and its priorities, and what might be the motivation or source of funding behind a particular call.
Of course, some academics have an excellent understanding of particular funders. Especially if they’ve received funding from them, or served on a panel or as a referee, or been invited to a scoping workshop to inform the design and remit of a funding call.
But if you’re not in that position, the chances are that your friendly neighbourhood research development professional can advise you on how to interpret any given funder or scheme, or put you in touch with someone who can.
We can help you identify the most appropriate scheme and call for what you want to do, and just as importantly, prevent you from wasting your time on bids that are a poor fit. Often the best thing I do on any given day is talking someone out of spending weeks writing an application that never had any realistic chance of success.
Reading draft applications
You must have internal expert peer review, and encourage your academic colleagues to be brave enough to criticise your ideas and point out weaknesses in each iteration. Don’t be Gollum.
Research Development professionals can’t usually offer expert review, but a form of lay review can be just as useful. We may not be experts in your area, but we’ve seen lots and lots of grant applications, good and bad. We have a sense of what works. We know when the balance is wrong. We know when we don’t understand sections that we think we should be able to understand, such as the lay summary. We notice when the significance or unique contribution is not clearly spelled out. We know when the methods are asserted, rather than defended. We know when sections are vague or undercooked, or fudged. Or inconsistent. When research questions appear, disappear, or mutate during the course of an application.
When I meet with academics and they explain their project, I often find there’s a mismatch between what I understood from reading a draft proposal and what they actually meant. It’s very common for only 75%-90% of an idea to be on the page. The rest will be in the mind of the applicant, who will think the missing elements are present in the document because they can’t help but read the draft through the lens of their complete idea.
If your research development colleagues misunderstand or misread your application, it may be because they lack the background, but it’s more likely that what you’ve written isn’t clear enough. There’s a lot to be learned from creative misinterpretation.
None of this is a criticism of academics; it’s true for everyone. We all see our own writing through the prism of what we intend to write, not what we’ve actually written. It’s why this article would be even worse without Research Professional’s editorial team.
Back in February, I was delighted to be invited to give the keynote talk at the University of Turku's inaugural Funding Friday event. Before the invitation I didn't know very much about Finland (other than the joke that in Finland, an extrovert is someone who stares at your shoes) and still less about the Finnish research funding environment. But I presumed (largely, if not entirely, correctly) that there are a great many issues in common, and that advice about writing grant applications would be reasonably universal.
When I reached the venue I was slightly surprised to see early arrivals each sitting at their own individual one-person desk. For a moment I did wonder if the Finnish stereotype was true to the extent that even sharing a desk was regarded as excessively extrovert. However, there was a more obvious explanation – it was exam season and the room doubled as an exam hall.
I was very impressed with the Funding Friday event. I was surprised to realise that I’d never been to a university-wide event on research funding – rather, we’ve tended to organise on a Faculty or School basis. The structure of the event was a brief introduction, my presentation (Applying for Research Funding: Preparations, Proposals, and Post-Mortems) followed by a panel discussion with five UTU academics who served on funding panels. Maria guided the panel through a series of questions about their experiences – how they ended up on a funding panel, what they’d learnt, what they looked for in a proposal, and what really annoyed them – and took questions from the floor. This was a really valuable exercise, and something that I’d like to repeat at Nottingham. I’m always trying to humanise reviewers and panel members in the minds of grant applicants and to help them understand the processes of review and evaluation, and having a range of panel members from across academic disciplines willing to share their experiences was fascinating. Of course, not everyone agreed on everything, but there seemed to be relative uniformity across panels and academic disciplines in terms of what panel members wanted to see, what made their jobs easier, and what irritated them and made things harder.
In the afternoon, we had a series of shorter sessions from UTU’s research funding specialists. Lauri spoke about applying Aristotle’s teachings on rhetoric (ethos, pathos, and logos) to structuring research grant proposals – a really interesting approach that I’d not come across before. What is a grant application if not an attempt to persuade, and what’s rhetoric if not the art of persuasion? Anu talked about funding opportunities relative to career stage, and Johanna discussed the impact agenda, and it was particularly fascinating to hear how that’s viewed in Finland, given its growth and prominence in the UK. From discussions in the room there are clearly worries about the balance between funding for ‘blue skies’ or basic research and for applied research with impact potential. Finally, we heard from Samira, a successful grant winner, about her experiences of applying for funding. It’s great to hear from successful applicants to show that success is possible in spite of dispiriting success rates.
To resubmit, or not to resubmit, that is the question
I’d arrived with the assumption that research – like almost everything else in the Nordic social democracies – would be significantly better funded pro rata than in the UK. (See, for example, the existence of an affordable, reliable railway system with play areas for small children on intercity trains). However, success rates are broadly comparable. One significant difference between the UK and Finland funding landscapes is the prevalence of the UK ‘demand management’ agenda. This limits – or even bans – the re-submission of unsuccessful applications, or imposes individual or institutional sanctions/limits on numbers and timing of future applications. The motivating force behind this is to reduce the burden of peer review and assessment, both on funders and on academic reviewers and panel members. Many UK funders, especially the ESRC, felt that a lot of the applications they were receiving were of poor quality and stood little chance of funding.
Finnish funders take an approach that's more like the European Research Council or the Marie Curie Fellowship, where resubmissions are not only allowed but often seem to be part of the process. Apply, be unsuccessful, get some feedback, respond to it, improve the application, and get funded the second or a subsequent time round. However, one problem – as our panel of panel members discussed – is that panel membership varies from year to year, and the panel that very nearly funded your proposal one year will not be the same panel that reviews the improved version the following year. For this reason, we probably shouldn't expect absolute consistency from panels between years, especially as the application will be up against a different set of rival bids. Also, the feedback may not contain the reasons why an application wasn't funded, nor instructions on how to make it fundable next time. Sometimes panels will point out the flaws in applications, but they can be reluctant to say what needs to be said – that no version of this application, however polished, will ever be competitive. I've written previously about deciding whether or not to resubmit, although that was written with the UK context in mind.
The room was very much split on whether those receiving the lowest marks should be prevented from applying again for a time, and even on the more modest proposal of limiting reapplication with a similar project. Of course, what the UK system does is move the burden of peer review back to universities, which are often poorly placed to review their own applications, as almost all of their expertise will be named on the bid. But I also worry about a completely open resubmission policy if it isn't accompanied by rigorous feedback that makes clear not only how an application can be improved, but also how competitive even the best possible iteration of that idea would be.
One of the themes to emerge from the day was about when to resubmit and when to move on. Funding (and paper, and job) rejection is a fact of academic life, calling for more than a measure of determination, resilience, bouncebackability, or (as they say in Finland) sisu. But carried too far, it turns into stubbornness, especially if the same unsuccessful application is submitted over and over again with little or no change. I think most people would accept that there is an element of luck in getting research funding – I've seen for myself how one negative comment can prompt others, leading to a criticism spiral that sinks an initially well-received application. Sometimes – by chance – there's one person on the panel who is a particular subject expert, really likes (or really hates) a particular proposal, and swings the discussion in a way that wouldn't have happened without their presence. But the existence of an element of luck does not mean that research funding is a lottery in which all you need do is keep buying a ticket until your number comes up. Luck is involved, but only in determining which competitive applications are funded.
I’ve written a couple of posts before (part one, and part two) about what to do when your grant application is unsuccessful, and they might form the beginnings of a strategy for responding and deciding what to do next. At the very least, a thorough review of the application and any feedback offered is in order before making any decisions. My sense is that in any system where resubmissions are an accepted feature, and where it’s common for resubmissions to be successful, it would be a shame to give up after the first attempt. By the twelfth, though…
Watching your language
I was fascinated to learn that responsibility for training in research grant application writing is shared between UTU’s research development team and their English language unit. National funders tend to give the option of writing in English or in Finnish, though writing in English makes it easier to find international referees and reviewers for grant applications – and indeed one of my Business School colleagues is a regular reviewer.
One issue I’m going to continue to think about is support for researchers writing grant applications in their second or additional language. English language support is an obvious service for a university to offer in a country whose own language is not commonly spoken beyond the borders of its immediate neighbours, and particularly in Finland, where the language isn’t part of the Indo-European family shared by most of the rest of Europe. But it’s not something we think about much in the UK.
I’d say about half of the researchers I support speak English as a second language, and some of the support I provide is around proofreading and sense-making – expressing ideas clearly and eliminating errors that obscure meaning or might irritate the reader. I tend to think that reviewers will forgive some minor mistakes or awkward phrasing in English, provided that the application does not contain lazy or careless errors. If reviewers are to take the time to read an application, they want to see that the applicant has taken the time to write it.
I think most universities run courses on academic English, though I suspect most of them are designed for students. Could we do more for academic staff who want to improve their academic English – not just for grant writing, but for teaching and for writing journal papers? And could we (and should we) normalise that support as part of professional development? Or do we just assume that immersion in an English-speaking country will be sufficient?
However… I do think that academics writing in their second language have one potential advantage. I’ve written elsewhere about the ‘Superabundance of Polysyllabic Terminology’ (aka too many long words) error in grant writing, to which native English speakers are more prone. Second-language academics tend to write more clearly, more simply, and more directly. Over-complicated language can be confusing and/or annoying for a native English speaker reviewing your work, and there’s a decent chance that your reviewers and panel members will themselves speak English as a second language, in which case they’ll be even more irritated. One piece of advice I once heard for writing EU grant applications was to write as if your application were going to be reviewed by someone reading it in their fourth language while waiting to catch a flight. Because it might well be.
It was a real honour to visit Turku, and I’d have loved to have stayed longer. While there’s a noticeable quietness and reserve about Finnish people – even compared to the UK – everyone I met couldn’t have been more welcoming and friendly. So, to Soile, Lauri, Anu, Johanna, Jeremy, Samira, the Turku hotel receptionist who told me how to pronounce sisu, everyone else I met, and especially to Maria for organising… kiitos, everyone.
A version of this article first appeared in Funding Insight in November 2018 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com
Given the ever-expanding requirements of most research funding application forms, it’s inevitable that applicants are tempted to pay less attention to some sections and end up writing text so generic, so bland, that it could be cut and pasted – with minimal editing of names and topics – into almost any other proposal.
Resist that temptation. Using text that looks like it could be cut and pasted between proposals suggests that you haven’t thought through the specifics of your project or fellowship, and it will make it seem less plausible as a result.
I often see responses that are so content free they make my heart sink. For example:
1) “We will present the findings at major international conferences and publish in world class journals”
2) “The findings will be of interest to researchers in A, B, and C.”
3) “This is a methodologically innovative, timely, and original project which represents a step change in our understanding”
4) “We will set up a project Twitter account and a blog, and with the support of our outstanding press office, write about our research for a general audience.”
5) “Funding will enable me to lead my own project for the first time, and support me in making the transition to independent researcher”.
These claims might well be true and can read well in isolation. But they’re only superficially plausible, and while they contain the buzzwords that applicants think funders are after, they’re entirely content-, evidence- and argument-free.
Why should you care? Because your proposal doesn’t just have to be good enough to meet a certain standard, it has to be better than its rivals. If there are sections of your application that could be transferred into any rival application, this might be a sign that that section is not as strong or distinctive as it could be and is not giving you any competitive edge.
Cut and paste sections may be actively harming your chances. They may read well in isolation but when compared directly to more thoughtful and more detailed sections in rival applications, they can look weak and lazy, especially if they don’t take full advantage of the word count.
Cut-and-pasteable text tends to occur in the sections of the application form that are trickier to write and that get less attention: dissemination; impact pathway/plan; academic impact; personal development plan; data management plan; choice of host institution. Sometimes these generic statements emerge because applicants don’t know what to write, and sometimes because it’s all they can be bothered to write for a section they wrongly regard as being of lesser importance.
Give these sections the time, attention and thought they deserve. Add details. Add specifics. Add argument. Add evidence. Find things to say that only apply to your application. If you don’t know how to answer a question strongly, get advice from your research development colleagues.
The more editing it would take to put it into someone else’s bid, the better. Here are some thoughts on improving the earlier examples:
1) “We will present the findings at major international conferences and publish in world class journals”. I find it hard to understand vagueness about plans for academic impact. Even allowing for the fact that the findings of the research will affect plans, it’s surely not too much to expect some target journals and conferences to be named. If applicants can’t demonstrate knowledge of realistic targets, it undermines their credibility.
2) “The findings will be of interest to researchers in A, B, and C.” I’d ban the phrase “of interest to” when explaining potential academic impact. It tells the reader nothing about the likely academic impact – who will cite your work, and what difference do you anticipate it will make to the field?
3) “This is a methodologically innovative, timely, and original project which represents a step change in our understanding” Who will use your methods? Who will use your frameworks? If all research is standing on the shoulders of giants, how much further can future researchers see perched atop your work? How exactly does your project go beyond the state of the art, and what might be the new state of the art after your project?
4) “We will set up a project Twitter account and a blog, and with the support of our outstanding press office, write about our research for a general audience.” If you’re talking about engaging with social media, talk about how you are going to find readers and/or followers. What’s your plan for your presence in terms of the existing ecosystem of social media accounts that are active in this area? Who are the current key influencers?
5) “Funding will enable me to lead my own project for the first time, and support me in making the transition to independent researcher”. How does funding take you to what’s next? What’s the path from the conclusions of this project to your future research agenda?
Looking for cut and paste text – and improving it where you find it – is an excellent review technique to polish your draft application, and particularly to improve those harder-to-write sections. Hammering out the detail is more difficult, but it could give you an advantage in the race for funding.