Research Development – supporting new academic disciplines

I’ve recently moved from a role supporting the Business School and the School of Economics to a central role at the University of Nottingham, looking after our engagement with research charities. I’m going from a role where I know a few corners of the university very well to a role where I’m going to have to get to know more about much more of it.

“Don’t panic!”

My academic background (such as it is) is in political philosophy and for most of my research development career I’ve been supporting (broadly) social sciences, with a few outliers. I’m now trying to develop my understanding of academic disciplines that I have little background or experience in – medical research, life sciences, physics, biochemistry etc. I suspect the answer is just time, practice, familiarity, confidence (and Wikipedia), but I found myself wondering if there are any short cuts or particularly good resources to speed things up.

Fortunately, if you’re a member of ARMA, you’re never on your own, and I sent an email around the Research Development Special Interest Group email list, with a promise (a) to write up contributions as a blog post and (b) to add some hints and tips of my own, especially for the social sciences.

So here goes… the collated and collected wisdom of the SIG… bookmark this post and revisit it if your remit changes…

Don’t panic… and focus on what you can do

In my original email, the first requirement I suggested was ‘time’, and that’s been echoed in a lot of the responses. “Time, practice, familiarity, confidence (and Wikipedia)” as Chris Hewson puts it. It’s easy to be overwhelmed by a sea of new faces and names and an alphabet soup of new acronyms – and to regard other people’s hard-won institutional/school/faculty knowledge as some kind of magical superpower.

Lorna Wilson suggests that disciplinary differences are overrated and “sometimes the narrative of ‘difference’ is what makes things harder. The skills and expertise we have as research development professionals are transferable across the board, and I think that the silos of disciplines led to a silo-ing of roles (especially in larger universities). With the changes in the external landscape and push with more challenge-led interdisciplinary projects, the silos of disciplines AND of roles I think is eroding.”

But there are differences in practices and norms – there are differences in terminology, outlook, career structures, internal politics, norms, and budget sizes – and I’m working hard trying not to carry social science assumptions with me. Though perhaps I’m equally likely to be too hesitant to generalise from social science experience where it would be entirely appropriate to do so.

Rommany Jenkins has “moved from Arts and Humanities to Life Sciences” and thinks that while “the perception might be that it’s the harder direction to go in because of the complexity of the subject matter […] it’s probably easier because the culture is quite straightforward […] although there are differences between translational / clinical and basic, the principles of the PI lab and team are basically the same”. She thinks that perhaps “it’s more of a culture shock moving into Arts and Humanities, because people are all so independently minded and come at things from so many different directions and don’t fit neatly into the funding boxes. […] I know a lot of people just find it totally bizarre that you can ask a Prof in Arts what they need in terms of costings and they genuinely don’t know.”

Charlotte Johnson moved in the opposite direction, from science to arts. “The shortcut was trying to find commonalities in how the different disciplines think and prepare their research. Once you realise that an artist and a chemist would go about planning their research project very similarly, and they only start to diverge in the experimental/interpretation stage, it does actually make it all quite easy to understand.”

Muriel Swijghuisen Reigersberg says that her contribution “tends to be not so much on the science front, but on the social and economic or policy and political implications of the work STEMM colleagues are doing and recommendations around impact and engagement or even interdisciplinary angles to enquiries for larger projects.”

My colleague Liz Humphreys makes a similar (and very reassuring) point about using the same “skills to assess any bid by not focusing on the technical things but focus on all the other usual things that a bid writer can strengthen”. A lay summary that doesn’t make any lay sense is an issue regardless of discipline, as is a summary that doesn’t summarise and reads more like an introduction. Getting good at reviewing research grants can transcend academic disciplines. “If someone can’t explain to me what they’re doing,” says Claire Edwards, “then it’s unlikely to convince reviewers or a panel.”

Kate Clift makes a similar point: “When I am working in a discipline which is alien to me I tend to try and ground the proposed research in something which I do understand so I can appreciate the bigger picture, context etc. I will ask lots of ‘W’ questions – Why is it important? What do you want to do? Who is going to do it? Less illuminating to me in these situations is HOW they are going to do it”.

Roger Singleton Escofet makes the very sensible point that some subjects are very theoretical, “where you will always struggle to understand what is being proposed”. I certainly found this with Economics – I could hope to understand what a proposed project did, but how it worked would always be beyond me. It reminds me a bit of this Armstrong and Miller sketch in which they demonstrate how not to do public engagement in theoretical physics.

Ann Onymous-Contributor says that “multidisciplinary projects are the best way to ease yourself into other disciplines and their own specific languages. My background is in social sciences but because of the projects I have worked on I have experience of, and familiarity with, a range of arts and hard science disciplines and the languages they use. Broad, shallow knowledge accumulated on this basis can be very useful; sometimes specific disciplinary knowledge is less important than understanding connections between different disciplines, or the application of knowledge, which typically also tend to be the things which specialists miss.” I think this is a really good point – if we allow ourselves to include the other disciplines that we’ve supported as part of interdisciplinary bids, we may find we’ve more experience than we thought.

Finding the Shallow End, Producing your Cheat Sheet

Lorna Wilson suggests “[h]aving a basic understanding” of methodologies in different disciplines, “helps to demonstrate how [research questions] are answered and hypotheses evidenced, and I think breaks through some of the ‘difference’. What makes things slightly more difficult is also accessibility, in terms of language of disciplines, we could almost do with a cheat sheet in terms of terms!”

Richard Smith suggests identifying academics in the field who are effective and willing communicators “who appreciate the benefits and know the means of conveying approaches and fields to non-experts… and do it with enthusiasm”. Harry Moriarty’s experience has been that often ECRs and PhD students are a particularly good source – many are more willing to engage, and perhaps have more to benefit from our advice and support.

Muriel Swijghuisen Reigersberg suggests attending public lectures (rather than expert seminars), which will be aimed at the generalist, and notes that expert–novice conversations will benefit the academic expert in terms of practising explanations of complex topics for a generalist audience. I think we can all recognise academics who enjoy talking about their work to non-specialists and have a gift for explanation, and those who don’t enjoy it, lack the gift, or both.

Other non-academic colleagues can help too, Richard argues – especially impact and public or business engagement staff working in that area, but also admin staff and School managers. Sanja Vlaisavljevic wanted to “understand how our various departments operate, not just in terms of subject-matter but the internal politics”. This is surely right – I’m sure we’re all aware of historical disagreements or clashes between powerful individuals or whole research groups/Schools that stand in the way of certain kinds of collaboration or joint working. Whether we work to try to erode these obstructions or navigate deftly around them, we need to know that they’re there.

Caroline Moss-Gibbons adds librarians to the list, citing their resource guides and access/role with the university repository. Claire Edwards observes that many research development staff have particular academic backgrounds that might be useful.

Don’t try to fake it till you make it

“Be open that you’re new to the area, but if they’re looking for funding they need to be able to explain their research to a non-specialist” says Jeremy Barraud.

I’ve always found that a full, frank, and even cheerful confession of a lack of knowledge is very effective. I often include a blank slide in presentations to illustrate what I don’t know. My experience is that admitting what I don’t know earns me a better hearing on matters that I do know about (as long as I do both together), but I’m aware that as a straight, white, middle aged, middle class male perhaps that’s easier for me to do. I’ve suspected for some time now that being male (and therefore less likely to be mistaken for an “administrator”) means I’m probably playing research development on easy mode. There’s an interesting project around EDI and research development that I’m probably not best placed to do.

While no-one is arguing for outright deception, I’ve heard it argued that frank admissions of ignorance about a particular topic area may make it harder to engage academic colleagues and to find out more. If academic colleagues make certain assumptions about background, perhaps try to live up to those with a bit of background reading. It’s easy to be written off and written out, which then makes it harder to learn later.

I always think half the battle is convincing academic colleagues that we’re on their side and the side of their research (rather than, say, motivated by university income targets or an easier life), and perhaps it’s easy to underestimate the importance of showing an interest and a willingness to learn. Asking intelligent, informed, interested lay questions of an expert – alongside demonstrating our own expertise in grant writing etc – is one way to build relationships. My own experience with my MPhil is that research can be a lonely business, and so an outsider showing interest and enthusiasm – rather than their eyes glazing over and disengaging – can be really heartening.

Kate Clift makes an important point about combining any admissions of relative ignorance with a stress on what she can do/does know/can contribute. “I’m always very upfront with people and say I don’t have an understanding of their research but I do understand how to craft a submission – that way everyone plays to their strengths. I can focus on structure and language and the academic can focus on scientific content.”

Find a niche, get involved, be visible

For Jeremy Barraud, that was being secretary for an ethics committee. In my early days with Economics, it was supporting the production of the newsletter and writing research summaries – even though it wasn’t technically part of my remit, it was a great way to get my name known, get to know people, and have a go at summarising Economics working papers.

Suzannah Laver is a research development manager in a Medical School, but has a background in project management and strategy rather than medicine or science. For her it was “just time” and getting involved: “[a]ttending the PI meetings, away days, seminars, and arranging pitching events or networking events.” Mary Caspillo-Brewer adds project inception meetings and dissemination events to the list, and also suggests attending academic seminars and technical meetings (as does Roger Singleton Escofet), even if they’re aimed at academics. This is great in terms of visibility and in terms of evidence of commitment – sending a message that we’re interested and committed, even if we don’t always entirely understand.

Mark Smith suggests visiting research labs or clinics, however terrifying they may first appear. So far I’ve only met academics in their offices – I’m not sure I trust myself anywhere near a lab. I’m still half-convinced I’ll knock over the wrong rack of test tubes and trigger a zombie epidemic. But lab visits are perhaps something I could do more of in the future when I know people better. And as Mark says, taking an interest is key.

Do your homework

I’ve blogged before about the problems with the uses and abuses of successful applications, but Nat Golden is definitely onto something when he suggests reading successful applications to look at good practice and what the particular requirements of a funder are. Oh, and reading the guidance notes.

Roger Singleton Escofet (and others) have mentioned that the Royal Society and Royal Academy of Engineering produce useful reports that “may be technical but offer good overviews on topical issues across disciplines. Funders such as research councils or Wellcome may also be useful sources since funders tend to follow (or set) the emerging areas.” Hilary Noone also suggests looking to the funders for guidance – trying to “understand the funder’s real meaning (crucial for new programmes and calls where they themselves are not clear on what they are trying to achieve)”.

There’s a series of short ‘Bluffer’s Guide’ books which are somewhat dated, but potentially very useful. Bluff your way in Philosophy was on my undergraduate reading list. Bluff your way in Economics gave me an excellent grounding when my role changed, and explained (among many other things) the difference between exogenous and endogenous factors. When supporting a Geography application, I learned the difference between pluvial and fluvial flooding. These little things make a difference, and it’s probably the absence of that kind of basic ground for many disciplines that I’m now supporting that’s making me feel uneasy. In a good way.

Harry Moriarty argues that it’s more complicated than just reading Wikipedia – the work he supported “was necessarily at the cutting edge and considerably beyond the level that I could get to in a sensible order – I had to take the work and climb back through the Wikipedia pages in layers, and then, once I had some underpinning knowledge, go back through the same pages in light of my new understanding”.

Specific things to do

“Become an NIHR Public Reviewer”, says Jeremy Barraud. “It’s easy to sign up and they’re keen to get more reviewers. Being on the other side of the funding fence gives a real insight into how decisions are reached (and bolsters your professional reputation when speaking with researchers).”

I absolutely second this – I’ve been reviewing for NIHR for some time and just finished a four year term as a patient/public representative on a RfPB panel. I’d recommend doing this not just to gain experience of new research areas, but as a valuable public service that you as a research development professional can perform. If you’ve got experience of a health condition, using NHS services (as a patient or carer), and you’re not a healthcare professional or researcher, I’m sure they’d love to hear from you.

Being a research participant, argues Jeremy Barraud, is “professionally insightful and personally fulfilling. The more experience you have on research in all its different angles, the better your professional standing”. This is also something I’ve done – in many ways it’s hard not to get involved in research if you’re hanging around a university. I’m part of a study looking at running and knee problems, and I’ve recently been invited to participate in another study.

Bonhi Bhattacharya, a mathematician by training, registered for a MOOC (Massive Open Online Course) – an “Introduction to Ecology” – “and it was immensely helpful in getting a grounding in the subject, as well as a useful primer in terminology.” It can be a bit of a time commitment, but they’re also fascinating – and as above, really show willing. I wrote about my experience with a MOOC on behavioural economics in a post a few years ago. Bonhi also suggests reading academics’ papers – even if only the introduction and conclusion.

Resources

Subscribe to The Conversation, says Claire Edwards, it’s “a great source of academic content aimed at a non-specialist audience”. In a similar vein, Helen Walker recommends the Wellcome-funded website Mosaic which is “great for stories that give the bigger picture ‘around’ science/research – sometimes research journeys, sometimes stories showing the broader context of science-related research.” Both Mosaic and The Conversation have podcast companions. Recent Conversation podcast series have looked at the Indian elections and moon exploration.

I’m a huge fan of podcasts, and there are loads that can help with gaining a basic understanding of new academic areas – in addition to being interesting (and sometimes amusing).

A quick search of the BBC has identified three science podcasts I should think about listening to – The Science Hour, Discovery, and BBC Inside Science. Very open to other suggestions – please tweet me or let me know in the comments/via email.

A huge thank you to all contributors:

I’m very grateful to everyone for their comments. I’ve not been able to include everything everyone said, in the interests of avoiding duplication/repetition and in the interests of keeping this post to a manageable length.

I don’t think there’s any great secret to success in supporting a new discipline or working in research development in a new institution – it’s really a case of remembering and repeating the steps that worked last time. And hopefully this blog post will serve as a reminder to others, as it is doing to me.

  • Jeremy Barraud is Deputy Director, Research Management and Administration, at the University of the Arts, London.
  • Bonhi Bhattacharya is Research Development Manager at the University of Reading
  • Mary Caspillo-Brewer is Research Coordinator at the Institute for Global Health, University College London
  • Kate Clift is Research Development Manager at Loughborough University
  • Ann Onymous-Contributor is something or other at the University of Redacted
  • Claire Edwards is Research Bid Development Manager at the University of Surrey.
  • Adam Forristal Golberg is Research Development Manager (Charities), at the University of Nottingham
  • Nathanial Golden is Research Development Manager (ADHSS) at Nottingham Trent University
  • Chris Hewson is Social Science Research Impact Manager at the University of York
  • Liz Humphreys is Research Development Manager for Life Sciences, University of Nottingham
  • Rommany Jenkins is Research Development Manager for Medical and Dental Sciences, University of Birmingham.
  • Charlotte Johnson is Senior Research Development Manager, University of Reading
  • Suzannah Laver is Research Development Manager at the University of Exeter Medical School
  • Harry Moriarty is Research Accelerator Project Manager at the University of Nottingham.
  • Caroline Moss-Gibbons is Parasol Librarian at the University of Gibraltar.
  • Hilary Noone is Project Officer (REF Environment and NUCoREs) at Newcastle University
  • Roger Singleton Escofet is Research Strategy and Development Manager for the Faculty of Science, University of Warwick.
  • Mark Smith is Programme Manager – The Bloomsbury SET, at the Royal Veterinary College
  • Richard Smith is Research and Innovation Funding Manager, Faculty of Arts, Humanities and Social Sciences, Anglia Ruskin University.
  • Muriel Swijghuisen Reigersberg is Researcher Development Manager (Strategy) at the University of Sydney.
  • Sanja Vlaisavljevic is Enterprise Officer at Goldsmiths, University of London
  • Helen Walker is Research and Innovation Officer at the University of Portsmouth
  • Lorna Wilson is Head of Research Development, Durham University

Setting Grant Getting Targets in the Social Sciences

I’m writing this in the final week of my current role as Research Development Manager (Social Sciences) at the University of Nottingham, before I move to my new role as Research Development Manager (Research Charities) at the same institution. This may or may not change the focus of this blog, but I won’t abandon the social sciences entirely – not least because I’m stuck with the web address.


I’ve been thinking about strategies and approaches to research funding, and the place and prioritisation of applying for research grants in academic structures. It’s good for institutions to be ambitious in terms of their grant-getting activities. However, these ambitions need to have at least a nodding acquaintance with:
(a) the actual amount of research funding historically available to any given particular discipline; and
(b) the chances of any given unit or school or individual to compete successfully for that funding given the strength of the competition.

To use a football analogy, if I want my team to get promotion, I should moderate my expectations in the light of how many promotion places are available, and how strong the likely competition for those limited spots will be. In both cases, we want to set targets that are challenging, stretching, and ambitious, but which are also realistic and informed by the evidence.

How do we do that? Well, in a social science context, a good place to start is the ESRC success rates, and other disciplines could do worse than take a similar approach with their most relevant funding council. The ESRC produce quite a lot of data and analysis on funding and success rates, and Alex Hulkes of the ESRC Insights team writes semi-regular blog posts. Given the effort put into creating and curating this information, it seems only right that we use it to inform our strategies. This level of transparency is a huge (and very welcome) change from previous practice, when very limited information was rather hidden away. Obvious caveats – the ESRC is by no means the only funder in town for the social sciences, but they’ve got the deepest pockets and offer the best financial terms. Another (and probably better) way would be to compare HESA research income stats, but let’s stick to the ESRC for now.

The table below shows the running three-year total (2015/16–2017/18) and number of applications for each discipline for all calls, and the total for the period 2011/12 to 2017/18. You can access the data for yourself on the ESRC web page. This data is linked as ‘Application and success rate data (2011-12 to 2017-18)’ and was published in ODS format in May 2018. For ease of reading I’ve hidden the results from individual years.

Lots of caveats here. Unsuccessful outline proposals aren’t included (as no outline application leads directly to funding), but ‘office rejects’ (often for eligibility reasons) are. The ‘core discipline’ of each application is taken into account – secondary disciplines are not. The latest figures here are from 2017-2018 (financial year), so there’s a bit of a lag – in particular, the influence of the Global Challenges Research Fund (GCRF) or Industrial Strategy Challenge Fund (ISCF) will not be fully reflected in these figures. I think the ‘all data’ figures may include now-defunct schemes such as the ESRC Seminar Series, though I think Small Grants had largely gone by the start of the period covered by these figures.

Perhaps most importantly, because these are the results for all schemes, they include targeted calls which will rarely be open to all disciplines equally. Fortunately, the ESRC also publishes similar figures for their open call (Standard) Research Grants scheme for the same time period. Note that (as far as I can tell) the data above includes the data below, just as the ‘all data’ column (which goes back to 2011/12) also includes the three year total.
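For anyone who wants to reproduce this kind of analysis, the aggregation itself is simple once the data is flattened into one record per application. Here’s a minimal sketch in plain Python – the field names and figures are entirely made up for illustration, and the real ESRC spreadsheet will need some reshaping first:

```python
# Hypothetical stand-in for application-level data: one record per
# application, with its core discipline, financial year, and outcome.
records = [
    ("Economics", "2015/16", True),
    ("Economics", "2015/16", False),
    ("Economics", "2016/17", False),
    ("Business and Management", "2016/17", True),
    ("Business and Management", "2017/18", False),
    ("Economics", "2017/18", True),
]

def success_rates(records):
    """Count applications and awards by core discipline, then compute
    the success rate (awards / applications) for each."""
    totals = {}
    for discipline, _year, funded in records:
        apps, awards = totals.get(discipline, (0, 0))
        totals[discipline] = (apps + 1, awards + (1 if funded else 0))
    return {d: {"applications": a, "awards": w, "rate": w / a}
            for d, (a, w) in totals.items()}

rates = success_rates(records)
```

The same grouping could of course be done in a spreadsheet or with pandas; the point is just that ‘success rate’ here means awards divided by applications within each core discipline, over whatever period you choose.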

This table is important because the Research Grants Scheme is bottom-up, open-call, and open to any application that’s at least 50% social sciences. Any social science researcher could apply to this scheme, whereas directed calls will inevitably appeal only to a subset. These are the chances/success rates for those whose work does not fit squarely into a directed scheme, and could arguably be regarded as a more accurate measure of disciplinary success rates. It’s worth noting that a specific call that’s very friendly to a particular discipline is likely to boost the successes but may decrease the disciplinary success rate if it attracts a lot of bids. It’s also possible that major targeted calls that are friendly to a particular discipline may result in fewer bids to open call.

To be fair, there are a few other regular ESRC schemes that are similarly open and should arguably be included if we wanted to look at the balance of disciplines and what a discipline target might look like. The New Investigator Scheme is open in terms of academic discipline, if not in time-since-PhD, and the Open Research Area call is open in terms of discipline if not in terms of collaborators. The Secondary Data Analysis Initiative is similarly open in terms of discipline, if not in terms of methods. Either way, we don’t have (or I can’t find) data which combines those schemes into a non-directed total.

Nevertheless, caveats and qualifications aside, I think these two tables give us a good sense of the size of prize available for each discipline. There are approximately 29 funded projects per year (of which 5 are open call) for Economics, and 11 per year (of which 2 are open call) for Business and Management. Armed with that information and a knowledge of the relative strength of the discipline/school in our own institution, we ought to get a sense of what a realistic target might look like and a sense of how well we’re already doing. Given what we know about our expertise, eminence, and environment, and the figures for funded projects, what ought our share of those projects be?
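As a back-of-the-envelope illustration of that last step – the 5% institutional share below is entirely made up, standing in for an honest assessment of your own school’s standing in the discipline:

```python
def realistic_target(awards_per_year, institutional_share):
    """Crude expected annual awards for one institution: the national
    number of funded projects in the discipline, scaled by an estimate
    of the institution's share of disciplinary strength (0.0 to 1.0).
    Illustrative only - real targets need far more context."""
    return awards_per_year * institutional_share

# Using the Economics figure quoted above (~29 ESRC-funded projects per
# year nationally) and a hypothetical 5% share of disciplinary strength:
target = realistic_target(29, 0.05)
```

On those (invented) assumptions, a ‘realistic’ target would be roughly one to two ESRC awards per year – a useful sanity check against any target plucked from thin air, even if the share estimate is necessarily rough.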

We could ask a further question about how those successes are distributed between universities, and about any correlation between successes and (unofficial) subject league tables from the last REF, calculated on the basis of Grade Point Average or research power. However, even if that data were available, we’d be looking at small numbers. We do know that the ESRC have done a lot of work on looking at funding distribution and concentration, and their key findings are that:

ESRC peer review processes do not concentrate funding to a degree greater than that apparent in the proposals that request the funding.

ROs which apply infrequently appear to have lower success rates than do those which are more active applicants

In other words, most universities have broadly comparable success rates: those that apply more often do a little better than average, and those that apply rarely do a little worse. This sounds intuitively right – those who apply more are likely more research-active, at least in the social sciences, and therefore more likely to generate stronger applications. But this is at an overall level, not discipline level.

I’d also note that we shouldn’t only measure success by the number of projects we lead. As grants get larger on average, there’s more research income available for co-investigators on bids led elsewhere. I think a strategy that focuses only on leading bids and being lead institution neglects the opportunities offered by being involved in strong bids led by world-class researchers based elsewhere. I’m sure it’s not unusual for co-I research income to exceed PI income for academic units.

I’ve not made any comment about the different success rates for different disciplines. I’ve written about this already for many of the years covered by the full data (though Alex Hulkes has done this far more effectively over the last few years, having the benefit of actual data skills) and I don’t really want to cover old ground again. The same disparities continue much as before. Perhaps GCRF will provide a much-needed boost for Education research (or at least the international aspects) and ISCF for management and business research.

Maybe.

Summary-time, and the writing ain’t easy…

A version of this article first appeared in Funding Insight in March 2019 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

“…. the words”.

All funding schemes have a summary section as an essential part of the application form. On the UK research councils’ Je-S form, the instruction is to “[d]escribe the proposed research in simple terms in a way that could be publicised to a general audience”.

But the summary is not just about publicising your research. The summary also:

  • primes your reader’s expectations and understanding of your project
  • helps the funder to identify suitable experts to review your application
  • gives the reviewer a clear, straightforward, and complete overview of your project
  • helps your nominated funding panel introducer to summarise your application for the rest of the decision-making panel
  • acts as a prompt for the other panel members to recall your application, which they may have only skim-read
  • can be used once your project is successful, for a variety of purposes including ethics review, participant recruitment and impact work.
  • will appeal to non-academic panel members, especially for Leverhulme Trust applications.

How to make a mess of a summary

There are three main ways to make a mess of your summary.

1. Concentrating on the context – writing an introduction, not a summary.

I’ve written before about using too much background or introductory material (what I describe as ‘the Star Wars error’) but it’s a particular problem for a summary. The reader needs some context and background, but if it’s more than a sentence or two, it’s probably too much. If you don’t reach the “this research will” tipping point by a third of the way through (or worse, even later), there’s too much background.

2. Writing to avoid spoilers – writing a blurb not a precis

I really admire film editors who produce movie trailers: they capture the essence of the film while minimising spoilers. However, while a film trailer for The Sixth Sense, The Usual Suspects, or The Crying Game should omit certain key elements, a project summary needs to include all of them. An unexpected final-act twist is great for film fans, but not for reviewers: springing a hitherto unheralded extra research question or work package on them is more likely to earn you bafflement and a poor review than an Oscar.

3. Ignoring Plain English

The National Institute for Health Research’s Research for Patient Benefit form asks for a “Plain English summary of research”. As a former regional panel member, I have read many applications and some great examples of Plain English summaries of very complex projects. I have read applications from teams that have not tried at all, and from those whose commitment to Plain English lasts the first three paragraphs, before they lapse back.

Writing in Plain English is hard. It involves finding a way to forget how you usually communicate with colleagues, and putting yourself in the position of someone who knows a fraction of what you do – without dumbing down or patronising. If it wasn’t hard to write in Plain English, we wouldn’t need the expert shorthand of specialist language, which is usually created to simplify complicated concepts and facilitate clear, concise communication among colleagues.

Very few people can write their own Plain English summary. It’s something you probably need help with, and your friendly neighbourhood research development officer might be well placed to do this – and might even draft it for you.

Incidentally, with NIHR schemes, beware not only of using specialist language, but also of using higher-reading-level vocabulary and expressions when simpler ones will do. There’s no need for a superabundance of polysyllabic terminology. The Leverhulme Trust offers some useful guidance on writing for the lay reader.

When to write it

Should you write the summary when you start the application, or when you’ve finished it? Ideally both.

You should sketch out your summary when you start writing – if you can’t produce a bullet point summary of your project you’re probably not ready to write it up as an application. Save plenty of time at the very end to rework it in the light of your completed project. Above all, get as much outside input on your summary as you can. It is the most important part of the application, and well worth your time and trouble.

How can we help researchers with grant applications? The contribution of Research Development professionals

A version of this article first appeared in Funding Insight in February 2019 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

Duck/Rabbit, Joseph Jastrow (1899).

You are the academic expert, in the process of applying for funding to make a major advance in your field. I am not. I am a Research Development Manager – perhaps I have a PhD or MPhil in a cognate or entirely different field, or nothing postgraduate at all. How can I possibly help you?

The answer lies in this difference of experience and perspective. Sure, we may look at the same things, but different levels of knowledge and understanding – as well as different background assumptions – mean we find very different meanings in them. We all look at the world through lenses tinted by our own experiences and expectations, and if we didn’t, we couldn’t make sense of it.

Interpreting funding calls

When academics look at funding calls, they notice and emphasise the elements of the call that suit their agenda, and often downplay or fail to notice other elements. Early in my career I was baffled as to why a very senior professor thought that a funding call was appropriate for a project. He’s smarter than me, more experienced… so obviously I assumed I’d got it wrong. I went back to the call expecting to find my mistake and to confirm that his interpretation was correct. But no… my instinct was right.

Since then I’ve regularly had these conversations, pointing out that an idea would need crowbarring to death to fit a particular call, and even then would be uncompetitive. I’ve had to point out basic eligibility problems that have escaped the finely honed research skills of frighteningly bright people.

When research development professionals like me look at a funding call, we see it through tinted glasses too, but ours are tinted by the comparable calls we’ve seen before. We notice what receives unusual or disproportionate emphasis (or lack of it), and even the significance of what’s missing. We know what we’re looking for and whereabouts in the call we expect to see it. Our reading of calls is enhanced by a deep knowledge of the funder and its priorities, and of what might be the motivation or source of funding behind a particular call.

Of course, some academics have an excellent understanding of particular funders. Especially if they’ve received funding from them, or served on a panel or as a referee, or been invited to a scoping workshop to inform the design and remit of a funding call.

But if you’re not in that position, the chances are that your friendly neighbourhood research development professional can advise you on how to interpret any given funder or scheme, or put you in touch with someone who can.

We can help you identify the most appropriate scheme and call for what you want to do, and, just as importantly, prevent you from wasting your time on bids that are a poor fit. Often the best thing I do on any given day is to talk someone out of spending weeks writing an application that never had any realistic chance of success.

Reading draft applications

You should seek internal expert peer review, and encourage your academic colleagues to be brave enough to criticise your ideas and point out weaknesses in their current iteration. Don’t be Gollum.

Research Development professionals can’t usually offer expert review, but a form of lay review can be just as useful. We may not be experts in your area, but we’ve seen lots and lots of grant applications, good and bad. We have a sense of what works. We know when the balance is wrong. We know when we don’t understand sections that we think we should be able to understand, such as the lay summary. We notice when the significance or unique contribution is not clearly spelled out. We know when the methods are asserted, rather than defended. We know when sections are vague or undercooked, or fudged. Or inconsistent. When research questions appear, disappear, or mutate during the course of an application.

When I meet with academics and they explain their project, I often find there’s a mismatch between what I understood from reading a draft proposal and what they actually meant. It’s very common for only 75%–90% of an idea to be on the page. The rest will be in the mind of the applicant, who will think the missing elements are present in the document because they can’t help but read the draft through the lens of their complete idea.

If your research development colleagues misunderstand or misread your application, it may be because they lack the background, but it’s more likely that what you’ve written isn’t clear enough. There’s a lot to be learned from creative misinterpretation.

None of this is a criticism of academics; it’s true for everyone. We all see our own writing through the prism of what we intend to write, not what we’ve actually written. It’s why this article would be even worse without Research Professional’s editorial team.

A Fantastic ‘Funding Friday’ in Finland

Back in February, I was delighted to be invited to give the keynote talk at the University of Turku’s inaugural Funding Friday event. Before the invitation I didn’t know very much about Finland (other than the joke that in Finland, an extrovert is someone who stares at your shoes) and still less about the Finnish research funding environment. But I presumed (largely, if not entirely, correctly) that there are a great many issues in common, and that advice about writing grant applications would be reasonably universal.

When someone takes Finnish stereotypes too seriously
Finnish Nightmares, by Karoliina Korhonen

When I reached the venue I was slightly surprised to see early arrivals each sitting at their own individual one-person desk. For a moment I did wonder if the Finnish stereotype was true to the extent that even sharing a desk was regarded as excessively extrovert. However, there was a more obvious explanation – it was exam season and the room doubled as an exam hall.

I was very impressed with the Funding Friday event. I was surprised to realise that I’d never been to a university-wide event on research funding – rather, we’ve tended to organise on a Faculty or School basis. The structure of the event was a brief introduction, then my presentation (Applying for Research Funding: Preparations, Proposals, and Post-Mortems), followed by a panel discussion with five UTU academics who served on funding panels. Maria guided the panel through a series of questions about their experiences – how they ended up on a funding panel, what they’d learnt, what they looked for in a proposal, and what really annoyed them – and took questions from the floor. This was a really valuable exercise, and something that I’d like to repeat at Nottingham. I’m always trying to humanise reviewers and panel members in the minds of grant applicants and to help them understand the processes of review and evaluation, and having a range of panel members from across academic disciplines willing to share their experiences was fascinating. Of course, not everyone agreed on everything, but there seemed to be relative uniformity across panels and academic disciplines in terms of what panel members wanted to see, what made their jobs easier, and what irritated them and made things harder.

In the afternoon, we had a series of shorter sessions from UTU’s research funding specialists. Lauri spoke about applying Aristotle’s teachings on rhetoric (ethos, pathos, and logos) to structuring research grant proposals – a really interesting approach that I’d not come across before. What is a grant application if not an attempt to persuade, and what’s rhetoric if not the art of persuasion? Anu talked about funding opportunities relative to career stage, and Johanna discussed the impact agenda, and it was particularly fascinating to hear how that’s viewed in Finland, given its growth and prominence in the UK. From discussions in the room there are clearly worries about the balance between funding for ‘blue skies’ or basic research and for applied research with impact potential. Finally, we heard from Samira, a successful grant winner, about her experiences of applying for funding. It’s great to hear from successful applicants to show that success is possible in spite of dispiriting success rates.

To resubmit, or not to resubmit, that is the question

I’d arrived with the assumption that research – like almost everything else in the Nordic social democracies – would be significantly better funded pro rata than in the UK. (See, for example, the existence of an affordable, reliable railway system with play areas for small children on intercity trains.) However, success rates are broadly comparable. One significant difference between the UK and Finnish funding landscapes is the prevalence of the UK ‘demand management’ agenda. This limits – or even bans – the resubmission of unsuccessful applications, or imposes individual or institutional sanctions/limits on the numbers and timing of future applications. The motivating force behind this is to reduce the burden of peer review and assessment, both on funders and on academic reviewers and panel members. Many UK funders, especially the ESRC, felt that a lot of the applications they were receiving were of poor quality and stood little chance of funding.

Finnish funders take an approach that’s more like the European Research Council or the Marie Curie Fellowship, where resubmissions are not only allowed but often seem to be a part of the process. Apply, be unsuccessful, get some feedback, respond to it, improve the application, and get funded the second or a subsequent time round. However, one problem – as our panel of panel members discussed – is that panel membership varies from year to year, and the panel who very nearly funded your proposal one year is not going to be the same panel who reviews the improved version the following year. For this reason, we probably shouldn’t always expect absolute consistency from panels between years, especially as the application will be up against a different set of rival bids. Also, the feedback may not contain the reasons why an application wasn’t funded, nor instructions on how to make it fundable next time. Sometimes panels will point out the flaws in applications, but can be reluctant to say what needs to be said – that no version of this application, however polished, will ever be competitive. I’ve written previously about deciding whether to resubmit or not, although that was written with the UK context in mind.

The room was very much split on whether those receiving the lowest marks should be prevented from applying again for a time, and even on a more modest limitation on applying again with a similar project. Of course, what the UK system does is move the burden of peer review back to universities, which are often poorly placed to review their own applications, as almost all of their relevant expertise will be named on the bid. But I also worry about a completely open resubmission policy if it’s not accompanied by rigorous feedback, making clear not only how an application could be improved, but also how competitive even the best possible iteration of the idea would be.

One of the themes to emerge from the day was when to resubmit and when to move on. Funding (and paper, and job) rejection is a fact of academic life, calling for more than a measure of determination, resilience, bouncebackability, or (as they say in Finland) sisu. But carried too far, it turns into stubbornness, especially if the same unsuccessful application is submitted over and over again with little or no change. I think most people would accept that there is an element of luck in getting research funding – I’ve seen for myself how one negative comment can prompt others, leading to a criticism spiral that sinks an initially well-received application. Sometimes, by chance, there’s one person on the panel who is a particular subject expert, really likes (or really hates) a particular proposal, and swings the discussion in a way that wouldn’t have happened without their presence. But the existence of an element of luck does not mean that research funding is a lottery in which all you need do is keep buying your ticket until your number comes up. Luck is involved, but only in determining which competitive applications are funded.

I’ve written a couple of posts before (part one, and part two) about what to do when your grant application is unsuccessful, and they might form the beginnings of a strategy to respond and to decide what to do next. At the very least, a thorough review of the application and any feedback offered is in order before making any decisions. My sense is that in any system where resubmissions are an accepted feature, and where it’s common for resubmissions to be successful, it would be a shame to give up after the first attempt. By the twelfth, though…

Watching your language

I was fascinated to learn that responsibility for training in research grant application writing is shared between UTU’s research development team and their English language unit. National funders tend to give the option of writing in English or in Finnish, though writing in English makes it easier to find international referees and reviewers for grant applications – and indeed one of my Business School colleagues is a regular reviewer.

One issue I’m going to continue to think about is support for researchers writing grant applications in a second or additional language. English language support is an obvious service for a university in a country whose own language is not commonly spoken beyond the borders of its immediate neighbours – particularly in Finland, where the language isn’t part of the Indo-European family at all, unlike most of the languages of Europe. But it’s not something we think about much in the UK.

I’d say about half of the researchers I support speak English as a second language, and some of the support I provide is around proofreading and sense-making – expressing ideas clearly and eliminating errors that obscure meaning or might irritate the reader. I tend to think that reviewers will forgive minor mistakes or awkward phrasing in English, provided the application does not contain lazy or careless errors. If a reviewer is to take the time to read it, they want to see that the applicant has taken the time to write it.

I think most universities run courses on academic English, though I suspect most are designed for students. Could we do more for academic staff who want to improve their academic English – not just for grant writing, but for teaching and for writing journal papers? Could we (and should we) normalise that support as part of professional development? Or do we just assume that immersion in an English-speaking country will be sufficient?

However… I do think that academics writing in their second language have one potential advantage. I’ve written elsewhere about the ‘Superabundance of Polysyllabic Terminology’ (aka too many long words) error in grant writing, to which native English speakers are more prone. Second-language academics tend to write more clearly, more simply, and more directly. Over-complicated language can be confusing and/or annoying for a native English speaker reviewing your work, and there’s a decent chance that your reviewers and panel members speak English as a second language, in which case they will be even more irritated. One piece of advice I once heard for writing EU grant applications was to write as if your application were going to be reviewed by someone reading it in their fourth language while waiting to catch a flight. Because it might well be.

It was a real honour to visit Turku, and I’d have loved to stay longer. While there’s a noticeable quietness and reserve about Finnish people – even compared to the UK – everyone I met couldn’t have been more welcoming and friendly. So, to Soile, Lauri, Anu, Johanna, Jeremy, Samira, the Turku hotel receptionist who taught me how to pronounce sisu, everyone else I met, and especially to Maria for organising… kiitos, everyone.