Postdoc Fellowships: Should I Stay or Should I Go?

A version of this article first appeared in Funding Insight in March 2022 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

Is relocation always advisable for a postdoc fellowship, and what if it’s not possible?

Most postdoctoral fellowship programmes encourage potential applicants to move institutions, though the strength of that steer and the importance placed on researcher mobility vary from scheme to scheme. At the extreme end, the EU’s Marie Skłodowska-Curie programme (not exclusively a postdoc scheme) requires international mobility for eligibility.

“Until tomorrow, the whole world is my home…”

In the UK, most schemes have softened their steer over recent years. Where once staying at your current institution required ‘exceptional justification’ or some similar phrasing, there’s now an increasing awareness that researcher mobility doesn’t make sense for everyone and enforcing it has negative ramifications for equality, diversity and inclusion (EDI). It’s much harder and more disruptive for researchers with family commitments to move institutions, and harder for those with partners who are tied to a particular location for family or job reasons. There will be other researchers who are already in the best environment for their research, and so any move would be a backward step. It’s now common for application forms to allow space for both (a) personal/EDI reasons why moving institutions is not possible; and (b) intellectual/research reasons for not wanting to move.

However, there is still a fear that whatever the guidance notes may say, the reality is that reviewers still expect researchers to move for a postdoc fellowship. Or that competitive pressures and limited funds may make it harder for non-mobile proposals to be scored high enough to cross the threshold. It’s not obvious that an exceptional researcher with an exceptional project in a mediocre environment (for whatever reason) could be competitive against rivals who were judged exceptional across all three categories (person, project, place). Even if that researcher had very sound EDI-related reasons for not moving. It’s a tricky issue and there’s no obvious solution, other than a lot more money for fellowships.

It’s worth noting in passing that just because a reviewer has said something, doesn’t mean that the panel paid it any heed. A reviewer may think mobility ought to be compulsory, but the panel will ignore them if that’s not the scheme rule.

Well, there are worse earworms to have…

Why move?

Why do funders want researcher mobility? Funders will say that it’s a good thing, but the reasons are rarely fully articulated. I think there are at least four reasons to look to move:

  • It will grow your network. You already have your contacts and collaborators at your current institution, plus any from previous institutions. Moving institution introduces you to new research groups with different facilities. You can grow your network from one place, but it’s hard to replicate the dramatic expansion that comes from moving.
  • It will expose you to a different culture and way of working. Some things will be better, some worse, but it all contributes to intellectual and professional enrichment. If you’ve not moved, it’s easy to think there are no alternative ways of working when a problem arises.
  • It will allow you to reinvent yourself. If you’re working with researchers who remember you as a PhD student, or even an undergraduate, it’s difficult for colleagues not to continue to see you that way. I know of a few people who’ve been ‘lifers’ at a single institution and experienced a huge rise in their status in the new institution after moving, because their new colleagues have only ever seen them as a dynamic young researcher.
  • It will boost your progression towards independence. Sitting in the same lab with the same people, it’ll be very hard to move out of their shadow. Especially if they’re very senior.

Should I move?

The world’s greatest ever TV theme tune, folks.

Probably yes. Unless you have personal reasons that make moving difficult or impossible, or you’re confident that you’re already in the best place to undertake your research. One factor to consider is how mobile you’ve already been between undergraduate studies and now. The less you’ve moved, the greater the benefits to move now.

Don’t feel disloyal about moving. Good researchers and mentors know that mobility is a good thing for your development, and that your move could potentially strengthen their links with your target institution and boost collaboration. What’s more, your institution is talking to PhDs and postdocs from other institutions about fellowships. This is how things work.

Hopefully you’ll already know people who work at your target institution, and they’ll be able to point you in the right direction. If you don’t, that makes life harder. You could ask colleagues for an introduction and a recommendation, or send your CV and a proposal to the research group you’d like to work with. Copy in a research manager or administrator. They can only say no. Or not reply at all. But good research groups will be delighted to hear from talented researchers who work in a relevant area and are willing to apply for a fellowship.

It’s important to make contact early. You’re not going to get a warm reception if you contact the institution a few weeks before the deadline. They will want to help you shape and improve your proposal, and there will be costings and approvals to agree. Your current host institution can’t help you apply elsewhere; the responsibility is all with the new host.

What if I can’t—or justifiably don’t want to—move?

A few Google searches might tell you how many successful candidates in the fellowship scheme’s last round moved, and how many stayed where they were. If not, you could ask a friendly neighbourhood research development manager if anyone has looked at this before.

If there are at least some successes, you should attempt to address the non-mobility question throughout the application, not just in the boxes where you’re specifically asked about it. If there’s a presumption in favour of moving, and you’re not moving, you need to show that you’ve got a solid plan to achieve as many of the benefits of mobility as possible.

  • Have you moved already? If so, look for a way to stress this and explain how you’ve benefited. Don’t just rely on reviewers seeing it in your career history (that’s often a section that’s skim-read).
  • Can you be mobile within an institution? If you’re moving to a new research group, or your work bridges your old group and new one, you can present that as both a form of mobility and evidence of your pathway towards independence. On that note, no-one is saying that you’re never allowed to speak to your old mentor/supervisor again. But can you put some (physical, intellectual, organisational) distance between the two of you in the application? Can you foreground the collaborations you’ve built, the talented researchers who’ve worked specifically with you?
  • Make a positive case for your current research environment. If it has the right equipment, resources, facilities, collaborators, say so. Don’t merely make the ‘negative’ case for why mobility is difficult or impossible for you. Reviewers don’t need persuading that you’re telling the truth there. Instead, persuade them that your current research environment is outstanding.
  • Can you visit other institutions as part of your fellowship? The factors that make moving institutions difficult presumably also make extended visits difficult. But could you spend a month (or longer) at another research group (maybe even internationally) to, for example, learn a new technique or develop a new collaboration? Even micro-visits can be useful.
  • Have a plan to expand your (academic and non-academic) networks. This could be conference attendance (real or virtual), it could be greater visibility on social media or other channels of communication. It could be volunteering to organise your School’s seminar series. These are all ways of ensuring that you get at least some of the network-expanding benefits of changing institution without actually changing institution.

Better research culture: Some thoughts on the role of Research Development Managers

The Association of Research Managers and Administrators (ARMA) held their annual conference back in November 2022. I was lucky enough to have a submission for an on-demand webinar accepted on the topic of research culture, and in particular on the role of Research Development Managers.

The talk covers ways in which Research Managers (and those in similar roles) can improve research culture, first through our own policies and practices, and second, through positively influencing others. I also (briefly) discuss writing ‘research culture’ into funding applications, before making some final predictions about what the future might hold as regards research culture.

The recording – it’s about half an hour or so of your life that you won’t get back. Because that’s how time works.

The recording features me making a mess of trying to describe myself (not having had to do that before), and includes a few brief references to the broader conference. In my presentation, I assume that copies of my slides will be circulated, but I’ve no idea if they ever were, and if you’re watching now, you certainly won’t have them. That being so, here are the key links from the session.

So you’re new to… UK research funding

A very brief tour of the UK research funding landscape.

A version of this article first appeared in Funding Insight in November 2021 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com.

This was originally published in two parts; I’ve merged them into one and lightly edited it to update and (in the case of EU funding) to try to future-proof!

Paddington (2014)

This article is intended for researchers who have moved to UK academia recently (welcome!) and for UK researchers in the very early stages of their careers. My aim is to give a very brief tour of the UK research funding landscape and help you get to grips with some of the terminology. In the first half, I’ll look at government or public funding and say a bit about different funding models for research. In the second half I’ll touch on EU funding, the major charitable trusts, learned societies, NIHR, and other research charities, and conclude with some general advice on finding research funding opportunities.

The ‘Dual Support’ System and QR Funding

The UK has a dual support system for the public funding of research. The first element is a ‘block grant’—basically a huge chunk of cash—given to UK universities to spend on research as they see fit. The second (which I’ll come to shortly) is support for specific research projects through competitive peer-review processes.

Most of this block grant is Quality-Related (QR) funding, which is allocated to universities on the basis of their research performance as measured through the last Research Excellence Framework (REF). The REF is a huge evaluation exercise that takes place every seven or so years, most recently in 2021. Although we’re well into what would be the new ‘REF cycle’, we don’t yet know what the rules will be for this round, and things could be radically different. Or very similar. At the time of writing, we don’t know.

Alan Partridge (Steve Coogan) doesn’t know anything about the REF either

Universities can spend QR funding on pretty much any research purpose. Typically, it’s used to support academic salaries, research infrastructure, and internal ‘seed funding’ (for smaller, early-stage research projects). It’s a vital source of stable, predictable, flexible core funding for research. Its importance shouldn’t be underestimated.

Although everyone approves of QR funding, you’ll struggle to find many people with a good word to say about the REF. I did have a go at a partial defence once, pointing out inconsistencies in some of the critiques, and I think those points still stand. Although it’s primarily about the distribution of QR funding, the REF is also used within universities to check up on the performance of constituent schools and research groups. Individual researchers’ contributions towards the ‘REF return’ are also often assessed.

While the REF has many detractors, there is little agreement about what might take its place. The REF deserves its own article, but as yet another review is underway at the time of writing, there’s no point in writing it.

Competitive funding for projects – UKRI

A bus heading for Swindon
UKRI is based in fashionable Swindon, and its HQ has a secret entrance to the railway station.

Competitive funding awarded for specific projects or programmes of work is the second arm of the ‘dual support’ system. Most publicly funded competitions for academic research grants in the UK are run via an organisation called UK Research and Innovation (UKRI), which is made up of nine funding bodies: seven research councils, a body called Innovate UK which is involved with R&D in commercial contexts, and another called Research England which, among other activities, helps develop and implement the REF.

Of those nine constituent bodies, the research councils are probably the most important for academic researchers to learn about and understand.

The research councils are:

  • Arts and Humanities Research Council (AHRC)
  • Biotechnology and Biological Sciences Research Council (BBSRC)
  • Economic and Social Research Council (ESRC)
  • Engineering and Physical Sciences Research Council (EPSRC)
  • Medical Research Council (MRC)
  • Natural Environment Research Council (NERC)
  • Science and Technology Facilities Council (STFC)

As you can tell, each of the seven councils has a disciplinary remit that it carries in its name, except for the Science and Technology Facilities Council, which supports research in astronomy and space-related science. It’s also worth mentioning that academic researchers might be involved in grant bids to Innovate UK, but these projects will need to be industry-led or have strong industrial partnerships.

Until 2018, each council was a largely separate entity, with a small coordinating/umbrella body called Research Councils UK. Partly in order to encourage interdisciplinary research, UKRI was created with a remit for more active stewardship and coordination.

Each council runs its own funding calls for specific projects, usually a mixture of directed calls on specific issues and responsive mode funding which is open to any discipline within their remit. Each council will have a more-or-less predictable annual cycle of schemes alongside one-off or occasional calls on specific priorities. Some schemes will have specific deadlines, while others will be ‘open call’ – accepting applications at any time. Confusingly, the phrase ‘open call’ is also sometimes used to mean responsive mode – open to any topic. The research councils have the most money and should be your first port of call when looking for funding.

Under the long-established Haldane principle, funding decisions on individual research projects are taken by experts, not by government. Although the government has a role in strategic direction and budget allocations, the research councils are autonomous. UKRI is an ‘arm’s-length body’ – that is, government is supposed to keep a safe distance from its day-to-day functioning, and therefore UKRI’s funding decisions never have to be signed off by a government minister.

In theory, it shouldn’t be possible for proposals to fall between different research councils with neither willing to take ownership. Remit checks are available, and you should take advantage of them if your work could interest two or more councils or if you are unsure where it fits. Frequently, different research councils will collaborate on grant calls with a specific interdisciplinary purpose.

Funders Future, Funders Past

You might hear about the Advanced Research and Invention Agency (Aria), which I’ve not included in the council list above on the grounds that, when this article was first written, it didn’t yet exist; it now does exist and is indeed independent of UKRI. The ambition for Aria is to be a UK equivalent of Darpa in the US, funding “high risk, high reward” research. It’ll do this by appointing academic programme directors to run particular themed funding calls. While those working in universities welcome more research funding, opinion is divided about the merits and demerits of its governance arrangements and whether Aria really needs to be a separate organisation.

Speaking of things that no longer exist, you might also hear about the Global Challenges Research Fund (GCRF). This was a programme of applied research to support international development, funded from the UK’s international aid budget. But government cuts to that budget brought the scheme to a juddering halt, leading to the curtailment or cancellation of key research projects in some of the world’s poorest countries. Government reneging on funding commitments is widely regarded by researchers as a national disgrace. Even if GCRF returns, trust has been shattered.

Funding Models

UKRI funding is highly prized by UK universities because it pays Full Economic Costs (fEC). I’ve written a separate article detailing how fEC works, but all you need to know for now is that it’s the most attractive financial deal for research because, as the name implies, it means that all the costs of undertaking the research are considered. It’s important to note that a successful grant application will not directly affect your personal salary, though bringing in research funding will strengthen any case for promotion.

Other funders, such as charities, tend not to pay overheads (contributions towards the costs of running a university) or salary costs for investigators, funding only the directly incurred costs of the research. Fortunately, for eligible charity funding the government provides additional support worth approximately 19% of the award value, through a separate budget line of QR funding.
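To make that concrete, here’s a minimal sketch with entirely invented figures (the 19% rate is approximate, as noted above, and eligibility rules vary):

```python
# Toy illustration of the QR 'charity support' top-up described above.
# All figures are hypothetical; the ~19% rate is approximate and rules vary.

charity_award = 250_000                      # direct costs funded by an eligible charity
qr_top_up_rate = 0.19                        # approximate value of the separate QR budget line
qr_top_up = charity_award * qr_top_up_rate   # extra support the university receives

print(f"Charity pays:        £{charity_award:,.0f}")
print(f"QR charity support:  £{qr_top_up:,.0f}")      # roughly £47,500 on these numbers
print(f"Total to university: £{charity_award + qr_top_up:,.0f}")
```

Even with the top-up, fEC-paying grants remain the more attractive financial deal for the university, which is the point made above.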

Even with QR funding and fEC overheads, the funding universities receive for research doesn’t come near to covering its full costs. In practice, university research is subsidised from other sources, such as teaching income (especially from overseas students) and conference and other commercial income.

It’s also worth drawing a distinction between two different categories of research funding – project grants and fellowships. Many funders offer both. Projects are about a particular programme of work, often with multiple co-investigators. Fellowships are about the research too, but they’re also more focused on the individual researcher. At earlier career stages they focus on the personal and professional development of the fellow as well as producing the research findings. At mid and later career stages, they can be about a range of projects or activities carried out by the fellow. Fellowships may involve mentors and collaborators, but usually not co-investigators.

In the second half, I’ll touch on EU funding, the major charitable trusts, learned societies, NIHR, and other research charities, and conclude with some general advice on finding research funding.

European Funding – Horizon Europe

Although the UK has left the European Union, the UK and the EU have agreed that the UK will continue to participate in the EU’s research funding schemes as an ‘Associated Country’. There are already several Associated Country participants including Norway, Turkey, and Switzerland. At the time of writing, the details have yet to be finalised. The UK government has set aside a budget which will fund UK participation in Horizon Europe schemes, including the European Research Council (for frontier research) and Marie Sklodowska-Curie Actions (researcher training and development).

Whatever I write here risks being out of date by the time I press ‘publish’, but at the moment it looks like the UK is back on track to rejoin Horizon Europe after progress was finally made on the Northern Ireland protocol. There are still complex negotiations to take place about funding shares, but prospects are looking much brighter than before, when some nebulous ‘Plan B’ alternative was being discussed.

In the weeks and months after Brexit, there were fears of a ‘chilling effect’: the concern was that, while remaining technically eligible, applications led from the UK or involving the UK would be reviewed less favourably. However, there’s been no evidence of this. In fact, the UK continued to vie with Germany as the country hosting the most prestigious ERC grants in the final calls under Horizon 2020, at a time when Brexit was in full flow, and the UK’s success rates in those competitions improved.

However, there may yet be an effect: if UK-based researchers stop applying for funding, there is a risk that it may become a self-fulfilling prophecy. Even if the politics have changed, geography hasn’t. The UK is still a major research powerhouse and our European colleagues still want to work with us.

Major Charitable Trusts

The two big trusts – the Leverhulme Trust and the Wellcome Trust – are sometimes regarded as quasi-research councils, such is the amount of money they have to spend. Both are funded by investment income: for Leverhulme, a large shareholding in Unilever; for Wellcome, a portfolio purchased with the proceeds of the sale of Wellcome PLC to what is now GlaxoSmithKline. Both run their own schemes and partner with other funders.

The Leverhulme Trust funds research in any academic discipline apart from medical research, and is a particularly important funder for the humanities and social sciences. The Trust offers a suite of standard schemes, including project grants and Fellowships at various career stages, which run on an annual basis. It is particularly interested in fundamental/basic/blue skies research and interdisciplinary research. If your project falls between two or more disciplinary stools, is a passion project, is heterodox, or is high-risk, high-reward, the Leverhulme Trust is well worth a look. Leverhulme also runs larger strategic schemes every few years to which each university can only submit a single application.

The Wellcome Trust funds research into health and wellbeing, including humanities and social science research. They fund work into fundamental biological processes; complexities of human health and disease; and tools, technologies and techniques to benefit health research. They don’t fund translational research or developing/testing/implementing treatments or interventions. With their new strategy, Wellcome have moved to funding longer and more expensive Fellowships and Projects, which has in turn raised their expectations for successful projects.

Wellcome are also partners in a separate, US-based organisation called Wellcome Leap, which (a little like ARIA) draws inspiration from DARPA and funds use-inspired research in the field of human health. They issue complex calls with hyper-short deadlines and turnaround times, usually setting out a programme to be achieved and inviting expressions of interest to participate and contribute towards specific programme goals.

Learned Societies and Academies

These are scholarly societies with royal charters and charitable status which offer research funding from private investments, donations, or government sources. The Royal Society funds the natural sciences; the British Academy funds the humanities and social sciences. The Royal Academy of Engineering and the Academy of Medical Sciences have remits that are more easily guessable.

While they don’t have much money compared to UKRI, they’re often good for Fellowships and for funding for smaller projects which may fall below the minimum funding floor of the relevant research council.

National Institute for Health Research (NIHR)

The NIHR spends public money on research for the benefit of the UK National Health Service (NHS), public health, and social care. More applied and translational than both the Medical Research Council and the Wellcome Trust, the NIHR has a wide range of programmes including Health Services and Delivery; Health Technology Assessment; Research for Patient Benefit; and Efficacy and Mechanism Evaluation.

Other charities

Most charities that fund research are medical charities, including Cancer Research UK, the British Heart Foundation, and Versus Arthritis. But there are a lot of smaller charities too, and it’s a complex picture. A good starting point is the membership list of the Association of Medical Research Charities (AMRC), an umbrella body that supports member charities and requires certain standards of peer review and transparency in decision-making from its members.

Finding Funding

If your institution subscribes to Research Professional, you should set up an email funding alert based on your interests. There are a lot of niche/discipline specific funders that I’ve not mentioned here, and this is an excellent way of finding them. You should also sign up to newsletters from key funders in your discipline area, and/or follow them on Twitter.

You should also talk to your local Research Development Manager and to your new colleagues. You’re not alone in your quest for research funding – they’ll have a lot of experience and could save you a lot of time in finding the best funder and scheme for your ideas.

In the UK, as elsewhere, success rates for grant applications tend to be low. They obviously vary, but generally 25% is regarded as pretty good. There are a lot more good ideas than there is funding available. Putting together a competitive grant application is a major undertaking, so it’s important to consider all of the available options to find the most appropriate funder and scheme. It’s tempting to pounce on the first scheme you see. Don’t. Take your time and get advice to find the right one for you.

Research Grant Application Success rates: An optimist writes….

A version of this article first appeared in Funding Insight in October 2019 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

Success rates for many research funding calls may be low, but a quality, competitive application’s chances of success will be much higher. Adam Golberg tries to look on the bright side of life…

Dark Elf Dice, CC BY-SA 4.0 https://creativecommons.org/licenses/by-sa/4.0, via Wikimedia Commons

When analysing a funding call and deciding whether to apply, it’s always worth finding out the success rate from previous rounds. Some funders are better than others in terms of publicising success rates. Some won’t share them at all, others will hide them away in annual reports, others will publish a lot of details and data, but on relatively hard to find pages on their website. Or they’ll conflate outline and full application stage success rates. If you can’t find success rates easily, ask your friendly neighbourhood research development professional.

One-off or new calls might specify a total budget or expected number of projects to be funded, but obviously won’t have success rates. Changes to funding schemes can make comparisons with previous years less useful, and with multiple-stage schemes (outline, full, and perhaps an interview), it’s probably the success rate at each stage that’s most useful to know. Where calls don’t have success rates – and often even when they do – there will usually be details of approximately how many awards will be made, or what kind of budget is available for this call.

These success rates and the likely number of funded projects can make for depressing reading – success rates in single digits, in the most extreme cases. But don’t get discouraged too quickly.

Overall scheme success rate vs. competitive application success rate

I’d argue that it’s worth thinking in terms of two different success rates. The first is the statistical success rate: the total number of awards divided by the total number of applications. The second is the competitive success rate: the number of awards divided by the number of fundable applications.
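To make the distinction concrete, here’s a minimal sketch with invented numbers showing how far the two rates can diverge:

```python
# Invented numbers: the same funding call viewed through two different success rates.

applications = 800   # total applications received by a hypothetical scheme
fundable = 400       # of those, applications that are eligible, feasible, and competitive
awards = 80          # grants actually made

overall_rate = awards / applications   # the headline figure funders publish
fundable_rate = awards / fundable      # the rate that matters if your application is fundable

print(f"Overall scheme success rate:      {overall_rate:.0%}")   # 10%
print(f"Success rate among fundable bids: {fundable_rate:.0%}")  # 20%
```

On numbers like these, a genuinely fundable application faces odds twice as good as the headline rate suggests – which is exactly the ‘double the overall success rate’ point made below.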

What makes an application ‘fundable’?

  • Eligibility: not eligible = automatically unsuccessful.
  • Significance and competitiveness – not merely relevance to the remit of the call. It must have the clear potential to make a significant contribution to the goals and objectives of the call at the scale expected.
  • Feasibility – in terms of methods, access to data, power calculations, management plan, relations with partners, budgets/resources. Can this be done as proposed?
  • Consistency – research questions don’t mutate or appear and disappear; different sections of the application reinforce rather than contradict each other.
  • Clarity – if your application is unclear, you risk referees choosing the least sympathetic reading of any sections that are ambiguous or under-specified. Worse, they might conclude that you haven’t thought it through. Your proposal should have been through multiple drafts and checked repeatedly.

If your application ticks all of these boxes, you probably have a competitive application, and ‘your’ likely success rate could well be double the overall success rate. The overall success rate includes rushed or undercooked applications; the crowbarred-to-fit-the-remit; the ineligible; the incomprehensible; the only-incremental-progress; the only-submitted-to-appease-the-Head-of-School; the fundamentally misconceived; the lacking-in-novelty; the missing-key-elements-of-the-literature.

Two reasons not to get too excited. The first is that even double the standard success rate means the odds are very much against you for the majority of funding calls. The second is that most applicants think their application ticks all the boxes and won’t number among the unfundable bids dragging down the overall success rate. Probably a few people are knowingly risking a long shot, for better reasons or worse, but most will be confident in their proposal.

So how do you tell if you have the potential to submit a competitive, fundable application? Well, the fact that you’re thinking of research funding as a competition is a good start. Probably the best way is to get external input – from your Research Development Manager  (or equivalent) and from senior academic colleagues – at the earliest possible stage. It’s impossible to read a funding call without seeing it through the tinted lenses of your own research ideas and your own expectations. Then you need to take a realistic view about your starting point in terms of the development of the ideas and the team, what the application form requires, the likely success rate for quality applications, the time and energy you and your team have available, and what else you might have done with that time.

One-off calls – find the size of the prize

One bit of advice I used to give was to see how many projects or fellowships are likely to be funded, and then come to a view about whether your proposal is likely to be competitive in terms of significance.  If there are twenty early career fellowships available, are you likely to be among the twenty strongest applicants in terms of track record and quality and significance of your proposal?

However, I now think the question to ask is subtly different. Is your application likely to be among the twenty strongest who will actually apply, rather than among those who might conceivably apply? There will always be a proportion of potentially strong rivals who don’t apply for whatever reason – they don’t have the time; they don’t have the energy; it’s the wrong stage in the research cycle; they don’t know about the call; or they have other irons in the fire.

Another reason why I no longer pose the question so bluntly is that an outstanding early career researcher pointed out to me that it might encourage the wrong people and discourage the right people. The Dunning-Kruger effect is the tendency of those who are skilled at a particular task to underestimate their own skill and overestimate the skill of others, while those lacking in skill overestimate their own abilities and find it harder to recognise genuine skill in others. So those who aren’t outstanding candidates are more likely to wrongly believe they are, while those who might be are more likely to doubt themselves.

Reasons to be cheerful, part III

Somebody has to win. Individuals and teams are winning those grants. Yes, there is an element of luck involved – which referees are selected, who is on the panel, who speaks up for or against your proposal, what rival bids propose and whether that complements or conflicts with yours. But there’s little you can do about any of that. Your job is to make sure, when deciding to apply, that you can produce a competitive application by the deadline. An eligible, feasible application offering a significant contribution, speaking loudly and clearly to the remit, and written up with clarity and consistency. An application that has every chance of clearing every hurdle and still being in contention on the final straight. Manage that, and you can expect ‘your’ success rate to be significantly better than the scheme average.


My defined contribution to the UCU strike ballot debate

At the time of writing, UCU are balloting on strike action in response to (among other things) draconian cuts to USS. My gut reaction is also three letters… FFS.

Though – spoiler alert – I am going to be voting for strike action very reluctantly and with a very heavy heart.

“Freedom for the University of Tooting!”

I wrote a post about strike action and the importance of union membership back in 2013 and on the pensions strike back in 2018. I think both posts hold up pretty well. But briefly, and contrary to popular demand, here are all the things I think about pensions.

  • Pension planning seems like a technocratic problem that ought to have a technocratic solution. Or, more properly, a range of technocratic solutions to choose from, depending on our priorities and preferences. Which, again, we can talk about.
  • Is there a genuine problem with the pension scheme that’s not been resolved by all of the many, many previous pension cuts we’ve had since I signed up to USS about twenty years ago? Each time we were promised that this cut would resolve the genuine problems with our pension scheme. Each time it hasn’t. If there is still a problem, this ought to be understandable and communicable. And something that can be negotiated about, around, and through.
  • But UUK/university management has made this impossible through failures of transparency, dubious consultations, and a level of spin that borders on the Trumpian. It’s all massively counterproductive – we’re not stupid, so stop treating us as if we are. Those minded to at least entertain the thought that there’s an issue with our pension scheme don’t trust UUK, because they have acted – and continue to act – in bad faith.
  • My suspicion is that they’ll be back again, and again, and again, and again for as long as they can get away with it. Same arguments each time. Back in 2018 we needed draconian cuts, apparently, and then after sustained industrial action, we didn’t any more. It’s almost as if… etc and so on. Universities may not be for-profit, but university management wants surpluses for reinvestment in their pet projects (at least some of which are genuinely good ideas) because they tend to want to make their mark. So it is in their interests to drive costs down as low as possible and keep them there.
  • Colleagues not paying into their pensions because contributions are too high is a problem with our pension. Though framing the issue as if this were the sole consideration, without mentioning the, you know, massive cuts, is disingenuous in the extreme.
  • Pensions are not a perk, but deferred salary. Organisations whose continued existence is very certain (broadly, public services) are in a position to provide better pensions. As a trade-off, salaries are lower. We knew this when we chose our careers and expect the deal to be honoured. Why should we have better pensions than some other sectors? Because that was always part of the deal.
  • I hate being on strike. I hate arranging to have some of my work covered by colleagues who are themselves busy, especially when we’re several posts down. I hate the divisions it causes. I hate the stress it imposes and the difficult decisions about who to let down and how. I hate coming back to work to find that I’ve got a huge backlog that only I can clear. I don’t like not getting paid, and essentially having to work for free to catch up.
  • Some people like being on strike and the conflict and the associated rituals and the ‘winter of discontent’ cosplaying just that little bit too much.
  • Media coverage of all industrial action is always disgracefully one-sided. Management wants ‘reform’ and is presented positively… management talking points always lead and are never challenged by the reporter. Workers striking over ‘pay and conditions’ are presented as selfish or short-sighted. And are always challenged. The framing is always that of management. Always. Strikers will be vilified – ‘won’t somebody think of the [whoever is inconvenienced]’ – with no sense of self-awareness. The work that the strikers do isn’t important until they stop doing it, apparently. Often they’ll quote someone affected saying how annoyed they are. Again, this will be framed as the fault of the strikers rather than the failure of employers to manage their industrial relations in a competent manner. That question is never even asked, never mind answered. It’s a dance as old as time. Or at least as old as capitalism.

The Four Fights

But it’s not just about pensions. It’s about the ‘Four Fights’ too.

  • address the scandal of the gender, ethnic, and disability pay gap
  • end contract casualisation and rising job insecurity
  • tackle the rising workloads driving our members to breaking point
  • increase to all spine points on the national pay scale of £2,500 [to make up for a 17.6% real terms pay cut between 2009-2019]

(UCU website, accessed 18th Oct 2021)

All laudable goals. Especially the pay gaps… it really ought not to be beyond the wit of folks of good will in university management and UCU to come up with an action plan to start to address this. Granted, we cannot solve the problems of discrimination and inequality in wider society, but we can do our bit, and it ought not to be that expensive. I don’t really understand why this is so hard to agree on.

The others, though. Pragmatically… how are they to be achieved? And can they be achieved without costing more? And if not, can we afford all of them, and how do we prioritise?

Let’s get a few red herrings out of the way first.

First, you might very reasonably be very cross about the above-average-inflation pay awards to some vice chancellors and some senior university staff. You might be one of those people who – consistently – thinks this is an issue across the whole economy. Or you may be one of those people who – inconsistently – thinks nothing of the worst excesses of the private sector’s snouts-in-troughery, but objects to anyone in public service being ‘paid more than the Prime Minister’. But… even if we cut executive pay by… let’s say 1/3… this will give us nowhere near enough money by itself to address any of our issues.

Second, you might form the view that there are too many ‘faceless’ managers and administrators. If so, I would invite you to (a) read this piece and reconsider; (b) reflect on the fact that ‘faceless’ just means you don’t know them or understand what they do; and (c)… we’re right here, folks. Striking alongside you. Those of us who can afford to.

“Down with this sort of thing!”

Low(er) cost solutions

Let’s consider what could be done quickly and relatively cheaply. How far can humane management/HR practices go in addressing casualisation and job insecurity? A fair bit, I’d imagine. We could be much better at giving researchers permanent or open-ended contracts. Even if the reality is that redundancy processes can and will still be used to terminate contracts where there’s a lack of funding. We can treat our fixed term staff better, and we can take our duty to support the development of our staff much more seriously. We should be setting them up for their next role, whether that’s with the same institution or elsewhere.

Demand for academic posts exceeds supply. This is a topic for another blogpost, because scarcity of opportunity and funding are wicked problems which drive a lot of what’s wrong with research culture. But we could do better about not ruthlessly exploiting that fact for fun and profit. To avoid a race to the bottom for competitive advantage, we need sector wide norms and standards. And as far as I understand it (and correct me if I’m wrong, which I often am) this is what’s being resisted. I don’t believe that we have the most humane management/HR practices, and that’s why I’m reluctantly supporting industrial action on this point.

Can we tackle rising workloads without spending a lot of money? Again, there are certainly some things we can do. I’ve been coordinating my university’s response to the Review of Research Bureaucracy, which has been an eye-opening experience. It’s only looked at externally imposed bureaucracy, and only research bureaucracy. It may be that the real issue is internally imposed research bureaucracy, teaching bureaucracy (especially), and, well, administrative bureaucracy. I’m sure there’s more that can be done, and some of that may involve employing more administrative and managerial support. My vision is of a university where academics do academia, and administrators and managers do administration and management. And we do leadership together.

We might expect universities to take a long, hard look at what’s expected of the average academic and review which expectations are reasonable. Too many institutions have grant income targets that are scarcely on a nodding acquaintance with reality. They appear not to understand that limited funding means that for some to succeed, others must fail. I know it’s fashionable to blame the REF for everything, but actually the last REF’s move away from demanding four publications per researcher opened the door to greater flexibility of roles and expectations.

But for all the talk of ‘be kind’ and yoga and wellness and mindfulness and whatever else, there’s still far too much unkind management and too many unrealistic expectations. Personally, I’m currently lucky to benefit from supportive, enlightened and empowering management (hello, if you’re reading), but I’ve also experienced the opposite and there’s far too much of it about. Whether sector-wide strike action is the way to address rising workloads, I’m not sure. What could we do at a national, sector-wide level? What would that look like? I’m convinced of the importance of the issue, but less convinced that national strike action is the mechanism to resolve it. But I’m open to persuasion.

Fantasy Head of School

But another possible response to rising workloads is… well… sorry… it’s casualisation and job insecurity.

Sandra Oh, in ‘The Chair’ (Netflix)

Let’s play Fantasy Head of School. Or University Management Simulator. Hypothetically, anyway. Pressures on your permanent staff too great? Use your limited resources to buy in as much extra teaching capacity as possible… which means sessional teachers and short term teaching fellowships or teaching-focused roles. Or… we treat those staff better, give them more professional development time, more scholarship time, and we get less teaching capacity for our buck. And increase workloads.

Look, if you don’t know who this is, just go and watch Community. Thank me later. It’s much funnier than ‘The Chair’.

That’s not the only issue – creating better bottom-rung-of-the-academic-ladder jobs – more hours, longer contracts – almost certainly means fewer such opportunities. Is that a good thing? I think, on balance, probably… but it’s not straightforward. My one remaining reader will no doubt be sounding the ‘false dichotomy’ klaxon at this point. Correctly so. We can, of course, find a compromise or a balance of sorts, but let’s not pretend it’s straightforward. We can’t have everything.

Do we employ more staff (to reduce workloads) on more secure contracts (to reduce insecurity)? Or do we address the real-terms 17.6% pay cut by increasing all spine points by £2.5k? And – dare I say it – by paying higher employers’ pension contributions, if that is indeed actually needed. What gets priority? You’ll forgive me if I would rather see any £££ for pay rises focused at the lower end of the pay scale rather than giving Profs another two and a half grand. Whisper it quietly, but I’d rather spend it on the lowest paid university employees, who tend to be represented by UNITE or UNISON rather than UCU. And a focus on the lowest paid/lower spine grades and spine points might also be a good way to start addressing pay gaps.

Pay costs and non-pay costs

As Fantasy University Manager, could we hack away at non-pay costs? Conference funding? Seedcorn funding for new research ideas? Research facilities and infrastructure? The university’s core infrastructure and systems which – when working well – create efficiency savings and minimise friction? Estates and buildings? Student spaces? Lecture theatres? Seminar rooms? One of my constant frustrations as a Research Development Manager is working with brilliant colleagues with outstanding ideas who we can’t support with kit/seedcorn ££/infrastructure as they deserve.

I’ve read in a number of places that the percentage of average university income spent on staff costs has been in decline for some time. The best source I can find for this is this UCU article from 2017. I’m wary about trying to dive into HESA stats, as I’m not competent to play with financial data without water wings and a lifeguard. If anyone has any better sources or more up-to-date info, please let me know via twitter, email, or in the comments. This decline may or may not coincide with a long run of real-terms pay cuts, and that may be related. Or not. I’m also not sure what the percentage of staff costs for an organisation ought to look like… my instinct is that under 54.6% seems very, very low. But I’m not sure why I think that… some half-remembered presentation? Or a Business School research grant application? But if it is low, I don’t know why that might be, or what it might mean.

I’m not sure what I think about grand estates/infrastructure projects. Obviously some have gone very well, others very badly. Can we reduce investment on estates and infrastructure to spend more on staffing? There’s a balance to be struck. One option is that we say the balance has swung too far, and we cut back and spend more on staff. Another option is that we end everything but essential maintenance to spend more on staff, but that’s not sustainable in the long run. Unless we want dilapidated lecture theatres and ageing research kit, because if that happened we’d be the first to complain about a lack of investment.

Let’s assume for the sake of argument that the pendulum has swung too far, and that there would be extra money at all or most institutions if they were to cut back, delay, or cancel some estates and infrastructure projects. Even on top of whatever COVID-related cuts have been made. If there is that money available, how do we spend it? Because I’m not convinced that there’s enough of a saving there to cover everything that UCU is asking for.

There isn’t a magic money tree. Pragmatically speaking. The resource envelope is what it is. Unless anyone is willing to spend, spend, spend and dare the government to shut them down or bail them out. Perhaps £££ will be increased under a future government of a more progressive frame of mind, willing to invest more in public and quasi-public services. But that won’t happen in the short, or perhaps even medium, term. And when it does, I suspect that universities will be some way down the priority list. As Fantasy Head of School, you need to make decisions now.

I’m aware, of course, that UCU’s demands are a wish list, a negotiating position. It’s also a way of achieving a broad consensus among colleagues whose interests are not precisely aligned. If we look at the Four Fights and the Pensions situation purely selfishly, we’d not all have the same list of priorities.

But ultimately, we have a long list of demands. Some of which can be met or addressed without prohibitively expensive measures… but for others, if there is money available, we’ll need to prioritise. And that prioritisation is likely to be controversial and uncomfortable. And we can either engage with prioritisation, or we can leave it to university management.

I know which I’d rather do.

Reviewing Internal Peer Review of Grant Applications, Part 2: How to make it work better

A version of this article first appeared in Funding Insight in April 2019 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

We can leap higher with assistance than we can on our own. Picture: Darren England (AAP) via ABC News.

Most universities have internal peer review processes for grant applications. In part one I discussed the different purposes of internal peer review and how they can cause confusion. I also wrote about how to ensure that we present internal peer review as helpful and supportive rather than a hurdle to be overcome. In this second and final part, I’m going to look at how we might do internal peer review of grant applications better.

Who do we ask to review?

The ideal reviewer is a senior academic with a track record of success with major research funding applications and some insight into the subject area. Even at research-intensive institutions, such people are in limited supply and their time is valuable. This is especially true for reviewers in development-related topics, because of the volume of Global Challenges Research Fund (GCRF) bids. [Alas, this example from 2019 has dated poorly.] Our instinct is to ask senior Profs, but I wonder if a closer review by someone less senior could be more useful. We should think beyond the usual suspects, as reviewing can be a developmental exercise. My experience has been that researchers who are rarely asked to review often throw themselves into the task with a lot more enthusiasm. They’re often delighted to be asked, and keen to do a good job.

Do we make internal peer review anonymous?

This is tricky. In my view – ideally – no. Being able to put feedback in the context of the reviewer’s background can be very valuable. I also think that people should be willing to stand behind their comments.

However, because internal peer review can have a filtering role, perhaps the protection of anonymity is required for reviewers to be willing to say that proposals shouldn’t go forward. Or perhaps even to be willing to criticise colleagues’ work at all. That said, I would expect the rationale for soft-filtering out an application to be one that most applicants would accept and understand. For a hard filter – when only a set number of applications can go forward from the institution – there would usually be a committee decision bound by collective responsibility. I’m not aware of any research or internal survey work on internal peer reviewers and their attitudes to anonymisation, and I’d be interested to see if anyone has looked at this.

How do we ask reviewers to review?

It’s not obvious how to review a grant application. Those without much experience may be reluctant to trust their instincts or judgement because “it’s not really my area”. A small number go the other way, becoming power-crazed at the chance to sit in judgement – assessing the proposal from their own personal, partisan perspective and completely writing off entire academic disciplines and sub-disciplines.

One option is to ask reviewers to use the same form that the funder in question gives to referees or panel members. It’s a great idea in principle, but academics typically have a loathe-hate relationship with forms. But there are some specific questions we could ask reviewers in a structured way, or use as prompts. Fewer questions will get better answers.

  • What isn’t clear? What’s confusing or ambiguous?
  • What are the potential weaknesses?
  • What’s missing?
  • How could the application be improved?

If I were to ask a single question, it would be the pre-mortem.

If I could see into the future and tell you now that this application is not going to be funded, what will be the main reason?

This question helps home in on key weaknesses: it might be fit to the call; it might be unclear methodology; it might be weak impact pathways; it might be the composition of the research team. It’s a good question for applicants to ask themselves.

How do we feed back?

It’s not enough for feedback to be correct, it must be presented in a way that maximises the chances that the PI will listen.

Ideally, I’d like a face-to-face meeting involving the internal reviewers, the Research Development Manager, the PI and possibly the co-investigators. The meeting would be a discussion of a full draft in which reviewers can offer their views and advice and the PI can respond, and ask questions about their impressions of the proposal. I like face-to-face meetings because of the feedback multiplier effect – one reviewer makes an observation, which triggers another in the second reviewer. A PI response to a particular point triggers a further observation or suggestion. If approached in the right spirit (and if well-chaired) this should be a constructive and supportive meeting aimed at maximising the applicant’s chances of success. It must not be a Dragon’s Den style ordeal.

In reality, with packed diaries and short notice calls, it’s going to be difficult to arrange such meetings. So we often have to default to email, which needs a lot of care, as nuance of tone and meaning can be lost. I would advise that feedback is sent through an intermediary – another task for your friendly neighbourhood research development manager – who can think about how to pass it on. Whether to forward it verbatim, add context or comments, or smooth off some abrasive edges. I’ve had a reviewer email me to say that she’s really busy and could I repackage her comments for forwarding? Happy to.

A good approach is to depersonalise the applicant – address the feedback to the draft application, not its authors. (“The current draft could be clearer on…” versus “You could be clearer on…”). But I think depersonalising the reviewers and their comments is a mistake – impersonal, formal language can come over as officious, high-handed, and passive-aggressive. It will make applicants less likely to engage, even if the advice is solid. Using (even rhetorical) questions rather than blunt statements invites engagement and reflection, rather than passing final judgement.

Which would you respond to best?

The panel’s view is that your summary section is poor and is an introduction to the topic, not a proper summary of your whole project. You should rewrite before submitting.

Or….

Could the summary be strengthened? We thought the draft version read more like an introduction to the topic, and we think reviewers are looking for a summary of the complete proposal in a nutshell. Is there time to revisit this section so it better summarises the project as a whole?

Institutions invest time and money in having arrangements that provide prospective PIs with detailed feedback from senior academic colleagues to improve their chances of success. But it’s all for nothing if the resulting advice is ineffective because of the way the feedback is communicated, or the way the whole process is presented or perceived by researchers.

Reviewing Internal Peer Review of Grant Applications, Part 1: Helping or Hoop-jumping?

“The review panel is concerned that your methodology is under-specified”
(WF Yeames, ‘When did you last see your father?’)

A version of this article first appeared in Funding Insight in April 2019 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

Most universities have internal peer review processes for research grant applications. In the first of two articles about internal peer review, I wonder whether what ought to be valuable support can be perceived as an obstacle. Part two looks at how we might run peer review more effectively.

Why do we have internal peer review?

Internal peer review of research grant applications has two distinct functions which can easily become blurred. I think this can cause misunderstandings.

The first function is as a filter – to select which applications go forward and which do not. This has two variants: a ‘hard filter’ for a scheme or funder with formal limits on the number of applications that one institution can submit, and a ‘soft filter’ where there are no formal limits on application numbers but there’s a steer from the funder to submit only the most competitive applications. Another motivation for a soft filter is to save academic time by slowing, stopping, or redirecting uncompetitive applications.

The second function is to improve the quality of the application. The goal is to produce some actionable suggestions for improvements to increase the chance of success. In a previous article I explained how research development staff can bring a fresh perspective. Comments from a senior academic of comparable standing to the expert reviewers or funding panel members can be similarly helpful, but with the added benefit of academic expertise.

Both functions of peer review – filtering and improving – are often rolled together into one process. Perhaps this causes confusion both for reviewers and the reviewed. I wonder if we over-emphasise the role of the filter at the expense of the improvement. Does fear of the filter reduce the efficacy of the suggestions for improvement?

Perceptions of internal peer review

When discussing internal peer review with academic colleagues, I’ve seen wildly different reactions. Some are very enthusiastic and are hungry for comments and feedback. Others are a bit more…. Gollum and don’t want anyone to gaze upon their precious. Most are somewhere in the middle… welcoming of genuinely useful comments and insights, but wary about being forced to make changes against their better judgement or being prevented from applying at all.

There’s no denying that the ‘filter’ role exists, and it would be a mistake to pretend otherwise. I reassure academics that in my experience it’s very rare for a bid to be soft-filtered out because of internal reviewers’ comments and for the applicant to disagree with the rationale. Usually the reviewer has spotted something that the applicant missed, related to the call, the application, or the underpinning idea. Perhaps the bid needs another co-I, stronger stakeholder engagement, engagement with a particular body of literature, or just a lot more time to develop. Or the issue is the fit to the funder or the call.

Research development staff send out details of calls with internal timetables and internal deadlines for the various review stages. But are potential applicants seeing peer review (and its associated deadlines) as a developmental process put in place to support them and to help them succeed? Or do they see peer review as a barrier to be overcome (or even evaded), placed in their path by over-officious Heads of Research and research managers who seek to micromanage, police and restrict?

I sometimes worry that in our desire to prevent and pre-empt disruptive last-minute applications and to set out an orderly and timely process, we end up sending the wrong message about peer review and about the broader support available. If we’re dictating terms and timetables for peer review, do we make it look as if grant applicants must fit around reviewer (and research support) requirements and timescales? And is that the right way around?

To be clear, I’m certainly not arguing against having a structured process with indicative milestones and some level of enforcement. Unplanned last-minute applications are disruptive and stressful, forcing people to drop everything to provide support with no notice. Worst of all, the applications that result usually aren’t very good; rushed applications are seldom competitive. We absolutely should try to save people from this kind of folly.

And… of course, we need to allow time for senior (and therefore busy) academics to undertake internal peer review. I suspect that most institutions rely on a relatively small pool of reviewers who are asked to read and comment on multiple applications per year, and that few get any formal workload allocation. While we should certainly give applicants plenty of time to write their applications, we need to treat our reviewers with consideration and value their time.

Positive about internal peer review

I’m not arguing that we disguise or minimise the ‘filter’ element in favour of an unqualifiedly upbeat presentation of internal peer review as being entirely about improving the quality of the application. But perhaps we could look at ways to present internal peer review in a more positive, supportive, developmental – and less officious – light.

The most important part of peer review positivity – and the subject of the second part of this series – is in how internal peer review happens in practice: who reviews, how and when; and how and in what spirit reviewer comments are communicated to applicants. If internal peer review as a process helps strengthen applications, word will get round and support and buy-in will grow – one positive experience at a time.

But even before that stage, I think it’s worth thinking about how we communicate our internal peer review processes and timetables. Could we be more positive in our framing and communication? Could we present internal peer review more as a helping hand to climb higher, and less as a hurdle to overcome?

An applicant’s guide to Full Economic Costing

A version of this article first appeared in Funding Insight in July 2019 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com

You’re applying for UK research council funding and suddenly you’re confronted with massive overhead costs. Adam Golberg tries to explain what you need to know.

Trying to explain Full Economic Costing is not straightforward. For current purposes, I’ll be assuming that you’re an academic applying for UK Research Council funding; that you want to know enough to understand your budget; and that you don’t really want to know much more than that.

If you do already know a lot about costing or research finances, be warned – this article contains simplifications, generalisations, and omissions, and you may not like it.

What are Full Economic Costs, and why are they taking up so much of my budget?

Full Economic Costs (fEC) are paid as part of UK Research and Innovation grants to cover a fair share of the wider costs of running the university – the infrastructure that supports your research. There are a few different cost categories, but you don’t need to worry about the distinctions.

Every UK university calculates its own overhead rates using a common methodology. I’m not going to try to explain how this works, because (a) I don’t know; and (b) you don’t need to know. Most other research funders (charities, EU funders, industry) do not pay fEC for most of their schemes. However, qualifying peer-reviewed charity funding does attract a hidden overhead of around 19% through QR funding (the same source as REF funding). But it’s so well hidden that a lot of people don’t know about it. And that’s not important right now.

How does fEC work?

In effect, this methodology produces a flat daily overhead rate charged in proportion to the academic time on your project. This rate is the same for the time of the most senior professor and the earliest of early career researchers.

One effect of this is to make postdoc researchers seem proportionally more expensive. Senior academics are more expensive because of higher employment costs (salary etc), but the overheads generated by both will be the same. Don’t be surprised if the overheads generated by a full-time researcher are greater than her employment costs.

All fEC costs are calculated at today’s rates. Inflation and increments will be added later to the final award value.

Do we have to charge fEC overheads?

Yes. This is a methodology that all universities use to make sure that research is funded properly, and there are good arguments for not undercutting each other. Rest assured that everyone – including your competitors – is playing by the same rules and ends up with broadly comparable rates. Reviewers are not going to be shocked by your overhead costs compared to rival bids. Your university is not shooting itself (or you) in the foot.

There are fairness reasons not to waive overheads. The point of the Research Councils is to fund the best individual research proposals regardless of the university they come from, while the REF (through QR) funds broad, sustained research excellence based on historical performance. If we start waiving overheads, wealthier universities will have an unfair advantage, as they can waive while others drown.

Further, the budget allocations set by funders are decided with fEC overheads in mind. They’re expecting overhead costs. If your project is too expensive for the call, the problem is with your proposal, not with overheads. Either it contains activities that shouldn’t be there, or there’s a problem with the scope and scale of what you propose.

However, there are (major) funding calls where “evidence of institutional commitment” is expected. This could include a waiver of some overheads, but more likely it will be contributions in kind – some free academic staff time, a PhD studentship, new facilities, a separate funding stream for related work. Different universities have different policies on co-funding and it probably won’t hurt to ask. But ask early (because approval is likely to be complex) and have an idea of what you want.

What’s this 80% business?

This is where things get unnecessarily complicated. Costs are calculated at 100% fEC but paid by the research councils at 80%, leaving the remaining 20% to be covered by the university. Fortunately, the overheads paid on the grant are usually enough to cover the missing 20% of the direct costs. However, if you have a lot of non-pay costs and relatively little academic staff time, check with your costings team that the project is still affordable.

Why 80%? In around 2005 it was deemed ‘affordable’ – a compromise figure intended to make a significant contribution to university costs but without breaking the bank. Again, you don’t need to worry about any of this.
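To make those mechanics concrete, here’s a minimal sketch of the arithmetic as described above. Every figure in it (the daily overhead rate, the salary and non-pay costs) is invented for illustration and is not a real rate from any institution, but the structure follows what’s been said so far: overheads are driven by academic time at a flat daily rate, everything is costed at 100% fEC, and the research councils pay 80% of that total.

```python
# Illustrative sketch only: all figures below are invented, not real fEC rates.
# Mechanics: overheads follow academic time at a flat daily rate, the whole
# project is costed at 100% fEC, and the funder pays 80% of that total.

DAILY_OVERHEAD_RATE = 500   # hypothetical flat overhead rate per academic day
FUNDER_SHARE = 0.80         # research councils pay 80% of full economic cost

def project_costs(academic_days, academic_salary_costs, other_direct_costs):
    """Return (100% fEC total, amount the funder pays, amount the university covers)."""
    overheads = academic_days * DAILY_OVERHEAD_RATE   # only academic time attracts overheads
    full_economic_cost = academic_salary_costs + other_direct_costs + overheads
    funder_pays = FUNDER_SHARE * full_economic_cost
    university_covers = full_economic_cost - funder_pays
    return full_economic_cost, funder_pays, university_covers

# Example with invented numbers: 220 academic days, £60k salary costs, £20k non-pay costs
fec, paid, gap = project_costs(220, 60_000, 20_000)
print(f"100% fEC: £{fec:,.0f} | funder pays: £{paid:,.0f} | university covers: £{gap:,.0f}")
```

On these invented numbers the funder pays £152,000, which comfortably covers the £80,000 of direct costs, with the balance contributing towards the university’s overheads – the pattern described above. A project that was very heavy on non-pay costs and light on academic time could tip the other way, which is why the affordability check with your costings team matters.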

Can I game the fEC system, and if so, how?

Academic time is what drives overheads, so reducing academic time reduces overheads. One way to do this is to think about whether you really need as much researcher time on the project. If you really need to save money, could contracts finish earlier or start later in the project?

Note that non-academic time (project administrators, managers, technicians) does not attract overheads, and so is good value for money under this system. If some of the tasks you’d like your research associate to do are project management or administration tasks, your budget will go further if you cost in administrative time instead.

However, if your final application has unrealistically low amounts of academic time and/or costs in administrators to do researcher roles, the panel will conclude that either (a) you don’t understand the resource implications of your own proposal; or (b) a lack of resources means the project risks being unable to achieve its stated aims. Either way, it won’t be funded. Funding panels are especially alert to ‘salami projects’, which include lots of individual co-investigators for thin slivers of time in which the programme of research cannot possibly be completed, and to undercooked projects which put too much of a burden on not enough postdoc researcher time. As mentioned earlier, if the project is too big for the call budget, the problem is with your project.

The best way to game fEC is not to worry about it. If you have support with your research costings, you’ll be working with someone who can cost your application and advise you on where and how it can be tweaked and what costs are eligible. That’s their job – leave it to them, trust what they tell you, and use the time saved to write the rest of the application.

Thanks to Nathaniel Golden (Nottingham Trent) and Jonathan Hollands (University of Nottingham) for invaluable comments on earlier versions of this article. Any errors that remain are my own.

Research Development – supporting new academic disciplines

I’ve recently moved from a role supporting the Business School and the School of Economics to a central role at the University of Nottingham, looking after our engagement with research charities. I’m going from a role where I know a few corners of the university very well to a role where I’m going to have to get to know more about much more of it.

“Don’t panic!”

My academic background (such as it is) is in political philosophy and for most of my research development career I’ve been supporting (broadly) social sciences, with a few outliers. I’m now trying to develop my understanding of academic disciplines that I have little background or experience in – medical research, life sciences, physics, biochemistry etc. I suspect the answer is just time, practice, familiarity, confidence (and Wikipedia), but I found myself wondering if there are any short cuts or particularly good resources to speed things up.

Fortunately, if you’re a member of ARMA, you’re never on your own, and I sent an email around the Research Development Special Interest Group email list, with a promise (a) to write up contributions as a blog post and (b) to add some hints and tips of my own, especially for the social sciences.

So here goes… the collated and collected wisdom of the SIG… bookmark this post and revisit it if your remit changes…

Don’t panic… and focus on what you can do

In my original email, the first requirement I suggested was ‘time’, and that’s been echoed in a lot of the responses. “Time, practice, familiarity, confidence (and Wikipedia)”, as Chris Hewson puts it. It’s easy to be overwhelmed by a sea of new faces and names and an alphabet soup of new acronyms – and to regard other people’s hard-won institutional/school/faculty knowledge as some kind of magical superpower.

Lorna Wilson suggests that disciplinary differences are overrated and “sometimes the narrative of ‘difference’ is what makes things harder. The skills and expertise we have as research development professionals are transferable across the board, and I think that the silos of disciplines led to a silo-ing of roles (especially in larger universities). With the changes in the external landscape and push with more challenge-led interdisciplinary projects, the silos of disciplines AND of roles I think is eroding.”

But there are differences in practices and norms – there are differences in terminology, outlook, career structures, internal politics, norms, and budget sizes – and I’m working hard trying not to carry social science assumptions with me. Though perhaps I’m equally likely to be too hesitant to generalise from social science experience where it would be entirely appropriate to do so.

Rommany Jenkins has “moved from Arts and Humanities to Life Sciences” and thinks that while “the perception might be that it’s the harder direction to go in because of the complexity of the subject matter […] it’s probably easier because the culture is quite straightforward […] although there are differences between translational / clinical and basic, the principles of the PI lab and team are basically the same”. She thinks that perhaps “it’s more of a culture shock moving into Arts and Humanities, because people are all so independently minded and come at things from so many different directions and don’t fit neatly into the funding boxes. […] I know a lot of people just find it totally bizarre that you can ask a Prof in Arts what they need in terms of costings and they genuinely don’t know.”

Charlotte Johnson moved in the opposite direction, from science to arts. “The shortcut was trying to find commonalities in how the different disciplines think and prepare their research. Once you realise that an artist and a chemist would go about planning their research project very similarly, and they only start to diverge in the experimental/interpretation stage, it does actually make it all quite easy to understand.”

Muriel Swijghuisen Reigersberg says that her contribution “tends to be not so much on the science front, but on the social and economic or policy and political implications of the work STEMM colleagues are doing and recommendations around impact and engagement or even interdisciplinary angles to enquiries for larger projects.”

My colleague Liz Humphreys makes a similar (and very reassuring) point about using the same “skills to assess any bid by not focusing on the technical things but focus on all the other usual things that a bid writer can strengthen”. A lay summary that doesn’t make any lay sense is an issue regardless of discipline, as is a summary that reads more like an introduction than a summary. Getting good at reviewing research grants can transcend academic disciplines. “If someone can’t explain to me what they’re doing,” says Claire Edwards, “then it’s unlikely to convince reviewers or a panel.”

Kate Clift makes a similar point: “When I am working in a discipline which is alien to me I tend to try and ground the proposed research in something which I do understand so I can appreciate the bigger picture, context etc. I will ask lots of ‘W’ questions – Why is it important? What do you want to do? Who is going to do it? Less illuminating to me in this situations is HOW they are going to do it”.

Roger Singleton Escofet makes the very sensible point that some subjects are very theoretical “where you will always struggle to understand what is being proposed”. I certainly found this with Economics – I could hope to try to understand what a proposed project did, but how it worked would always be beyond me. Reminds me a bit of this Armstrong and Miller sketch in which they demonstrate how not to do public engagement in theoretical physics.

Ann Onymous-Contributor says that “multidisciplinary projects are the best way to ease yourself into other disciplines and their own specific languages.  My background is in social sciences but because of the projects I have worked on I have experience of, and familiarity with a range of arts and hard science disciplines and the languages they use.  Broad, shallow knowledge accumulated on this basis can be very useful; sometimes specific disciplinary knowledge is less important than understanding connections between different disciplines, or the application of knowledge, which typically also tend to be the things which specialists miss.”  I think this is a really good point – if we allow ourselves to include the other disciplines that we’ve supported as part of interdisciplinary bids, we may find we’ve more experience than we thought.

Finding the Shallow End, Producing your Cheat Sheet

Lorna Wilson suggests “[h]aving a basic understanding” of methodologies in different disciplines, “helps to demonstrate how [research questions] are answered and hypotheses evidenced, and I think breaks through some of the ‘difference’. What makes things slightly more difficult is also accessibility, in terms of language of disciplines, we could almost do with a cheat sheet in terms of terms!”

Richard Smith suggests identifying academics in the field who are effective and willing communicators “who appreciate the benefits and know the means of conveying approaches and fields to non-experts… and do it with enthusiasm”. Harry Moriarty’s experience has been that ECRs and PhD students are often a particularly good source – many are more willing to engage, and perhaps have more to gain from our advice and support.

Muriel Swijghuisen Reigersberg suggests attending public lectures (rather than expert seminars) which will be aimed at the generalist, and notes that expert-novice conversations will benefit the academic expert in terms of practising explanations of complex topics to a generalist audience. I think we can all recognise academics who enjoy talking about their work to non-specialists and with a gift for explanations, and those who don’t, haven’t or both.

Other non-academic colleagues can help too, Richard argues – especially impact and public or business engagement staff working in that area, but also admin staff and School managers. Sanja Vlaisavljevic wanted to “understand how our various departments operate, not just in terms of subject-matter but the internal politics”. This is surely right – I’m sure we’re all aware of historical disagreements or clashes between powerful individuals or whole research groups/Schools that stand in the way of certain kinds of collaboration or joint working. Whether we work to try to erode these obstructions or navigate deftly around them, we need to know that they’re there.

Caroline Moss-Gibbons adds librarians to the list, citing their resource guides and access/role with the university repository. Claire Edwards observes that many research development staff have particular academic backgrounds that might be useful.

Don’t try to fake it till you make it

“Be open that you’re new to the area, but if they’re looking for funding they need to be able to explain their research to a non-specialist” says Jeremy Barraud.

I’ve always found that a full, frank, and even cheerful confession of a lack of knowledge is very effective. I often include a blank slide in presentations to illustrate what I don’t know. My experience is that admitting what I don’t know earns me a better hearing on matters that I do know about (as long as I do both together), but I’m aware that as a straight, white, middle-aged, middle-class male perhaps that’s easier for me to do. I’ve suspected for some time now that being male (and therefore less likely to be mistaken for an “administrator”) means I’m probably playing research development on easy mode. There’s an interesting project around EDI and research development that I’m probably not best placed to do.

While no-one is arguing for outright deception, I’ve heard it argued that frank admissions of ignorance about a particular topic area may make it harder to engage academic colleagues and to find out more. If academic colleagues make certain assumptions about background, perhaps try to live up to those with a bit of background reading. It’s easy to be written off and written out, which then makes it harder to learn later.

I always think half the battle is convincing academic colleagues that we’re on their side and the side of their research (rather than, say, motivated by university income targets or an easier life), and perhaps it’s easy to underestimate the importance of showing an interest and a willingness to learn. Asking intelligent, informed, interested lay questions of an expert – alongside demonstrating our own expertise in grant writing etc – is one way to build relationships. My own experience with my MPhil is that research can be a lonely business, and so an outsider showing interest and enthusiasm – rather than their eyes glazing over and disengaging – can be really heartening.

Kate Clift makes an important point about combining any admissions of relative ignorance with a stress on what she can do/does know/can contribute. “I’m always very upfront with people and say I don’t have an understanding of their research but I do understand how to craft a submission – that way everyone plays to their strengths. I can focus on structure and language and the academic can focus on scientific content.”

Find a niche, get involved, be visible

For Jeremy Barraud, that was being secretary for an ethics committee. In my early days with Economics, it was supporting the production of the newsletter and writing research summaries – even though it wasn’t technically part of my remit, it was a great way to get my name known, get to know people, and have a go at summarising Economics working papers.

Suzannah Laver is a research development manager in a Medical School, but has a background in project management and strategy rather than medicine or science. For her it was “just time” and getting involved: “[a]ttending the PI meetings, away days, seminars, and arranging pitching events or networking events.” Mary Caspillo-Brewer adds project inception meetings and dissemination events to the list, and also suggests attending academic seminars and technical meetings (as does Roger Singleton Escofet), even if they’re aimed at academics. This is great in terms of visibility and in terms of evidence of commitment – sending a message that we’re interested and committed, even if we don’t always entirely understand.

Mark Smith suggests visiting research labs or clinics, however terrifying they may first appear. So far I’ve only met academics in their offices – I’m not sure I trust myself anywhere near a lab. I’m still half-convinced I’ll knock over the wrong rack of test tubes and trigger a zombie epidemic. But lab visits are perhaps something I could do more of in the future when I know people better. And as Mark says, taking an interest is key.

Do your homework

I’ve blogged before about the problems with the uses and abuses of successful applications, but Nat Golden is definitely onto something when he suggests reading successful applications to look at good practice and what the particular requirements of a funder are. Oh, and reading the guidance notes.

Roger Singleton Escofet (and others) have mentioned that the Royal Society and Royal Academy of Engineering produce useful reports that “may be technical but offer good overviews on topical issues across disciplines. Funders such as research councils or Wellcome may also be useful sources since funders tend to follow (or set) the emerging areas.” Hilary Noone also suggests looking to the funders for guidance – trying to “understand the funders real meaning (crucial for new programmes and calls where they themselves are not clear on what they are trying to achieve)”.

There’s a series of short ‘Bluffer’s Guide’ books which are somewhat dated, but potentially very useful. Bluff your way in Philosophy was on my undergraduate reading list. Bluff your way in Economics gave me an excellent grounding when my role changed, and explained (among many other things) the difference between exogenous and endogenous factors. When supporting a Geography application, I learned the difference between pluvial and fluvial flooding. These little things make a difference, and it’s probably the absence of that kind of basic grounding in many of the disciplines that I’m now supporting that’s making me feel uneasy. In a good way.

Harry Moriarty argues that it’s more complicated than just reading Wikipedia – the work he supported “was necessarily at the cutting edge and considerably beyond the level that I could get to in a sensible order – I had to take the work and climb back through the Wikipedia pages in layers, and then, once I had some underpinning knowledge, go back through the same pages in light of my new understanding”.

Specific things to do

“Become an NIHR Public Reviewer”, says Jeremy Barraud. “It’s easy to sign up and they’re keen to get more reviewers. Being on the other side of the funding fence gives a real insight into how decisions are reached (and bolsters your professional reputation when speaking with researchers).”

I absolutely second this – I’ve been reviewing for NIHR for some time and just finished a four-year term as a patient/public representative on an RfPB panel. I’d recommend doing this not just to gain experience of new research areas, but as a valuable public service that you as a research development professional can perform. If you’ve got experience of a health condition, using NHS services (as a patient or carer), and you’re not a healthcare professional or researcher, I’m sure they’d love to hear from you.

Being a research participant, argues Jeremy Barraud, is “professionally insightful and personally fulfilling. The more experience you have on research in all its different angles, the better your professional standing”. This is also something I’ve done – in many ways it’s hard not to get involved in research if you’re hanging around a university. I’m part of a study looking at running and knee problems, and I’ve recently been invited to participate in another study.

Bonhi Bhattacharya registered for a MOOC (Massive Open Online Course) – an “Introduction to Ecology” – Bonhi is a mathematician by training – “and it was immensely helpful in getting a grounding in the subject, as well as a useful primer in terminology.” It can be a bit of a time commitment, but they’re also fascinating – and, as above, it really shows willing. I wrote about my experience with a MOOC on behavioural economics in a post a few years ago. Bonhi also suggests reading academics’ papers – even if only the introduction and conclusion.

Resources

Subscribe to The Conversation, says Claire Edwards, it’s “a great source of academic content aimed at a non-specialist audience”. In a similar vein, Helen Walker recommends the Wellcome-funded website Mosaic which is “great for stories that give the bigger picture ‘around’ science/research – sometimes research journeys, sometimes stories showing the broader context of science-related research.” Both Mosaic and The Conversation have podcast companions. Recent Conversation podcast series have looked at the Indian elections and moon exploration.

I’m a huge fan of podcasts, and there are loads that can help with gaining a basic understanding of new academic areas – in addition to being interesting (and sometimes amusing).

A quick search of the BBC has identified several science podcasts I should think about listening to – The Science Hour, Discovery, and BBC Inside Science. I’m very open to other suggestions – please tweet me or let me know in the comments/via email.

A huge thank you to all contributors:

I’m very grateful to everyone for their comments. I’ve not been able to include everything everyone said, in the interests of avoiding repetition and keeping this post to a manageable length.

I don’t think there’s any great secret to success in supporting a new discipline or working in research development in a new institution – it’s really a case of remembering and repeating the steps that worked last time. And hopefully this blog post will serve as a reminder to others, as it is doing to me.

  • Jeremy Barraud is Deputy Director, Research Management and Administration, at the University of the Arts, London.
  • Bonhi Bhattacharya is Research Development Manager at the University of Reading
  • Mary Caspillo-Brewer is Research Coordinator at the Institute for Global Health, University College London
  • Kate Clift is Research Development Manager at Loughborough University
  • Ann Onymous-Contributor is something or other at the University of Redacted
  • Claire Edwards is Research Bid Development Manager at the University of Surrey.
  • Adam Forristal Golberg is Research Development Manager (Charities), at the University of Nottingham
  • Nathanial Golden is Research Development Manager (ADHSS) at Nottingham Trent University
  • Chris Hewson is Social Science Research Impact Manager at the University of York
  • Liz Humphreys is Research Development Manager for Life Sciences, University of Nottingham
  • Rommany Jenkins is Research Development Manager for Medical and Dental Sciences, University of Birmingham.
  • Charlotte Johnson is Senior Research Development Manager, University of Reading
  • Suzannah Laver is Research Development Manager at the University of Exeter Medical School
  • Harry Moriarty is Research Accelerator Project Manager at the University of Nottingham.
  • Caroline Moss-Gibbons is Parasol Librarian at the University of Gibraltar.
  • Hilary Noone is Project Officer (REF Environment and NUCoREs), at the University of Newcastle
  • Roger Singleton Escofet is Research Strategy and Development Manager for the Faculty of Science,  University of Warwick.
  • Mark Smith is Programme Manager – The Bloomsbury SET, at the Royal Veterinary College
  • Richard Smith is Research and Innovation Funding Manager, Faculty of Arts, Humanities and Social sciences, Anglia Ruskin University.
  • Muriel Swijghuisen Reigersberg is Researcher Development Manager (Strategy) at the University of Sydney.
  • Sanja Vlaisavljevic is Enterprise Officer at Goldsmiths, University of London
  • Helen Walker is Research and Innovation Officer at the University of Portsmouth
  • Lorna Wilson is Head of Research Development, Durham University

Setting Grant Getting Targets in the Social Sciences

I’m writing this in the final week of my current role as Research Development Manager (Social Sciences) at the University of Nottingham before I move to my role as Research Development Manager (Research Charities) at the University of Nottingham. This may or may not change the focus of this blog, but I won’t abandon the social sciences entirely – not least because I’m stuck with the web address.

Image by Tookapic

I’ve been thinking about strategies and approaches to research funding, and the place and prioritisation of applying for research grants in academic structures. It’s good for institutions to be ambitious in their grant-getting activities. However, these ambitions need to have at least a nodding acquaintance with:
(a) the actual amount of research funding historically available to any given discipline; and
(b) the chances of any given unit, school, or individual competing successfully for that funding, given the strength of the competition.

To use a football analogy, if I want my team to get promotion, I should moderate my expectations in the light of how many promotion places are available, and how strong the likely competition for those limited spots will be. In both cases, we want to set targets that are challenging, stretching, and ambitious, but which are also realistic and informed by the evidence.

How do we do that? Well, in a social science context, a good place to start is the ESRC success rates, and other disciplines could do worse than take a similar approach with their most relevant funding council. The ESRC produce quite a lot of data and analysis on funding and success rates, and Alex Hulkes of the ESRC Insights team writes semi-regular blog posts. Given the effort put into creating and curating this information, it seems only right that we use it to inform our strategies. This level of transparency is a huge (and very welcome) change from previous practice, when very limited information was rather hidden away. Obvious caveats: the ESRC is by no means the only funder in town for the social sciences, but they’ve got the deepest pockets and offer the best financial terms. Another (and probably better) way would be to compare HESA research income stats, but let’s stick to the ESRC for now.

The table below shows the running three-year total (2015/16 – 2017/18) and the number of applications for each discipline for all calls, and the total for the period 2011/12 to 2017/18. You can access the data for yourself on the ESRC web page. This data is linked as ‘Application and success rate data (2011-12 to 2017-18)’ and was published in ODS format in May 2018. For ease of reading I’ve hidden the results from individual years.

Lots of caveats here. Unsuccessful outline proposals aren’t included (as no outline application leads directly to funding), but ‘office rejects’ (often for eligibility reasons) are. The ‘core discipline’ of each application is taken into account – secondary disciplines are not. The latest figures here are from 2017-2018 (financial year), so there’s a bit of a lag – in particular, the influence of the Global Challenges Research Fund (GCRF) or Industrial Strategy Challenge Fund (ISCF) will not be fully reflected in these figures. I think the ‘all data’ figures may include now-defunct schemes such as the ESRC Seminar Series, though I think Small Grants had largely gone by the start of the period covered by these figures.

Perhaps most importantly, because these are the results for all schemes, they include targeted calls, which will rarely be open to all disciplines equally. Fortunately, the ESRC also publishes similar figures for their open-call (Standard) Research Grants scheme for the same time period. Note that (as far as I can tell) the data above includes the data below, just as the ‘all data’ column (which goes back to 2011/12) also includes the three-year total.

This table is important because the Research Grants scheme is bottom-up, open-call, and open to any application that’s at least 50% social science. Any social science researcher could apply to this scheme, whereas directed calls will inevitably appeal only to a subset. These are the chances/success rates for those whose work does not fit squarely into a directed scheme, and could arguably be regarded as a more accurate measure of disciplinary success rates. It’s worth noting that a specific call that’s very friendly to a particular discipline is likely to boost the number of successes but may decrease the disciplinary success rate if it attracts a lot of bids. It’s also possible that major targeted calls that are friendly to a particular discipline may result in fewer bids to the open call.

To be fair, there are a few other regular ESRC schemes that are similarly open and should arguably be included if we wanted to look at the balance of disciplines and what a discipline target might look like. The New Investigator Scheme is open in terms of academic discipline, if not in time-since-PhD, and the Open Research Area call is open in terms of discipline if not in terms of collaborators. The Secondary Data Analysis Initiative is similarly open in terms of discipline, if not in terms of methods. Either way, we don’t have (or I can’t find) data which combines those schemes into a non-directed total.

Nevertheless, caveats and qualifications aside, I think these two tables give us a good sense of the size of the prize available for each discipline. There are approximately 29 funded projects per year (of which 5 are open call) for Economics, and 11 per year (of which 2 are open call) for Business and Management. Armed with that information and a knowledge of the relative strength of the discipline/school in our own institution, we ought to get a sense of what a realistic target might look like and of how well we’re already doing. Given what we know about our expertise, eminence, and environment, and the figures for funded projects, what ought our share of those projects to be?
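As a minimal back-of-the-envelope sketch of that sum – taking the roughly 29 Economics awards per year from the figures above, and plugging in an entirely invented 5% institutional share as the assumption you would have to supply for your own school:

```python
# Back-of-the-envelope target setting: national funded projects per year multiplied
# by an assumed institutional share. The ~29 awards/year figure is the approximate
# Economics total discussed above; the 5% share is an invented illustration.

def realistic_target(funded_per_year_nationally: float, assumed_share: float) -> float:
    """Expected awards per year if we win our assumed share of funded projects."""
    return funded_per_year_nationally * assumed_share

print(realistic_target(29, 0.05))   # -> 1.45, i.e. roughly one or two awards a year
```

Crude as it is, even this kind of sum helps show whether a proposed grant-getting target is stretching but plausible, or simply fantasy.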

We could ask a further question about how those successes are distributed between universities, and about any correlation between successes and (unofficial) subject league tables from the last REF, calculated on the basis of grade point average or research power. However, even if that data were available, we’d be looking at small numbers. We do know that the ESRC have done a lot of work on funding distribution and concentration, and their key findings are that:

ESRC peer review processes do not concentrate funding to a degree greater than that apparent in the proposals that request the funding.

ROs which apply infrequently appear to have lower success rates than do those which are more active applicants

In other words, most universities have broadly comparable success rates, except that those that apply more often do a little better than average and those that apply rarely do a little worse. This sounds intuitively right – those who apply more are likely more research-active, at least in the social sciences, and therefore more likely to generate stronger applications. But this is at an overall level, not discipline level.

I’d also note that we shouldn’t only measure success by the number of projects we lead. As grants get larger on average, there’s more research income available for co-investigators on bids led elsewhere. I think a strategy that focuses only on leading bids and being the lead institution neglects the opportunities offered by being involved in strong bids led by world-class researchers based elsewhere. I’m sure it’s not unusual for co-I research income to exceed PI income for academic units.

I’ve not made any comment about the different success rates for different disciplines. I’ve written about this already for many of the years covered by the full data (though Alex Hulkes has done this far more effectively over the last few years, having the benefit of actual data skills) and I don’t really want to cover old ground again. The same disparities continue much as before. Perhaps GCRF will provide a much-needed boost for Education research (or at least the international aspects) and ISCF for management and business research.

Maybe.