So… this is a quick post because I’m pushed for time, but if you’ve not heard about #ResearchFishGate, then here’s a quick primer from Research Professional’s Sophie Inge.
Short version… academics have been complaining on social media about having to make their annual returns on their funded projects. In the best of all possible worlds, with the best possible system for collecting such information, academics would still complain about having to do it. Academics always complain about admin. However, I don’t think that accounting for how you’ve used public (or charity) money is itself unreasonable.
However, I think the bulk of the complaints in this case have been less about having to do it at all and more about the software/platform that’s used to do it, and whether this information is ever actually used. I’ve not been involved in supporting ResearchFish returns for some time now, but my impression is that the platform has improved. But clearly not as fast as some people would like.
ResearchFish have – for some time – been, er, trawling twitter for mentions and have been responding to any criticism with a fairly standard form of words.
We understand that you’re not keen on reporting on your funding through Researchfish but this seems quite harsh and inappropriate. We have shared our concerns with your funder.
There have since been a number of apologies and attempted apologies, UKRI and other funders have weighed in, and it’s been a bit of a mess. At the time of writing it remains unclear whether concerns were shared with the funder in question, though there are stories on twitter of academics being ordered in front of HR/Heads of School to explain themselves. So something has been going on.
Responding to legitimate criticism with threats to report critics to funders has gone down very poorly indeed. Researchers have questioned the GDPR implications… how does ResearchFish know which funder to “share their concerns” with? Is it misusing data?
Anyway, never mind all that
I’m less interested in the specifics of #ResearchFishGate and more interested in the broader issues raised about social media use. I’m sure I’m not the only one who saw the tweet and had a moment of alarm… who or what have I criticised? Have I gone too far? Is anyone going to share their concerns with my funder?
I have permission, approval and (occasionally) encouragement for my social media activities. (It helps when your meta-meta-meta-boss is Registrarism). With the proviso that I don’t “start slagging off funders on twitter”. And I have been a good boy.
However… you don’t get very far on Twitter if you’re in corporate drone mode. I wrote something about social media personas in 2014 (2014!), in which I argued for “smart casual” as a sensible Twitter approach. Adam-at-work, if you will. By showing my human side, I build relationships. If I didn’t, I wouldn’t. And if I didn’t build networks and relationships, then what’s the point?
One key point from #ResearchFishGate is that few, if any, of the critics actually @-ed ResearchFish into the discussion. They were talking about ResearchFish, not to ResearchFish. This is a really important point. However….
Of course ResearchFish has a Twitter search column for mentions of their name. I have one for links to my blog so I can find out if anyone’s tweeting about it (spoiler: they’re usually not). I have one for an LSE Impact Blog article I wrote in which I definitely don’t slag off funders, and occasionally I’ll set them up for Research Professional articles I’ve written. And I’m just a vain blogger who craves validation, not a corporate behemoth.
So anyone who tweets about ‘ResearchFish’ or any other funder or ecosystem platform or player, even without @-ing them in, is being naive if they think they won’t see it. Perhaps even if you disguise the name to evade searches… they might have that search set up too. Replacing all the vowels with “*”, in the style some newspapers use for swear words, isn’t that original.
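To illustrate why vowel-masking doesn’t evade a determined search column, here’s a toy sketch (not any real monitoring tool, and the pattern and tweets are invented for illustration): a single case-insensitive regex catches the plain name and the asterisked disguise alike.

```python
import re

# Toy illustration: one regex that matches "ResearchFish" whether written
# normally or with its vowels masked by "*" (newspaper-swear-word style).
PATTERN = re.compile(r"r[ea*]s[ea*][ea*]rch\s*f[i*]sh", re.IGNORECASE)

tweets = [
    "Filling in my ResearchFish return again, joy.",     # plain mention
    "R*s**rchF*sh strikes again...",                     # vowels masked
    "Completely unrelated tweet about fishing research.",  # no mention
]

# Only the first two tweets match, disguise or not.
matches = [t for t in tweets if PATTERN.search(t)]
print(matches)
```

The character classes like `[ea*]` simply treat a masked vowel and the real one as equivalent, which is all it takes.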
The traditional social media advice was always that it’s public and permanent… don’t tweet anything you wouldn’t want everyone else to see…. who knows what will go viral (possibly wildly out of context)? Of the comments I’ve seen, some do include industrial language, but if there have been any that are abusive of individuals or even @-ing ResearchFish in, I’ve not seen them.
I’m sure it’s not nice to read that people don’t like your product… especially if it’s something you’ve worked very hard on trying to improve… none of us like criticism when we’re doing our best. We especially don’t like it if we can’t use it to improve in any way… but the answer is to grow a thicker skin and ignore it. It would be entirely sensible to use twitter for sentiment analysis, and to look for feedback – especially if there are concerns or issues that can be addressed instantly with user guide advice, or which can be fed back to the Devs. That’s okay.
It’s not okay to trawl twitter for mentions and then issue threats. It might be okay… just about… to make a polite enquiry in response to criticism and ask how the product could be improved. But it’s still barging – uninvited – into someone else’s conversation, even if it’s conversation that’s in the public square.
And I think that’s something that has changed during the pandemic. The tweets that drew the ire of the Fish of Research are the kind of thing that would – in the before-times – probably have been said around the metaphorical water cooler. Only we’re not there any more so often… we’re working from home, or our colleagues are. We have our Teams chats, but that’s generally work stuff, or work-flavoured. But Twitter’s right there, it’s a different and broader social circle. We’re all feeling more alone, more atomised… so those of us on Twitter are perhaps leaning to it more for conversation, companionship, interaction, and validation than before.
I’ve complained about an issue that… in hindsight… I probably shouldn’t have done, as it’s an internal University of Nottingham issue. But I learned that it’s a problem elsewhere too, that people agreed it was a problem and I heard some extra-egregious examples of the kind of thing I complained about. So I don’t regret doing it. I have raised it with my colleagues, but I think they’ve had enough of me moaning about it. Also… what can they really say? We’re in agreement about it.
Are there any? I guess so. A few lessons.
(1) Big Brother is watching you. Criticise any product and organisation on Twitter – even without @-ing them in – and you should assume that they’ll see it. None of the old advice has changed about social media use and who might see it. Indications are that employers are getting more stringent/intrusive about this.
(2) The default assumption for any organisation (or public figure) being criticised is that they’re talking about you, not to you. Without an @, it’s a private conversation, and you should think very carefully before intruding. And then you probably shouldn’t unless you think your intervention might be welcome.
(3) In spite of (1), I do think that the pandemic/wider social media use means that there should be greater allowances for social media use. A conversation can both be in the public square and be a private conversation, with at least some allowances for language and tone. Perhaps X wouldn’t have criticised ResearchFish in precisely those terms and with precisely that language if X knew they were eavesdropping, but the overall sentiment would be the same. That’s not to say that there aren’t still lines that shouldn’t be crossed… just that perhaps the tolerance band should be broader than before.
Pension planning seems like a technocratic problem that ought to have a technocratic solution. Or, more properly, a range of technocratic solutions to choose from, depending on our priorities and preferences. Which, again, we can talk about.
Is there a genuine problem with the pension scheme that’s not been resolved by all of the many, many previous pension cuts we’ve had since I signed up to USS about twenty years ago? Each time we were promised that this cut would resolve the genuine problems with our pension scheme. Each time it hasn’t. If there is still a problem, this ought to be understandable and communicable. And something that can be negotiated about, around, and through.
But UUK/university management has made this impossible through failures of transparency, dubious consultations, and a level of spin that borders on the Trumpian. It’s all massively counterproductive – we’re not stupid, so stop treating us as if we are. Those minded to at least entertain the thought that there’s an issue with our pension scheme don’t trust UUK, because they have acted – and continue to act – in bad faith.
My suspicion is that they’ll be back again, and again, and again, and again for as long as they can get away with it. Same arguments each time. Back in 2018 we needed draconian cuts, apparently, and then after sustained industrial action, we didn’t any more. It’s almost as if… etc and so on. Universities may not be for-profit, but university management wants surpluses for reinvestment in their pet projects (at least some of which are genuinely good ideas) because they tend to want to make their mark. So it is in their interests to drive costs down as low as possible and keep them there.
Colleagues not paying into their pensions because contributions are too high is a problem with our pension. But framing the issue as if this were the sole consideration, while not mentioning the, you know, massive cuts, is disingenuous in the extreme.
Pensions are not a perk, but deferred salary. Organisations whose continued existence is very certain (broadly, public services) are in a position to provide better pensions. As a trade-off, salaries are lower. We knew this when we chose our careers and expect the deal to be honoured. Why should we have better pensions than some other sectors? Because that was always part of the deal.
I hate being on strike. I hate arranging to have some of my work covered by colleagues who are themselves busy, especially when we’re several posts down. I hate the divisions it causes. I hate the stress it imposes and the difficult decisions about who to let down and how. I hate coming back to work to find that I’ve got a huge backlog that only I can clear. I don’t like not getting paid, and essentially having to work for free to catch up.
Some people like being on strike and the conflict and the associated rituals and the ‘winter of discontent’ cosplaying just that little bit too much.
Media coverage of all industrial action is always disgracefully one-sided. Management wants ‘reform’, which is presented positively… management talking points always lead and are never challenged by the reporter. Workers are striking over ‘pay and conditions’… presented as selfish or short-sighted. And they are always challenged. The framing is always that of management. Always. Strikers will be vilified – ‘won’t somebody think of the [whoever is inconvenienced]’ – with no sense of awareness. The work that the strikers do isn’t important until they stop doing it, apparently. Often they’ll quote someone affected saying how annoyed they are. Again, this will be framed as the fault of the strikers rather than the failure of employers to manage their industrial relations in a competent manner. That question is never even asked, never mind answered. It’s a dance as old as time. Or at least as old as capitalism.
The Four Fights
But it’s not just about pensions. It’s about the ‘Four Fights’ too.
address the scandal of the gender, ethnic, and disability pay gap
end contract casualisation and rising job insecurity
tackle the rising workloads driving our members to breaking point
an increase of £2,500 to all spine points on the national pay scale [to make up for a 17.6% real-terms pay cut between 2009 and 2019]
All laudable goals. Especially the pay gaps… it really ought not to be beyond the wit of folks of good will in university management and UCU to come up with an action plan to start to address this. Granted, we cannot solve the problems of discrimination and inequality in wider society, but we can do our bit, and it ought not to be that expensive. I don’t really understand why this is so hard to agree on.
The others, though. Pragmatically… how are they to be achieved? And can they be achieved without costing more? And if not, can we afford all of them, and how do we prioritise?
Let’s get a few red herrings out of the way first.
First, you might very reasonably be very cross about the above-average-inflation pay awards to some vice chancellors and some senior university staff. You might be one of those people who – consistently – thinks this is an issue across the whole economy. Or you may be one of those people who – inconsistently – thinks nothing of the worst excesses of the private sector’s snouts-in-troughery, but objects to anyone in public service being ‘paid more than the Prime Minister’. But… even if we cut executive pay by… let’s say 1/3… this will give us nowhere near enough money by itself to address any of our issues.
Second, you might form the view that there are too many ‘faceless’ managers and administrators. If so, I would invite you to (a) read this piece and reconsider; (b) reflect on the fact that ‘faceless’ just means you don’t know them or understand what they do; and (c)… we’re right here, folks. Striking alongside you. Those of us who can afford to.
Low(er) cost solutions
Let’s consider what could be done quickly and relatively cheaply. How far can humane management/HR practices go in addressing casualisation and job insecurity? A fair bit, I’d imagine. We could be much better at giving researchers permanent or open-ended contracts. Even if the reality is that redundancy processes can and will still be used to terminate contracts where there’s a lack of funding. We can treat our fixed term staff better, and we can take our duty to support the development of our staff much more seriously. We should be setting them up for their next role, whether that’s with the same institution or elsewhere.
Demand for academic posts exceeds supply. This is a topic for another blogpost, because scarcity of opportunity and funding are wicked problems which drive a lot of what’s wrong with research culture. But we could do better about not ruthlessly exploiting that fact for fun and profit. To avoid a race to the bottom for competitive advantage, we need sector wide norms and standards. And as far as I understand it (and correct me if I’m wrong, which I often am) this is what’s being resisted. I don’t believe that we have the most humane management/HR practices, and that’s why I’m reluctantly supporting industrial action on this point.
Can we tackle rising workloads without spending a lot of money? Again, there are certainly some things we can do. I’ve been coordinating my university’s response to the Review of Research Bureaucracy, which has been an eye-opening experience. It’s only looked at externally imposed bureaucracy, and only research bureaucracy. It may be that the real issue is internally imposed research bureaucracy, teaching bureaucracy (especially), and, well, administrative bureaucracy. I’m sure there’s more that can be done, and some of that may involve employing more administrative and managerial support. My vision is of a university where academics do academia, and administrators and managers do administration and management. And we do leadership together.
We might expect universities to take a long, hard look at what’s expected of the average academic and review what expectations are reasonable. Too many institutions have grant income targets that are scarcely on a nodding acquaintance with reality. They appear not to understand that limited funding means that for some to succeed, others must fail. I know it’s fashionable to blame the REF for everything. But actually the last REF rules that moved away from demanding four publications per researcher opened the door to greater flexibility of roles and expectations.
But for all the talk of ‘be kind’ and yoga and wellness and mindfulness and whatever else, there’s still far too much unkind management and unrealistic expectations. Personally, I’m currently lucky to benefit from supportive, enlightened and empowering management (hello, if you’re reading), but I’ve also experienced the opposite and there’s far too much of it about. Whether sector-wide strike action is the way to address rising workloads I’m not sure. What could we do at a national, sector-wide level? What would that look like? I’m convinced of the importance of the issue, but less so of the case for national strike action as the mechanism to resolve it. But I’m open to persuasion.
Fantasy Head of School
But another possible response to rising workloads is… well… sorry… it’s casualisation and job insecurity.
Let’s play Fantasy Head of School. Or University Management Simulator. Hypothetically, anyway. Pressures on your permanent staff too great? Use your limited resources to buy in as much extra teaching capacity as possible… which means sessional teachers and short term teaching fellowships or teaching-focused roles. Or… we treat those staff better, give them more professional development time, more scholarship time, and we get less teaching capacity for our buck. And increase workloads.
That’s not the only issue – creating better bottom-rung-of-the-academic-ladder jobs – more hours, longer contracts – almost certainly means fewer such opportunities. Is that a good thing? I think, on balance, probably… but it’s not straightforward. My one remaining reader will no doubt be sounding the ‘false dichotomy’ klaxon at this point. Correctly so. We can, of course, find a compromise or a balance of sorts, but let’s not pretend it’s straightforward. We can’t have everything.
Do we employ more staff (to reduce workloads) on more secure contracts (to reduce insecurity)? Or do we address the real terms 17.6% pay cut by increasing all spine points by £2.5k? And – dare I say it – paying higher employers’ pension contributions, if that is indeed actually needed. What gets priority? You’ll forgive me if I would rather see any £££ for pay rises focused at the lower end of the pay scale rather than giving Profs another two and half grand. Whisper it quietly, but I’d rather spend it on the lowest paid university employees who tend to be represented by UNITE or UNISON rather than UCU. And a focus on the lowest paid/lower spine grades and spine points might also be a good way to start addressing pay gaps.
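The distributional point can be made with a line of arithmetic: a flat £2,500 uplift is worth proportionally more the lower the salary. A quick sketch (the salary figures below are illustrative round numbers, not actual spine values):

```python
# Illustrative only: hypothetical salaries, not real pay-spine figures.
flat_rise = 2500

salaries = {
    "lower spine point": 20000,
    "mid-career lecturer": 45000,
    "professor": 80000,
}

# A flat uplift is a much larger percentage rise at the bottom of the
# scale than at the top.
for role, salary in salaries.items():
    pct = 100 * flat_rise / salary
    print(f"{role}: £{flat_rise} uplift = {pct:.1f}% rise on £{salary}")
```

On these made-up figures the same £2,500 is a 12.5% rise at the bottom of the scale but only about 3% for the professor, which is the nub of the prioritisation argument.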
Pay costs and non pay costs
As Fantasy University Manager, could we hack away at non-pay costs? Conference funding? Seedcorn funding for new research ideas? Research facilities and infrastructure? The university’s core infrastructure and systems which – when working well – create efficiency savings and minimise friction? Estates and buildings? Student spaces? Lecture theatres? Seminar rooms? One of my constant frustrations as a Research Development Manager is working with brilliant colleagues with outstanding ideas who we can’t support with kit/seedcorn ££/infrastructure as they deserve.
I’ve read in a number of places that the percentage of average university income spent on staff costs has been in decline for some time. The best source I can find for this is this UCU article from 2017. I’m wary about trying to dive into HESA stats as I’m not competent to play with financial data without water wings and a lifeguard. If anyone has any better sources/more up to date info, please let me know via twitter, email, or in the comments. This decline may or may not coincide with a long run of real-terms pay cuts, and that may be related. Or not. I’m also not sure what the percentage of staff costs for an organisation ought to look like… my instinct is that under 54.6% seems very, very low. But I’m not sure why I think that… some half-remembered presentation? Or a Business School research grant application? But if it is low, I don’t know why that might be, or what it might mean.
I’m not sure what I think about grand estates/infrastructure projects. Obviously some have gone very well, others very badly. Can we reduce investment on estates and infrastructure to spend more on staffing? There’s a balance to be struck. One option is that we say the balance has swung too far, and we cut back and spend more on staff. Another option is that we end everything but essential maintenance to spend more on staff, but that’s not sustainable in the long run. Unless we want dilapidated lecture theatres and ageing research kit, because if that happened we’d be the first to complain about a lack of investment.
Let’s assume for the sake of argument that the pendulum has swung too far, and that there is extra money at all or most institutions if they were to cut back or delay or cancel some estates and infrastructure projects. Even on top of whatever COVID-related cuts have been made. If there is that money available, how do we spend it? Because I’m not convinced that there’s enough of a saving there to cover everything that UCU is asking for.
There isn’t a magic money tree. Pragmatically speaking. The resource envelope is what it is. Unless anyone is willing to spend, spend, spend and dare the government to shut them down or bail them out. Perhaps £££ will be increased under a future government of a more progressive frame of mind willing to invest more in public and quasi-public services. But that won’t happen in the short, or perhaps even medium term. And when it does, I suspect that universities will be some way down the priority list. As Fantasy Head of School, you need to make decisions now.
I’m aware, of course, that UCU’s demands are a wish list, a negotiating position. It’s also a way of achieving a broad consensus among colleagues whose interests are not precisely aligned. If we look at the Four Fights and the Pensions situation purely selfishly, we’d not all have the same list of priorities.
But ultimately, we have a long list of demands. Some of which can be met or addressed without prohibitively expensive measures… but for others, if there is money available, we’ll need to prioritise. And that prioritisation is likely to be controversial and uncomfortable. And we can either engage with prioritisation, or we can leave it to university management.
Most universities have internal peer review processes for grant applications. In part one I discussed the different purposes of internal peer review and how they can cause confusion. I also wrote about how to ensure that we present internal peer review as helpful and supportive rather than a hurdle to be overcome. In this second and final part, I’m going to look at how we might do internal peer review of grant applications better.
Who do we ask to review?
The ideal reviewer is a senior academic with a track record of success with major research funding applications and some insight into the subject area. Even at research-intensive institutions, there is a limited supply and their time is valuable. Especially for reviewers in development-related topics because of the volume of Global Challenges Research Fund (GCRF) bids. [Alas, this example from 2019 has dated poorly.] Our instinct is to ask senior Profs, but I wonder if a closer review by someone less senior could be more useful. We should think beyond the usual suspects, as reviewing can be a developmental exercise. My experience has been that researchers who are rarely asked to review often throw themselves into the task with a lot more enthusiasm. They’re often delighted to be asked, and keen to do a good job.
Do we make internal peer review anonymous?
This is tricky. In my view – ideally – no. Being able to put feedback in the context of the reviewer’s background can be very valuable. I also think that people should be willing to stand behind their comments.
However, because internal peer review can have a filtering role, perhaps the protection of anonymity is required for reviewers to be willing to say that proposals shouldn’t go forward. Or perhaps even to be willing to criticise colleagues’ work at all. However, I would expect that the rationale for soft filtering out an application should be one that most applicants would accept and understand. For a hard filter – when only x number of applications can go forward from the institution – there would usually be a committee decision bound by collective responsibility. I’m not aware of any research or internal survey work done on internal peer reviewers and their attitudes to anonymization, and I’d be interested to see if anyone has looked at this.
How do we ask reviewers to review?
It’s not obvious how to review a grant application. Those without much experience may be reluctant to trust their instincts or judgement because “it’s not really my area”. A small number go the other way and go power-crazy at the chance to sit in judgement – judging the proposal from their own personal, partisan perspective and completely writing off entire academic disciplines and sub-disciplines.
One option is to ask reviewers to use the same form that the funder in question gives to referees or panel members. It’s a great idea in principle, but academics typically have a loathe-hate relationship with forms. But there are some specific questions we could ask reviewers in a structured way, or use as prompts. Fewer questions will get better answers.
What isn’t clear? What’s confusing or ambiguous? What are the potential weaknesses? What’s missing? How could the application be improved?
If I were to ask a single question, it would be the pre-mortem.
If I could see into the future and tell you now that this application is not going to be funded, what would be the main reason?
This question helps home in on key weaknesses – it might be fit-to-call, it might be unclear methodology; it might be weak impact pathways; it might be the composition of the research team. It’s a good question for applicants to ask themselves.
How do we feed back?
It’s not enough for feedback to be correct, it must be presented in a way that maximises the chances that the PI will listen.
Ideally, I’d like a face-to-face meeting involving the internal reviewers, the Research Development Manager, the PI and possibly the co-investigators. The meeting would be a discussion of a full draft in which reviewers can offer their views and advice and the PI can respond, and ask questions about their impressions of the proposal. I like face-to-face meetings because of the feedback multiplier effect – one reviewer makes an observation, which triggers another in the second reviewer. A PI response to a particular point triggers a further observation or suggestion. If approached in the right spirit (and if well-chaired) this should be a constructive and supportive meeting aimed at maximising the applicant’s chances of success. It must not be a Dragons’ Den style ordeal.
In reality, with packed diaries and short notice calls, it’s going to be difficult to arrange such meetings. So we often have to default to email, which needs a lot of care, as nuance of tone and meaning can be lost. I would advise that feedback is sent through an intermediary – another task for your friendly neighbourhood research development manager – who can think about how to pass it on. Whether to forward it verbatim, add context or comments, or smooth off some abrasive edges. I’ve had a reviewer email me to say that she’s really busy and could I repackage her comments for forwarding? Happy to.
A good approach is to depersonalise the applicant – address the feedback to the draft application, not its authors. (“The current draft could be clearer on….” versus “You could be clearer on…”). But I think depersonalising the reviewers and their comments is a mistake – impersonal, formal language can come over as officious, high handed, and passive aggressive. It will make applicants less likely to engage, even if the advice is solid. Using (even rhetorical) questions rather than blunt statements invites engagement and reflection, rather than passing final judgement.
Which would you respond to best?
The panel’s view is that your summary section is poor and is an introduction to the topic, not a proper summary of your whole project. You should rewrite before submitting.
Could the summary be strengthened? We thought the draft version read more like an introduction to the topic, and we think reviewers are looking for a summary of the complete proposal in a nutshell. Is there time to revisit this section so it better summarises the project as a whole?
Institutions invest time and money in having arrangements that provide prospective PIs with detailed feedback from senior academic colleagues to improve their chances of success. But it’s all for nothing if the resulting advice is ineffective because of the way the feedback is communicated, or the way the whole process is presented or perceived by researchers.
Most universities have internal peer review processes for research grant applications. In the first of two articles about internal peer review, I wonder whether what ought to be valuable support can be perceived as an obstacle. Part two looks at how we might run peer review more effectively.
Why do we have internal peer review?
Internal peer review of research grant applications has two distinct functions which can easily become blurred. I think this can cause misunderstandings.
The first function is as filter – to select which applications go forward and which do not. This has two variants. A ‘hard filter’ for a scheme or funder with formal limits on the number of applications that one institution can submit. Or a ‘soft filter’ where there are no formal limits on application numbers, but there’s a steer from the funder to submit only the most competitive applications. Another motivation for a soft filter is to save academic time by slowing, stopping, or redirecting uncompetitive applications.
The second function is to improve the quality of the application. The goal is to produce some actionable suggestions for improvements to increase the chance of success. In a previous article I explained how research development staff can bring a fresh perspective. Comments from a senior academic of comparable standing to the expert reviewers or funding panel members can be similarly helpful, but with the added benefit of academic expertise.
Both functions of peer review – of filtering and improving – are often rolled together into one process. Perhaps this causes confusion both for reviewers and the reviewed. I wonder if we over-emphasise the role of the filter at the expense of the improvement? Does fear of the filter reduce the efficacy of the suggestions for improvements?
Perceptions of internal peer review
When discussing internal peer review with academic colleagues, I’ve seen wildly different reactions. Some are very enthusiastic and are hungry for comments and feedback. Others are a bit more… Gollum and don’t want anyone to gaze upon their precious. Most are somewhere in the middle… welcoming of genuinely useful comments and insights, but wary about being forced to make changes against their better judgement or being prevented from applying at all.
There’s no denying that the ‘filter’ role exists, and it would be a mistake to do so. I reassure academics that in my experience it’s very rare for a bid to be soft-filtered out because of an internal reviewer’s comments and for the applicant to disagree with the rationale. Usually the reviewer has spotted something that the applicant missed, related to the call, the application, or the underpinning idea. Perhaps it needs another co-I, or needs stronger stakeholder engagement, needs to engage with a particular body of literature, or just needs a lot more time to develop. Or the issue is the fit to funder or call.
Research development staff send out details of calls with internal timetables and internal deadlines for the various review stages. But are potential applicants seeing peer review (and associated deadlines) as a developmental process put in place to support them and to help them succeed? Or do they see peer review as a barrier to be overcome (or even evaded), placed in their path by over-officious Heads of Research and research managers that seek to micromanage, police and restrict?
I sometimes worry that in our desire to set out processes to try to prevent and pre-empt disruptive last-minute applications and set out an orderly and timely process, we end up sending the wrong message about peer review and about the broader support available. If we’re dictating terms and timetables for peer review, do we make it look as if grant applicants must fit around reviewer (and research support) requirements and timescales? And is that the right way around?
To be clear, I’m certainly not arguing against having a structured process with indicative milestones with some level of enforcement. Unplanned last minute applications are disruptive and stressful, forcing people to drop everything to provide support with no notice. Worst of all, the applications that result usually aren’t very good and rushed applications are seldom competitive. We absolutely should try to save people from this kind of folly.
And… of course, we need to allow time for senior (and therefore busy) academics to undertake internal peer review. I suspect that most institutions rely on a relatively small pool of reviewers who are asked to read and comment on multiple applications per year, and that few get any formal workload allocation. While we should certainly give applicants plenty of time to write their applications, we need to treat our reviewers with consideration and value their time.
Positive about internal peer review
I’m not arguing that we disguise or minimise the ‘filter’ element of internal peer review in favour of an unqualified upbeat presentation of internal peer review being entirely about improving the quality of the application. But perhaps we could look at ways to present internal peer review in a more positive, supportive, developmental – and less officious – light.
The most important part of peer review positivity – and the subject of the second part of this series – is in how internal peer review happens in practice: who reviews, how and when; and how and in what spirit reviewer comments are communicated to applicants. If internal peer review as a process helps strengthen applications, word will get round and support and buy-in will grow – one positive experience at a time.
But even before that stage, I think it’s worth thinking about how we communicate our internal peer review processes and timetables. Could we be more positive in our framing and communication? Could we present internal peer review more as a helping hand to climb higher, and less as a hurdle to overcome?
So… yeah… this post is a bit more personal and a lot more off topic than usual. And yes, it is mostly a build up to a request for sponsorship. Sit tight, though… I’ll have a load more post-embargo content that originally appeared in Research Professional to post over the coming weeks and months.
“So, how are you doing, Adam?” “Good news and bad news, really.” “Which do you want to tell me about first?” “Both at the same time… I don’t have cancer… any more”
In the middle of a house move, I found a lump where there probably shouldn’t have been a lump. The following day, I see a GP who agrees… that’s a lump where there shouldn’t be a lump. The day after I’m in for a blood test… the following week I’m seeing a specialist. Blood test negative (a good sign), lump is smooth and spherical (good sign), but it’s inside the testicle (bad sign). I know I’m in good hands when my specialist answers my question about what she thinks it probably is… she says she doesn’t know. Because she doesn’t. It takes confidence to admit that. Time for more tests.
The coolest person in the hospital – the Ultrasound Guy – does know. That it’s almost certainly a tumour… lots of indications that it is, nothing to indicate that it isn’t. The ultrasound guy is so ultra-sound that he’s holding clinics on Saturdays to catch up with a backlog. Phone call with the specialist a few days later confirms it… my right testicle and I will have to undergo a conscious uncoupling. Just over two weeks later – to allow for self-isolation and a negative COVID test – I’m in for surgery.
It’s a day procedure… it’s not going to be fun, but I’ve had a more serious operation before. If I can get through that, I can get through this. This too shall pass. Every decade or so, part of my body rebels against me and an example needs to be set pour encourager les autres. So it goes. It is known. At least I’ll not forget where I was when I heard the news about the death of the Duke of Edinburgh. Recovery is… slow and complicated by a post-op infection, eventually antibiotick-ed.
Week and a half later, and I’m back in for a CT scan. This time a backlog-clearing early evening appointment, held in a clinic on the sprawling, construction-scarred and largely deserted Nottingham City Hospital campus. The clinic is behind the archetypal door marked “Beware of the Leopard“, but eventually I find someone to take pity on me and give me directions. I’m late, flustered, and embarrassed, but fortunately my lateness is in perfect synchronization with their overrunningness. Also, they’re used to people being late, flustered, and embarrassed. They’re all very lovely to me, and the scanning isn’t nearly as bad as they’d led me to believe it might be.
Two weeks later, and I’m back for the results. And breathe. It’s good news. CT scan normal, biopsy shows that the tumour was small (22mm) and hadn’t spread. I had another blood test on the day, and that came back normal too. No chemo, as the marginal benefit isn’t worth the risk. There’s a very good chance the cancer won’t return, and if it does, there’s a very good chance it’s treatable. I’ll be under observation for five years or so. To paraphrase, if I absolutely insist on getting cancer, testicular cancer is the one to get. And if I absolutely insist on getting testicular cancer, get the type of testicular cancer I had, and seek medical attention immediately.
This post has been a bit flippant and contained some black humour, which is one way of coping and of making sense of things. Truth is, this was a very worrying time. There were always back-up options – if the cancer had spread, it would very probably have been very treatable. But then again, the lump had turned out to be a tumour rather than a cyst, so the odds had already gone against me once. They could do so again.
Why am I telling you all this?
Point One. Get your weird lumps and bumps checked out. Doing so will make them real, force you to drag that background ignore-able worry into the foreground where it’s harder to ignore. But think of it as consolidating your worries into a single manageable payment. It’s probably not cancer… it’s quite unlikely to be cancer. But if you check it out, you can forget about it.
Perhaps I should have started this blog post with the story of the time where I went to see my GP about a little lump on my back. Textbook cyst, said the GP. She was right. But she said I was right to come and see her about it, and I always should. True, I’ve got (nearly) all the privilege there is going, but I’m told that my experience is pretty common. Check everything out early. It’s better for you and it’s better for the NHS. The best time to get anything checked out is early. The second best time is now.
Point Two. All hail the NHS. Eight weeks from finding a lump to a result and an action plan. Eight weeks. In the middle of a global pandemic, folks. Cost to me: nil, bar some prescription charges. Everyone is utterly lovely to me… Dr G, the GP who got me seen quickly at the beginning and antibiotic-ed me at the end. The Consultant willing to say she didn’t know. The ultrasound guy, who broke difficult news to me when he could have left it to the consultant. The whole surgical team. Everyone on oncology. In a US-style healthcare system, I dread to think about what this would have cost. I dread to think about how the cost of future cover would have restricted my professional and personal options.
Point Three. We’ve made huge progress in cancer research. For my particular flavour of cancer the research for treatments seems to have been largely done. But we’ve all lost people to cancer, many far, far too young. I lost a member of my extended family earlier this year. Shortly after I came round from surgery, I heard that a friend had died. His funeral took place on the morning of the afternoon when I received my results. He was a genuinely superb human being on every level and by every metric and I wish I’d known him better.
So we’re not yet where we need to be with cancer research. Not close. And the pandemic has been a real kick in the… teeth… for medical research charities in general. Their vital fundraising has been very seriously hit. Charity shops? Shut. Mass participation events, like the London Marathon? Cancelled. Not just the London Marathon, but your local city marathon or half marathon or 10k… all those sponsored walks or Races for Life? All gone. They’re struggling to honour existing research commitments, never mind fund vital new research.
If you’re wondering whether this is a build up to me asking for sponsorship for my latest act of folly, then yes, yes it is. Although… if there’s a charity that means more to you, and if you can afford it, support them instead. Or as well as. I don’t mind – the whole charity sector is struggling.
Long before my diagnosis, I’d arranged to take on the full, continuous Peak District Challenge, along with friends from my undergraduate and postgraduate days. It’s a 100km (62 miles) walk through the Peak District. Organisers estimate finish times of 20–36 hours. Sensible people attempt this over two days. We’re not sensible.
I’ve run a marathon before. Seven. But this is a very different kind of challenge. Weirdly, I’d feel happier if I were running rather than walking. All my marathons have been over inside four hours… 100km is an endurance challenge in a different league to anything I’ve tried before. Also, marathons don’t tend to have the Peak District in the way.
COVID has restricted our opportunities to train and prepare, especially on the right kind of terrain. Plus, you know… I’ve not been very well. It’s going to be an uphill struggle, and that includes the parts that are downhill. Go on, chuck us a few quid. Please? The link is to my friend John’s page, as we’re pooling our fundraising efforts. If anything I’ve written/tweeted has ever been any use to you, go on.
Hello and welcome to a reflective piece written by someone in a position of relative privilege in academia during a time of collapse and crisis. Written by someone who knows no more than you do about how best to cope with or understand it, and quite possibly substantially less.
So why am I writing? Out of an attempt to take stock of where we’re at as honestly as I can, without succumbing to the twin temptations of false hope of some brighter new dawn or the consolations of cynicism.
After a little reflection, this is what I want to tell you…
Kindness is everything
I don’t know who you are but listen, you’re doing really well. You probably know the saying by now: “you’re not working from home, you’re working at home in a pandemic”. And it’s true—you’re being tested in all kinds of ways, I’m sure. It’s so easy to focus on what we feel we’re not doing well that we completely take for granted the things we are doing well. This is the basis of imposter syndrome where we think of our own talents and achievements as mundane but regard those of others as vastly superior. See also: the Dunning-Kruger Effect.
We’ve got to be kinder to ourselves, as well as to others. I’ve been carrying a bit of residual guilt around because it feels like very little of the burden of the current crisis has fallen upon my shoulders. I have been able to continue working in physical safety and—in spite of some spicy days and weeks—with a manageable workload that doesn’t pose a serious risk to my mental wellbeing. As I don’t have children, I’ve not had pressures of home schooling.
It’s good to be aware that others have it tougher, to be willing to help, to show kindness and concern and empathy and consideration. To have a sense of proportion. But it’s a mistake to minimise or even discount the things that we’re finding difficult. The things we’re missing.
The fact that other people are in much, much more pain than I am doesn’t mean it hurts any less when I stub my toe. And I won’t make my toe feel any better by berating myself for being in pain and for not wanting to be in pain.
It’s easy to focus on those from whom extraordinary efforts are required during these extraordinary times, to compare ourselves to that extraordinary standard, and judge ourselves harshly. But if you’re anything like me, by this stage you’ve probably normalised a lot of the restrictions that all of us are asked to live under. Not seeing family and friends, severely curtailed leisure activities, having to adapt to remote working and so on. It’s all so [makes screaming sound] and this is the new normal.
But we should not forget that we’re all contributing. If you’re following whatever the guidelines are today, you’re contributing. I have always been ill-suited for a healthcare career due to my squeamishness and clumsiness, so perhaps I should not compare my contribution to theirs. A lot is being asked of each and every one of us even if you feel – as I do – your burden is lighter. From each according to their ability, and so on.
Won’t get fooled again
Kindness—for others, for ourselves—should be the order of the day but what is stopping it becoming the order of the everyday?
There’s a temptation to think that things must be different, will have to be better after this crisis. We should be aware that powerful forces will want to put things back more or less where they were before it ever happened (see the last financial crisis) or in even crueller positions (ibid). I’ve listened to a few podcasts discussing the post-1945 political settlement in the UK and the birth of the welfare state, and it’s clear that none of that happened by accident or overnight. A lot of work went into preparing the ground and preparing the arguments and policy solutions.
If we want to “build back better” (sorry) in academia, we need to think creatively, we need to share ideas, we need to prepare the ground for radical ideas. We need to shift the Overton Window.
For one thing, we can’t do better in academia without confronting our structural inequalities. And I am sorry. Yes, this is another white, middle-aged, middle-class, heterosexual, cisgender man telling everyone what he thinks about equality issues. I understand the scepticism. But in my defence there’s only one thing worse than all that: someone who is all those things and yet doesn’t think about equality issues.
Over the summer I listened to a Hidden Brain podcast on ‘Playing Favourites’ which includes a story about a Yale academic who received markedly better treatment for a hand injury once the doctors discovered that she worked at Yale. And a story about an academic who agreed to an interview she would usually decline just because the journalist had been at the same university at the same time. Both the doctor and the academic could come away from their respective interactions feeling a warm glow as they’d both done something nice for someone else that they didn’t have to.
But as the academic in that second story—Mahzarin Banaji—said: “I think that kind of act of helping towards people with whom we have some shared group identity is really the modern way in which discrimination likely happens.”
Which leaves me to ask: who gets my standard service, and who gets my above-and-beyond, my extra mile? Who gets one last extra read of their proposal? Who gets a meeting rather than an email? Who gets a longer meeting? Whose request gets the quickest response? This year my challenge to myself is (a) to keep an eye on whom I find I want to do favours for; and (b) to look to do more favours for members of disadvantaged/under-represented groups who may not have had their share of favours in the past. I invite you to join me. My preliminary conclusion is that I tend to privilege the pushy because I’m a people pleaser. I should do better.
My one piece of advice
I’ve only got one bit of proper, real advice for researchers and research professionals and it has got nothing to do with research or academia and it is, I am sorry, only relevant to those privileged enough not to be shielding. Go for a walk outside. Or a run, or a cycle. If you can, you should. You won’t regret it. I seldom regret going for a run, and I never regret going for a walk. Around the park, around the block, whatever. Listen to nature or the streetscape, or put in your headphones, listen to your happy tunes at top volume or your favourite podcast, and stride purposefully like you’re five minutes late for a meeting on the other side of campus.
You may or may not feel better afterwards. But at least you’ll have been for a walk.
You’re applying for UK research council funding and suddenly you’re confronted with massive overhead costs. Adam Golberg tries to explain what you need to know.
Trying to explain Full Economic Costing is not straightforward. For current purposes, I’ll be assuming that you’re an academic applying for UK Research Council funding; that you want to know enough to understand your budget; and that you don’t really want to know much more than that.
If you do already know a lot about costing or research finances, be warned – this article contains simplifications, generalisations, and omissions, and you may not like it.
What are Full Economic Costs, and why are they taking up so much of my budget?
Full Economic Costs (fEC) are paid as part of UK Research and Innovation grants to cover a fair share of the wider costs of running the university – the infrastructure that supports your research. There are a few different cost categories, but you don’t need to worry about the distinctions.
Every UK university calculates its own overhead rates using a common methodology. I’m not going to try to explain how this works, because (a) I don’t know; and (b) you don’t need to know. Most other research funders (charities, EU funders, industry) do not pay fEC for most of their schemes. However, qualifying peer-reviewed charity funding does attract a hidden overhead of around 19% through QR funding (the same source as REF funding). But it’s so well hidden that a lot of people don’t know about it. And that’s not important right now.
How does fEC work?
In effect, this methodology produces a flat daily overhead rate to be charged relative to academic time on your project. This rate is the same for the time of the most senior professor and the earliest of early career researchers.
One effect of this is to make postdoc researchers seem proportionally more expensive. Senior academics are more expensive because of higher employment costs (salary etc), but the overheads generated by both will be the same. Don’t be surprised if the overheads generated by a full-time researcher are greater than her employment costs.
All fEC costs are calculated at today’s rates. Inflation and increments will be added later to the final award value.
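To make the mechanics concrete, here’s a minimal sketch. The daily rate below is entirely invented for illustration; real rates are institution-specific and come from your costings team.

```python
# Hypothetical flat daily overhead rate -- an invented figure.
# Every university calculates its own using the common methodology.
DAILY_OVERHEAD_RATE = 500  # pounds per day of academic time


def overheads(days_of_academic_time):
    """Overheads depend only on academic time, not on seniority or salary."""
    return days_of_academic_time * DAILY_OVERHEAD_RATE


# Twenty days of a senior professor's time and twenty days of an
# early career researcher's time generate identical overheads.
professor_overheads = overheads(20)  # 10000
postdoc_overheads = overheads(20)    # 10000
```

The employment cost lines will differ between the two, but the overhead lines won’t; that’s why a full-time postdoc’s overheads can exceed her salary.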
Do we have to charge fEC overheads?
Yes. This is a methodology that all universities use to make sure that research is funded properly, and there are good arguments for not undercutting each other. Rest assured that everyone – including your competitors – is playing by the same rules and ends up with broadly comparable rates. Reviewers are not going to be shocked by your overhead costs compared to rival bids. Your university is not shooting itself (or you) in the foot.
There are fairness reasons not to waive overheads. The point of Research Councils is to fund the best individual research proposals regardless of the university they come from, while the REF (through QR) funds broad, sustained research excellence based on historical performance. If we start waiving overheads, wealthier universities will have an unfair advantage, as they can waive while others drown.
Further, the budget allocations set by funders are decided with fEC overheads in mind. They’re expecting overhead costs. If your project is too expensive for the call, the problem is with your proposal, not with overheads. Either it contains activities that shouldn’t be there, or there’s a problem with the scope and scale of what you propose.
However, there are (major) funding calls where “evidence of institutional commitment” is expected. This could include a waiver of some overheads, but more likely it will be contributions in kind – some free academic staff time, a PhD studentship, new facilities, a separate funding stream for related work. Different universities have different policies on co-funding and it probably won’t hurt to ask. But ask early (because approval is likely to be complex) and have an idea of what you want.
What’s this 80% business?
This is where things get unnecessarily complicated. Costs are calculated at 100% fEC but paid by the research councils at 80%. This leaves the remaining 20% of costs to be covered by the university. Fortunately, there’s usually enough money from overheads to cover the missing 20% of direct costs. However, if you have a lot of non-pay costs and relatively little academic staff time, check with your costings team that the project is still financially viable.
Why 80%? In around 2005 it was deemed ‘affordable’ – a compromise figure intended to make a significant contribution to university costs but without breaking the bank. Again, you don’t need to worry about any of this.
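As a toy illustration of the 80% arithmetic (all figures invented, and the real cost categories are more fine-grained than this):

```python
# All figures invented for illustration only.
FEC_PERCENT_PAID = 80  # research councils pay 80% of full economic cost

staff_costs = 200_000    # direct employment costs (pounds)
non_pay_costs = 50_000   # equipment, travel, consumables
overhead_costs = 180_000  # indirect/estates costs at 100% fEC

total_fec = staff_costs + non_pay_costs + overhead_costs   # 430,000
funder_pays = total_fec * FEC_PERCENT_PAID // 100          # 344,000
university_covers = total_fec - funder_pays                # 86,000

# The 80% of overheads actually received usually exceeds the unfunded
# 20% of direct costs, so the books balance...
overhead_income = overhead_costs * FEC_PERCENT_PAID // 100       # 144,000
unfunded_direct = (staff_costs + non_pay_costs) * 20 // 100      # 50,000
# ...but with high non-pay costs and little academic time, that
# cushion shrinks -- hence the advice to check with your costings team.
```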
Can I game the fEC system, and if so, how?
Academic time is what drives overheads, so reducing academic time reduces overheads. One way to do this is to think about whether you really need as much researcher time on the project. If you really need to save money, could contracts finish earlier or start later in the project?
Note that non-academic time (project administrators, managers, technicians) does not attract overheads, and so is good value for money under this system. If some of the tasks you’d like your research associate to do are project management/administration tasks, your budget will go further if you cost in administrative time instead.
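A rough sketch of why, again with invented daily figures: swapping a day of researcher time for a day of administrator time saves both the salary difference and the entire overhead charge for that day.

```python
# Invented daily figures for illustration only.
DAILY_OVERHEAD = 500       # overhead per day of *academic* time (pounds)
researcher_day = 180       # daily employment cost of a researcher (invented)
administrator_day = 130    # daily employment cost of an administrator (invented)

# Researcher time attracts overheads; administrator time does not.
cost_of_researcher_day = researcher_day + DAILY_OVERHEAD   # 680
cost_of_administrator_day = administrator_day              # 130

saving_per_day_swapped = cost_of_researcher_day - cost_of_administrator_day  # 550
```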
However, if your final application has unrealistically low amounts of academic time and/or costs in administrators to do researcher roles, the panel will conclude that either (a) you don’t understand the resource implications of your own proposal; or (b) a lack of resources means the project risks being unable to achieve its stated aims. Either way, it won’t be funded. Funding panels are especially alert for ‘salami projects’ which include lots of individual co-investigators for thin slivers of time in which the programme of research cannot possibly be completed. Or for undercooked projects which put too much of a burden on not enough postdoc researcher time. As mentioned earlier, if the project is too big for the call budget, the problem is with your project.
The best way to game fEC is not to worry about it. If you have support with your research costings, you’ll be working with someone who can cost your application and advise you on where and how it can be tweaked and what costs are eligible. That’s their job – leave it to them, trust what they tell you, and use the time saved to write the rest of the application.
Thanks to Nathaniel Golden (Nottingham Trent) and Jonathan Hollands (University of Nottingham) for invaluable comments on earlier versions of this article. Any errors that remain are my own.
The relentless drive for research excellence has created a culture in modern science that cares exclusively about what is achieved and not about how it is achieved.
As I speak to people at every stage of a scientific career, although I hear stories of wonderful support and mentorship, I’m also hearing more and more about the troubling impact of prevailing culture.
People tell me about instances of destructive hyper-competition, toxic power dynamics and poor leadership behaviour – leading to a corresponding deterioration in researchers’ wellbeing. We need to cultivate, reward, and encourage the best while challenging what is wrong.
We know that Wellcome has helped to create this focus on excellence. Our aim has rightly been to support research with the potential to benefit society. But I believe that we now also have an important role to play in changing and improving the prevailing research culture. A culture in which, however unintentionally, it can be hard to be kind.
If we want science to be firing on all cylinders, we need everyone in the research system – individuals, institutions and funders – working in step to foster a positive working culture.
Which leads me to wonder what the role of research development and other research support professionals should be in moving towards a more positive research culture. I don’t know the answer, and this post is an open invitation to share your thoughts. I’ll pull these together into a crowd-sourced post with credit for those who want it and anonymity for those who don’t. This approach seemed to work well for a previous post around supporting a new academic discipline, so perhaps it will work here too.
I don’t want to say too much in this post, but as I’m asking others I should at least share a few indicative thoughts about areas to think about.
We should look at our own profession, our own culture, and how we treat each other. In my time in research development I’ve generally found it to be a supportive profession, both internally within the universities where I’ve worked, and (especially) externally through ARMA. However, I’m white, male, heterosexual, middle-aged, middle-class, so I’m very much playing on ‘easy mode‘. I don’t get mistaken for an administrator, and either I’m super diplomatic and great at influencing and persuading, or I get taken more seriously by some people because of my jackpot of categories of privilege. As I’ve alluded to on this blog before, I do have a slight stammer and have written about the challenges that can cause me, but it has seldom held me back and I don’t think it’s affected how I’m perceived.
In terms of our own profession and our own behaviour, the phrase “be the change you want to see in the world” came to mind. Although… when I went to google to find out who said it, I found an interesting blog post arguing that Mahatma Gandhi (to whom it is usually attributed) said and meant something rather different and much more challenging. It’s not simply about living our values, but reflecting on them and changing ourselves where necessary. As a philosopher by training I also thought about Aristotle and his writings on the importance of character and virtues – if you nurture the right character and the right virtues, the chances that you’ll respond in the right way when tested or under pressure will be higher. But how do we do that? Practice, reflection, courage, and learning from the example of others, both positive and negative.
Less esoterically, a second category of issues is around our role in supporting research and researchers, especially around grant getting and grant writing activity. Competition for funding, low success rates, increasingly long and complicated application forms, and pressure from university management form part of research culture. While we rarely have formal power or authority over academic staff, we do have a measure of influence on research culture. So how do we use that influence and our roles for good? What’s our role in preventing research excellence coming at the expense of those who make it happen – which includes us, in our small way? I’ll kick things off with three issues I’ve been thinking about recently…
Firstly, forwarding funding opportunities and supporting applications. When I send funding opportunities onto academics, am I guilty of unconscious bias? Am I committing the availability error and just emailing the first people who come to mind? Does that mean some people with certain characteristics are more likely to receive those emails than others? Does unconscious bias affect how I respond to tentative enquiries about opportunities, or about how I divide my time between proposals?
Honest answer is that I don’t know. But I’ve been influenced by the pushback against ‘manels’ (all male panels at conferences)… and if my funding opportunity distribution list looks like a manel, especially a white manel (because intersectionality is key) I’m taking time to stop and think about who I might have missed. Sometimes structural inequalities or call specifics mean that I got it right first time, but it’s worth a check.
Secondly, what’s our role around workload and work life balance? Could we do more to minimise the burden on researchers at all levels of seniority? Partly this is around efficiency and systems and processes, but partly I think there are cultural issues to consider too. I recently had a discussion with organisers of a research network which ran funding calls about the appropriateness of having a deadline of (something like) 23:59 on Sunday evening. The argument was that academics preferred this because it gave them more time than, say a Friday 4:00pm deadline. But it’s time over a weekend, and arguably this increases the expectation that academics work weekends. When do we set our internal deadlines for various tasks, from REF reviews to internal peer reviews to internal deadlines for draft applications? Do we assume that academic colleagues will be working weekends?
Thirdly, when we advise on the staffing of research projects, are we creating good jobs with fair salaries and training and career development opportunities? The issue of ‘good jobs’ on research projects (for academics and managers/administrators) was something that Wellcome brought up at a visit I attended a few weeks ago. I have to admit that under cost pressures on UKRI applications, there’s a strong incentive to try to cut researcher time as much as possible to reduce both employment costs and overheads. Of course, we should never over-cost for any post for any funder, but it’s likely that I’ve had a role in creating (potential) jobs that are lower quality than they might otherwise be.
That’s probably enough for now – this was supposed to be a short post. But this is an open invitation to email me with any thoughts you have about challenges we face, or steps we might take, in responding to the Wellcome Trust’s challenge to reimagine how we do research. I’ll be sharing this invitation via the ARMA Research Development email list and via Twitter for greater international reach.
I’ve recently moved from a role supporting the Business School and the School of Economics to a central role at the University of Nottingham, looking after our engagement with research charities. I’m going from a role where I know a few corners of the university very well to a role where I’m going to have to get to know more about much more of it.
My academic background (such as it is) is in political philosophy, and for most of my research development career I’ve been supporting (broadly) social sciences, with a few outliers. I’m now trying to develop my understanding of academic disciplines that I have little background or experience in – medical research, life sciences, physics, biochemistry etc. I suspect the answer is just time, practice, familiarity, confidence (and Wikipedia), but I found myself wondering if there are any short cuts or particularly good resources to speed things up.
Fortunately, if you’re a member of ARMA, you’re never on your own, and I sent an email around the Research Development Special Interest Group email list, with a promise (a) to write up contributions as a blog post and (b) to add some hints and tips of my own, especially for the social sciences.
So here goes… the collated and collected wisdom of the SIG… bookmark this post and revisit it if your remit changes…
Don’t panic… and focus on what you can do
In my original email, the first requirement I suggested was ‘time’, and that’s been echoed in a lot of the responses. “Time, practice, familiarity, confidence (and Wikipedia)” as Chris Hewson puts it. It’s easy to be overwhelmed by a sea of new faces and names and an alphabet soup of new acronyms – and to regard other people’s hard-won institutional/school/faculty knowledge as some kind of magical superpower.
Lorna Wilson suggests that disciplinary differences are overrated and “sometimes the narrative of ‘difference’ is what makes things harder. The skills and expertise we have as research development professionals are transferable across the board, and I think that the silos of disciplines led to a silo-ing of roles (especially in larger universities). With the changes in the external landscape and push with more challenge-led interdisciplinary projects, the silos of disciplines AND of roles I think is eroding.”
But there are differences in practices and norms – there are differences in terminology, outlook, career structures, internal politics, norms, and budget sizes – and I’m working hard trying not to carry social science assumptions with me. Though perhaps I’m equally likely to be too hesitant to generalise from social science experience where it would be entirely appropriate to do so.
Rommany Jenkins has “moved from Arts and Humanities to Life Sciences” and thinks that while “the perception might be that it’s the harder direction to go in because of the complexity of the subject matter […] it’s probably easier because the culture is quite straightforward […] although there are differences between translational / clinical and basic, the principles of the PI lab and team are basically the same”. She thinks that perhaps “it’s more of a culture shock moving into Arts and Humanities, because people are all so independently minded and come at things from so many different directions and don’t fit neatly into the funding boxes. […] I know a lot of people just find it totally bizarre that you can ask a Prof in Arts what they need in terms of costings and they genuinely don’t know.”
Charlotte Johnson moved in the opposite direction, from science to arts. “The shortcut was trying to find commonalities in how the different disciplines think and prepare their research. Once you realise that an artist and a chemist would go about planning their research project very similarly, and they only start to diverge in the experimental/interpretation stage, it does actually make it all quite easy to understand“
Muriel Swijghuisen Reigersberg says that her contribution “tends to be not so much on the science front, but on the social and economic or policy and political implications of the work STEMM colleagues are doing and recommendations around impact and engagement or even interdisciplinary angles to enquiries for larger projects.”
My colleague Liz Humphreys makes a similar (and very reassuring) point about using the same “skills to assess any bid by not focusing on the technical things but focus on all the other usual things that a bid writer can strengthen”. A lay summary that doesn’t make any lay sense is an issue regardless of discipline, as is a summary that doesn’t summarise and reads more like an introduction. Getting good at reviewing research grants can transcend academic disciplines. “If someone can’t explain to me what they’re doing,” says Claire Edwards, “then it’s unlikely to convince reviewers or a panel.”
Kate Clift makes a similar point: “When I am working in a discipline which is alien to me I tend to try and ground the proposed research in something which I do understand so I can appreciate the bigger picture, context etc. I will ask lots of ‘W’ questions – Why is it important? What do you want to do? Who is going to do it? Less illuminating to me in these situations is HOW they are going to do it”.
Roger Singleton Escofet makes the very sensible point that some subjects are very theoretical “where you will always struggle to understand what is being proposed”. I certainly found this with Economics – I could hope to try to understand what a proposed project did, but how it worked would always be beyond me. Reminds me a bit of this Armstrong and Miller sketch in which they demonstrate how not to do public engagement in theoretical physics.
Anne Onymous-Contributor says that “multidisciplinary projects are the best way to ease yourself into other disciplines and their own specific languages. My background is in social sciences but because of the projects I have worked on I have experience of, and familiarity with a range of arts and hard science disciplines and the languages they use. Broad, shallow knowledge accumulated on this basis can be very useful; sometimes specific disciplinary knowledge is less important than understanding connections between different disciplines, or the application of knowledge, which typically also tend to be the things which specialists miss.” I think this is a really good point – if we allow ourselves to include the other disciplines that we’ve supported as part of interdisciplinary bids, we may find we’ve more experience than we thought.
Finding the Shallow End, Producing your Cheat Sheet
Lorna Wilson suggests “[h]aving a basic understanding” of methodologies in different disciplines, “helps to demonstrate how [research questions] are answered and hypotheses evidenced, and I think breaks through some of the ‘difference’. What makes things slightly more difficult is also accessibility, in terms of language of disciplines, we could almost do with a cheat sheet in terms of terms!”
Richard Smith suggests identifying academics in the field who are effective and willing communicators “who appreciate the benefits and know the means of conveying approaches and fields to non-experts… and do it with enthusiasm”. Harry Moriarty’s experience has been that often ECRs and PhD students are a particularly good source – many are more willing to engage, and perhaps have more to benefit from our advice and support.
Muriel Swijghuisen Reigersberg suggests attending public lectures (rather than expert seminars), which will be aimed at the generalist, and notes that expert-novice conversations will benefit the academic expert in terms of practising explanations of complex topics for a generalist audience. I think we can all recognise academics who enjoy talking about their work to non-specialists and have a gift for explanation, and those who don’t enjoy it and haven’t that gift.
Other non-academic colleagues can help too, Richard argues – especially impact and public or business engagement staff working in that area, but also admin staff and School managers. Sanja Vlaisavljevic wanted to “understand how our various departments operate, not just in terms of subject-matter but the internal politics”. This is surely right – I’m sure we’re all aware of historical disagreements or clashes between powerful individuals or whole research groups/Schools that stand in the way of certain kinds of collaboration or joint working. Whether we work to try to erode these obstructions or navigate deftly around them, we need to know that they’re there.
Caroline Moss-Gibbons adds librarians to the list, citing their resource guides and access/role with the university repository. Claire Edwards observes that many research development staff have particular academic backgrounds that might be useful.
Don’t try to fake it till you make it
“Be open that you’re new to the area, but if they’re looking for funding they need to be able to explain their research to a non-specialist” says Jeremy Barraud.
I’ve always found that a full, frank, and even cheerful confession of a lack of knowledge is very effective. I often include a blank slide in presentations to illustrate what I don’t know. My experience is that admitting what I don’t know earns me a better hearing on matters that I do know about (as long as I do both together), but I’m aware that as a straight, white, middle aged, middle class male perhaps that’s easier for me to do. I’ve suspected for some time now that being male (and therefore less likely to be mistaken for an “administrator”) means I’m probably playing research development on easy mode. There’s an interesting project around EDI and research development that I’m probably not best placed to do.
While no-one is arguing for outright deception, I’ve heard it argued that frank admissions of ignorance about a particular topic area may make it harder to engage academic colleagues and to find out more. If academic colleagues make certain assumptions about background, perhaps try to live up to those with a bit of background reading. It’s easy to be written off and written out, which then makes it harder to learn later.
I always think half the battle is convincing academic colleagues that we’re on their side and the side of their research (rather than, say, motivated by university income targets or an easier life), and perhaps it’s easy to underestimate the importance of showing an interest and a willingness to learn. Asking intelligent, informed, interested lay questions of an expert – alongside demonstrating our own expertise in grant writing etc – is one way to build relationships. My own experience with my MPhil is that research can be a lonely business, and so an outsider showing interest and enthusiasm – rather than their eyes glazing over and disengaging – can be really heartening.
Kate Clift makes an important point about combining any admissions of relative ignorance with a stress on what she can do/does know/can contribute. “I’m always very upfront with people and say I don’t have an understanding of their research but I do understand how to craft a submission – that way everyone plays to their strengths. I can focus on structure and language and the academic can focus on scientific content.”
Find a niche, get involved, be visible
For Jeremy Barraud, that was being secretary for an ethics committee. In my early days with Economics, it was supporting the production of the newsletter and writing research summaries – even though it wasn’t technically part of my remit, it was a great way to get my name known, get to know people, and have a go at summarising Economics working papers.
Suzannah Laver is a research development manager in a Medical School, but has a background in project management and strategy rather than medicine or science. For her it was “just time” and getting involved “[a]ttending the PI meetings, away days, seminars, and arranging pitching events or networking events.” Mary Caspillo-Brewer adds project inception meetings and dissemination events to the list, and also suggests attending academic seminars and technical meetings (as does Roger Singleton Escofet), even if they’re aimed at academics. This is great in terms of visibility and in terms of evidence of commitment – sending a message that we’re interested and committed, even if we don’t always entirely understand.
Mark Smith suggests visiting research labs or clinics, however terrifying they may first appear. So far I’ve only met academics in their offices – I’m not sure I trust myself anywhere near a lab. I’m still half-convinced I’ll knock over the wrong rack of test tubes and trigger a zombie epidemic. But lab visits are perhaps something I could do more of in the future when I know people better. And as Mark says, taking an interest is key.
Do your homework
I’ve blogged before about the problems with the uses and abuses of successful applications, but Nat Golden is definitely onto something when he suggests reading successful applications to look at good practice and what the particular requirements of a funder are. Oh, and reading the guidance notes.
Roger Singleton Escofet (and others) have mentioned that the Royal Society and Royal Academy of Engineering produce useful reports that “may be technical but offer good overviews on topical issues across disciplines. Funders such as research councils or Wellcome may also be useful sources since funders tend to follow (or set) the emerging areas.” Hilary Noone also suggests looking to the funders for guidance – trying to “understand the funders real meaning (crucial for new programmes and calls where they themselves are not clear on what they are trying to achieve)”.
There’s a series of short ‘Bluffer’s Guide’ books which are somewhat dated, but potentially very useful. Bluff your way in Philosophy was on my undergraduate reading list. Bluff your way in Economics gave me an excellent grounding when my role changed, and explained (among many other things) the difference between exogenous and endogenous factors. When supporting a Geography application, I learned the difference between pluvial and fluvial flooding. These little things make a difference, and it’s probably the absence of that kind of basic grounding for many disciplines that I’m now supporting that’s making me feel uneasy. In a good way.
Harry Moriarty argues that it’s more complicated than just reading Wikipedia – the work he supported “was necessarily at the cutting edge and considerably beyond the level that I could get to in a sensible order – I had to take the work and climb back through the Wikipedia pages in layers, and then, once I had some underpinning knowledge, go back through the same pages in light of my new understanding”.
Specific things to do
“Become an NIHR Public Reviewer”, says Jeremy Barraud. “It’s easy to sign up and they’re keen to get more reviewers. Being on the other side of the funding fence gives a real insight into how decisions are reached (and bolsters your professional reputation when speaking with researchers). “
I absolutely second this – I’ve been reviewing for NIHR for some time and just finished a four year term as a patient/public representative on a RfPB panel. I’d recommend doing this not just to gain experience of new research areas, but as a valuable public service that you as a research development professional can perform. If you’ve got experience of a health condition, using NHS services (as a patient or carer), and you’re not a healthcare professional or researcher, I’m sure they’d love to hear from you.
Being a research participant, argues Jeremy Barraud, is “professionally insightful and personally fulfilling. The more experience you have on research in all its different angles, the better your professional standing”. This is also something I’ve done – in many ways it’s hard not to get involved in research if you’re hanging around a university. I’m part of a study looking at running and knee problems, and I’ve recently been invited to participate in another study.
Bonhi Bhattacharya registered for a MOOC (Massive Open Online Course) – an “Introduction to Ecology” – Bonhi is a mathematician by training – “and it was immensely helpful in getting a grounding in the subject, as well as a useful primer in terminology.“ It can be a bit of a time commitment, but they’re also fascinating – and as above, really shows willing. I wrote about my experience with a MOOC on behavioural economics in a post a few years ago. Bonhi also suggests reading academics’ papers – even if only the introduction and conclusion.
Subscribe to The Conversation, says Claire Edwards, it’s “a great source of academic content aimed at a non-specialist audience”. In a similar vein, Helen Walker recommends the Wellcome-funded website Mosaic which is “great for stories that give the bigger picture ‘around’ science/research – sometimes research journeys, sometimes stories showing the broader context of science-related research.” Both Mosaic and The Conversation have podcast companions. Recent Conversation podcast series have looked at the Indian elections and moon exploration.
I’m a huge fan of podcasts, and there are loads that can help with gaining a basic understanding of new academic areas – in addition to being interesting (and sometimes amusing).
A quick search of the BBC has identified three science podcasts I should think about listening to – The Science Hour, Discovery, and BBC Inside Science. Very open to other suggestions – please tweet me or let me know in the comments/via email.
A huge thank you to all contributors:
I’m very grateful to everyone for their comments. I’ve not been able to include everything everyone said, in the interests of avoiding duplication/repetition and in the interests of keeping this post to a manageable length.
I don’t think there’s any great secret to success in supporting a new discipline or working in research development in a new institution – it’s really a case of remembering and repeating the steps that worked last time. And hopefully this blog post will serve as a reminder to others, as it is doing to me.
Jeremy Barraud is Deputy Director, Research Management and Administration, at the University of the Arts, London.
Bonhi Bhattacharya is Research Development Manager at the University of Reading
Mary Caspillo-Brewer is Research Coordinator at the Institute for Global Health, University College London
Kate Clift is Research Development Manager at Loughborough University
Anne Onymous-Contributor is something or other at the University of Redacted
Claire Edwards is Research Bid Development Manager at the University of Surrey.
Adam Forristal Golberg is Research Development Manager (Charities), at the University of Nottingham
Nathanial Golden is Research Development Manager (ADHSS) at Nottingham Trent University
Chris Hewson is Social Science Research Impact Manager at the University of York
Liz Humphreys is Research Development Manager for Life Sciences, University of Nottingham
Rommany Jenkins is Research Development Manager for Medical and Dental Sciences, University of Birmingham.
Charlotte Johnson is Senior Research Development Manager, University of Reading
Suzannah Laver is Research Development Manager at the University of Exeter Medical School
Harry Moriarty is Research Accelerator Project Manager at the University of Nottingham.
Caroline Moss-Gibbons is Parasol Librarian at the University of Gibraltar.
Hilary Noone is Project Officer (REF Environment and NUCoREs) at the University of Newcastle
Roger Singleton Escofet is Research Strategy and Development Manager for the Faculty of Science, University of Warwick.
Mark Smith is Programme Manager – The Bloomsbury SET, at the Royal Veterinary College
Richard Smith is Research and Innovation Funding Manager, Faculty of Arts, Humanities and Social Sciences, Anglia Ruskin University.
Muriel Swijghuisen Reigersberg is Researcher Development Manager (Strategy) at the University of Sydney.
Sanja Vlaisavljevic is Enterprise Officer at Goldsmiths, University of London
Helen Walker is Research and Innovation Officer at the University of Portsmouth
Lorna Wilson is Head of Research Development, Durham University
I’m writing this in the final week of my current role as Research Development Manager (Social Sciences) at the University of Nottingham before I move to my role as Research Development Manager (Research Charities) at the University of Nottingham. This may or may not change the focus of this blog, but I won’t abandon the social sciences entirely – not least because I’m stuck with the web address.
I’ve been thinking about strategies and approaches to research funding, and the place and prioritisation of applying for research grants in academic structures. It’s good for institutions to be ambitious in terms of their grant-getting activities. However, these ambitions need to be at least on a nodding acquaintance with: (a) the actual amount of research funding historically available to any given discipline; and (b) the chances of any given unit, school, or individual competing successfully for that funding, given the strength of the competition.
To use a football analogy, if I want my team to get promotion, I should moderate my expectations in the light of how many promotion places are available, and how strong the likely competition for those limited spots will be. In both cases, we want to set targets that are challenging, stretching, and ambitious, but which are also realistic and informed by the evidence.
How do we do that? Well, in a social science context, a good place to start is the ESRC success rates, and other disciplines could do worse than take a similar approach with their most relevant funding council. The ESRC produce quite a lot of data and analysis on funding and success rates, and Alex Hulkes of the ESRC Insights team writes semi-regular blog posts. Given the effort put into creating and curating this information, it seems only right that we use it to inform our strategies. This level of transparency is a huge (and very welcome) change from previous practice, when very limited information was available and rather hidden away. Obvious caveats – the ESRC is by no means the only funder in town for the social sciences, but they’ve got the deepest pockets and offer the best financial terms. Another (and probably better) way would be to compare HESA research income stats, but let’s stick to the ESRC for now.
The table below shows the running three-year total (2015/16 to 2017/18) of awards and number of applications for each discipline for all calls, and the total for the period 2011/12 to 2017/18. You can access the data for yourself on the ESRC web page. This data is linked as ‘Application and success rate data (2011-12 to 2017-18)’ and was published in ODS format in May 2018. For ease of reading I’ve hidden the results from individual years.
Lots of caveats here. Unsuccessful outline proposals aren’t included (as no outline application leads directly to funding), but ‘office rejects’ (often for eligibility reasons) are. The ‘core discipline’ of each application is taken into account – secondary disciplines are not. The latest figures here are from 2017-2018 (financial year), so there’s a bit of a lag – in particular, the influence of the Global Challenges Research Fund (GCRF) or Industrial Strategy Challenge Fund (ISCF) will not be fully reflected in these figures. I think the ‘all data’ figures may include now-defunct schemes such as the ESRC Seminar Series, though I think Small Grants had largely gone by the start of the period covered by these figures.
Perhaps most importantly, because these are the results for all schemes, they include targeted calls which will rarely open to all disciplines equally. Fortunately, the ESRC also publishes similar figures for their open call (Standard) Research Grants scheme for the same time period. Note that (as far as I can tell) the data above includes the data below, just as the ‘all data’ column (which goes back to 2011/2) also includes the three year total.
This table is important because the Research Grants Scheme is bottom-up, open-call, and open to any application that’s at least 50% social sciences. Any social science researcher could apply to this scheme, whereas directed calls will inevitably appeal only to a subset. These are the chances/success rates for those whose work does not fit squarely into a directed scheme, and could arguably be regarded as a more accurate measure of disciplinary success rates. It’s worth noting that a specific call that’s very friendly to a particular discipline is likely to boost the successes, but may decrease the disciplinary success rate if it attracts a lot of bids. It’s also possible that major targeted calls that are friendly to a particular discipline may result in fewer bids to open call.
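That last point is just arithmetic, but it's easy to miss. Here's a toy illustration (with entirely invented numbers, not real ESRC data) of how a discipline-friendly targeted call can increase the number of awards while lowering the headline success rate:

```python
# Toy illustration with made-up figures (not real ESRC data):
# a targeted call adds awards but can drag down the success rate
# if it attracts proportionally more applications.

def success_rate(awards, applications):
    """Headline success rate: funded awards divided by applications."""
    return awards / applications

# Open call only: 5 awards from 40 applications.
open_only = success_rate(5, 40)            # 12.5%

# Now add a discipline-friendly targeted call: 6 extra awards,
# but it attracts 80 extra applications.
with_targeted = success_rate(5 + 6, 40 + 80)  # ~9.2%

print(f"open call only: {open_only:.1%}")
print(f"with targeted call: {with_targeted:.1%}")
```

More awards, lower rate: which is why the open-call table arguably gives a cleaner read on disciplinary chances than the all-schemes figures.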
To be fair, there are a few other regular ESRC schemes that are similarly open and should arguably be included if we wanted to look at the balance of disciplines and what a discipline target might look like. The New Investigator Scheme is open in terms of academic discipline, if not in time-since-PhD, and the Open Research Area call is open in terms of discipline if not in terms of collaborators. The Secondary Data Analysis Initiative is similarly open in terms of discipline, if not in terms of methods. Either way, we don’t have (or I can’t find) data which combines those schemes into a non-directed total.
Nevertheless, caveats and qualifications aside, I think these two tables give us a good sense of the size of prize available for each discipline. There are approximately 29 funded projects per year (of which 5 open call) for Economics, and 11 per year (of which 2 open call) for Business and Management. Armed with that information and a knowledge of the relative strength of the discipline/school in our own institution, we ought to get a sense of what a realistic target might look like and a sense of how well we’re already doing. Given what we know about our expertise, eminence, and environment, and the figures for funded projects, what ought our share of those projects be?
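As a rough illustration of that back-of-envelope calculation – with a hypothetical application count and institutional share, not real figures – the arithmetic for a baseline target might look something like this:

```python
# Hypothetical target-setting sketch. The 29 funded Economics projects
# per year comes from the post; the national application count (250)
# and our own submission count (10) are invented for illustration.

def realistic_target(funded_per_year, our_applications, total_applications):
    """Expected awards per year if our applications succeed at the
    national average rate, i.e. our share of the national pot."""
    our_share = our_applications / total_applications
    return funded_per_year * our_share

expected = realistic_target(29, 10, 250)
print(round(expected, 2))  # roughly 1.16 awards per year
```

A baseline like this assumes we're exactly average; knowledge of our own expertise, eminence, and environment is what justifies setting the stretch target above (or honestly, below) that number.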
We could ask a further question about how those successes are distributed between universities, and about any correlation between successes and (unofficial) subject league tables from the last REF, calculated on the basis of Grade Point Average or Research Power. However, even if that data were available, we’d be looking at small numbers. We do know that the ESRC have done a lot of work on looking at funding distribution and concentration, and their key findings are that:
ESRC peer review processes do not concentrate funding to a degree greater than that apparent in the proposals that request the funding.
ROs which apply infrequently appear to have lower success rates than do those which are more active applicants
In other words, most universities typically have comparable success rates, except that those that apply more often do a little better than average and those that apply rarely do a little worse. This sounds intuitively right – those who apply more are likely more research-active, at least in the social sciences, and therefore more likely to generate stronger applications. But this is at an overall level, not discipline level.
I’d also note that we shouldn’t only measure success by the number of projects we lead. As grants get larger on average, there’s more research income available for co-investigators on bids led elsewhere. I think a strategy that focuses only on leading bids and being lead institution neglects the opportunities offered by being involved in strong bids led by world-class researchers based elsewhere. I’m sure it’s not unusual for co-I research income to exceed PI income for academic units.
I’ve not made any comment about the different success rates for different disciplines. I’ve written about this already for many of the years covered by the full data (though Alex Hulkes has done this far more effectively over the last few years, having the benefit of actual data skills) and I don’t really want to cover old ground again. The same disparities continue much as before. Perhaps GCRF will provide a much-needed boost for Education research (or at least the international aspects) and ISCF for management and business research.