Using Social Media to Support Research Management – ARMA training and development event

Last week I gave a brief presentation at a training and development event organised by ARMA (Association of Research Managers and Administrators) entitled ‘Using Social Media to Support Research Management’. Also presenting were Professor Andy Miah of the University of Salford, Sierra Williams of the LSE Impact of Social Sciences blog, Terry Bucknell of Altmetric, and Phil Ward of Fundermentals and the University of Kent. A .pdf of my wibblings as inflicted can be found here.

I guess there are three things from the presentation and from the day as a whole that I’d pick out for particular comment.

Firstly, if you’re involved in research management/support/development/impact, then you should be familiar with social media, and by familiar I don’t mean just knowing the difference between Twitter and Friends Reunited – I mean actually using it. That’s not to say that everyone must or should dash off and start a blog – for one thing, I’m not sure I could handle the competition. But I do think you should have a professional presence on Twitter. And I think the same applies to any academics whose research interests involve social media in any way – I’ve spoken to researchers wanting to use Twitter data who are not themselves on Twitter. Call it a form of ethnography if you like (or, probably better, action research) – I think you only really understand social media by getting involved. You should “inhabit the ecosystem”, as Andy Miah put it in a quite brilliant presentation that you should definitely make time to watch.

I’ve listed some of the reasons for getting involved, and some of the advantages and challenges, in my presentation. But briefly, it’s only by using it and experiencing for yourself the challenge of finding people to follow, getting followers, getting attention for the messages you want to transmit, risking putting yourself and your views out there that you come to understand it. I used to just throw words like “blog” and “twitter” and “social media engagement” around like zeitgeisty confetti when talking to academic colleagues about their various project impact plans, without understanding any of it properly. Now I can talk about plans to get twitter followers, strategies to gain readers for the project blog, the way the project’s social media presence will be involved in networks and ecosystems relevant to the topic.

One misunderstanding that a lot of people have is that you have to tweet a lot of original content – in fact, it’s better not to. Andy mentioned a “70/30” rule – 70% other people’s stuff, 30% yours, as a rough rule of thumb. Even if your social media presence is just as a kind of curator – finding and retweeting interesting links and making occasional comments – you’re still contributing and you’re still part of the ecosystem, and if your interests overlap with mine, I’ll want to follow you because you’ll find things I miss. David Gauntlett wrote a really interesting article for the LSE impact blog on the value of “publish, then filter” systems for finding good content, which is well worth a read. Filtering is important work.

The second issue I’d like to draw out is an issue around personal and professional identity on Twitter. When Phil Ward, Julie Northam, David Young and I gave a presentation on social media at the ARMA conference in 2012, many delegates were already using Twitter in a personal capacity, but were nervous about mixing the personal and professional. I used to think this was much more of a problem/challenge than I do now. In last week’s presentation, I argued that there were essentially three kinds of Twitter account – the institutional, the personal, and what I called “Adam at work”. Institutional wears a shirt and tie and is impersonal and professional. Personal is sat in its pants on the sofa tweeting about football or television programmes or politics. Adam-at-work is more ‘smart casual’ and tweets about professional stuff, but without being so strait-laced as the institutional account.

Actually Adam-at-Work (and, for that matter, You-at-Work) are not difficult identities to work out and to stick to. We all manage it every day. We’re professional and focused and on-topic, but we also build relations with our office mates and co-workers, and some of that relationship building is through sharing weekend plans, holidays, interests etc. I want to try to find a way of explaining this without resorting to the words “water cooler” or (worse) “banter”, but I’m sure you know what I mean. Just as we need to show our human sides to bond with colleagues in everyday life, we need to do the same on Twitter. Essentially, if you wouldn’t lean over and tell it to the person at the desk next to you, don’t tweet about it. I think we’re all well capable of doing this, and we should trust ourselves to do it. By all means keep a separate personal Twitter account (because you don’t want your REF tweets to send your friends to sleep) and use that to shout at the television if you’d like to.

I think it’s easy to exaggerate the dangers of social media, not least because of regular stories about people doing or saying something ill-advised. But it’s worth remembering that a lot of those people are famous or noteworthy in some way, and so attract attention and provocation in a way that we just don’t. While a footballer might get tweeted all kinds of nonsense after a poor performance, I’m unlikely to get twitter-trolled by someone who disagrees with something I’ve written, or booed while catching a train. Though I do think a football crowd style crescendo of booing might be justified in the workplace for people who send mass emails without the intended attachment/with the incorrect date/both.

Having said all that… this is just my experience, and as a white male it may well be that I don’t attract that kind of negative attention on social media. I trust/hope that female colleagues have had similar positive experiences and I’ve no reason to think they haven’t, but I don’t want to pass off my experience as universal. (*polishes feminist badge*).

The third thing is to repeat an invitation which I’ve made before – if anyone would like to write a guest post for my blog on any topic relevant to its general themes, please do get in touch. And if anyone has any questions about Twitter, blogging, or social media that they think I might have a sporting chance of answering, please ask away.

Grant Writing Mistakes part 94: The “Star Wars”

Have you seen Star Wars?  Even if you haven’t, you might be aware of the iconic opening scene, and in particular the scrolling text that begins

“A long time ago, in a galaxy far, far away….”

(Incidentally, this means that the Star Wars films are set in the past, not the future. Which is a nice bit of trivia and the basis for a good pub quiz question).  What relevance does any of this have for research grant applications?  Patience, Padawan, and all will become clear.

What I’m calling the “Star Wars” error in grant writing is starting the main body of your proposal from the position of “A long time ago…”, before going on to review the literature at great length, quoting everything that calls for more research, and in general taking a lot of time and space to lay the groundwork and justify the research – all without yet telling the reader what it’s about, why it’s important, or why it’s you and your team that should do it.

This information about the present project will generally emerge in its own sweet time, but often not until two thirds of the way through the available space. What then follows is a rushed exposition with inadequate detail about the research questions and about the methods to be employed. The reviewer is left with an encyclopaedic knowledge of all that went before, of the academic origin story of the proposal, but precious little about the project for which funding is being requested. And without a clear and compelling account of what the project is about, the chances of getting funded are pretty much zero. Reviewers will not unreasonably want more detail, and may speculate that its absence is an indication that the applicants themselves aren’t clear what they want to do.

Yes, an application does need to locate itself in the literature, but this should be done quickly, succinctly, clearly, and economically as regards the space available. Depending on the nature of the funder, I’d suggest not starting with the background, but instead opening with what the present project is about, and then zooming out to locate it in the literature once the reader knows what it is that’s being located. Certainly if your background/literature review section takes up more than a quarter of the available space, it’s too long.

(Although I think “the Star Wars” is a defensible name for this grant application writing mistake, that’s only because of the words “A long time ago, in a galaxy far, far away….”. Actually the scrolling text is a really elegant, pared-down summary of what the viewer needs to know to make sense of what follows… and then we’re straight into planets, lasers, a fleeing spaceship and a huge Star Destroyer that seems to take forever to fly through the shot.)

In summary, if you want the best chance of getting funded, you should, er… restore balance to the force…. of your argument. Or something.

Six writing habits I reckon you ought to avoid in grant applications…..

There are lots of mistakes to avoid in writing grant applications, and I’ve written a bit about some of them in previous posts (see “advice on grant applications” link above). This one is more about writing habits. I read a lot of draft grant applications, and as a result I’ve got an increasingly long list of writing quirks, tics, habits, styles and affectations that Get On My Nerves.

Imagine I’m a reviewer… Okay, I’ll start again.. imagine I’m a proper reviewer with some kind of power and influence…. imagine further that I’ve got a pile of applications to review that’s as high as a high pile of applications.  Imagine how well disposed I’d feel towards anyone who makes reading their writing easier, clearer, or in the least bit more pleasant.  Remember how the really well-written essays make your own personal marking hell a little bit less sulphurous for a short time.  That.  Whatever that tiny burst of goodwill – or antibadwill – is worth, you want it.

The passive voice is excessively used

I didn’t know the difference between active and passive voice until relatively recently, and if you’re also from a generation where grammar wasn’t really taught in schools then you might not either. Google is your friend for a proper explanation by people who actually know what they’re talking about, and you should probably read that first, but my favourite explanation is from Rebecca Johnson – if you can add “by zombies” after the verb, then it’s passive voice. I’ve also got the beginnings of a theory that the Borg from Star Trek use the passive voice, and that’s one of the things that makes them creepy (“resistance is futile” and “you will be assimilated”), but I don’t know enough about grammar or Star Trek to make a case for this. Sometimes the use of the passive voice (by zombies) is appropriate, but often it makes for distant and slightly tepid writing. Consider:

A one day workshop will be held (by zombies) at which the research findings will be disseminated (by zombies).  A recording of the event will be made (bz) and posted on our blog (bz).  Relevant professional bodies will be approached (bz)…

This will be done, that will be done. Yawn. Although, to be fair, a workshop with that many zombies probably won’t be a tepid affair. But much better, I think, to take ownership… we will do these things, co-Is A and B will lead on X. Academic writing seems to encourage depersonalisation and formality and distancing (which is why politicians love it – “mistakes were made [perhaps by zombies, but not by me]”).

I think there are three reasons why I don’t like it.  One is that it’s just dull.  A second is that I think it can read like a way of avoiding detail or specifics or responsibility for precisely the reasons that politicians use it, so it can subconsciously undermine the credibility of what’s being proposed.  The third reason is that I think for at least some kinds of projects, who the research team are – and in particular who the PI is – really matters.  I can understand the temptation to be distant and objective and sciency as if the research speaks entirely for itself.  But this is your grant application, it’s something that you ought to be excited and enthused by, and that should come across. If you’re not, don’t even bother applying.
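For anyone who likes their grammar rules executable, the “by zombies” test can even be sketched as a toy script. To be clear, this is purely my own illustration – the function name and the naive “will be <word>” pattern are my assumptions, not anything from Rebecca Johnson or the presentations – and real passive-voice detection would need a proper parser:

```python
import re

def zombify(text: str) -> str:
    """Toy 'by zombies' test: tack '(by zombies)' onto anything matching
    'will be <word>', the construction used in the dissemination-plan
    example above. Deliberately naive - it catches only this one pattern."""
    return re.sub(r"(\bwill be \w+)", r"\1 (by zombies)", text)

print(zombify("A one day workshop will be held at which the "
              "research findings will be disseminated."))
# → "A one day workshop will be held (by zombies) at which the
#    research findings will be disseminated (by zombies)."
```

If the sentence still reads grammatically with the zombies added, it was passive; active constructions (“we will hold a workshop”) sail through untouched.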

First Person singular, First Person plural, Third Person

Pat Thomson’s blog Patter has a much fuller and better discussion about the use of  “we” and “I” in academic writing that I can’t really add much to. But I think the key thing is to be consistent – don’t be calling yourself Dr Referstoherselfinthethirdperson in one part of the application, “I” in another, “the applicant” somewhere else, and “your humble servant”/ “our man in Havana” elsewhere.  Whatever you choose will feel awkward, but choose a consistent method of awkwardness and have done with it. Oh, and don’t use “we” if you’re the sole applicant.  Unless you’re Windsor (ii), E.

And don’t use first names for female team members and surnames for male team members. Or, worse, first names for women, titles and surnames for men. I’ve not seen this myself, but I read about it in a tweet with the hashtag #everydaysexism.

Furthermore and Moreover…

Is anyone willing to mount a defence for the utility of either of these words, other than (1) general diversity of language and (2) padding out undergraduate essays to the required word count? I’m just not sure what either of these words actually means or adds, other than perhaps as an attempted rhetorical flourish, or, more likely, a way of bridging non-sequiturs or propping up poor structuring.

“However” and “Yet”…. I’ll grudgingly allow to live.  For now.

Massive (Right Justified) Wall-o-Text

Few things make my heart sink more than having to read a draft application that regards the use of paragraphs and other formatting devices as illustrative of a lack of seriousness and rigour. There is a distinction between densely argued and just dense. Please make it easier to read… and that means not using right hand justification. Yes, it has a kind of superficial neatness, but it makes the text much less readable.

Superabundance of Polysyllabic Terminology

Too many long words. It’s not academic language and (entirely necessary) technical terms and jargon that I particularly object to – apart from in the lay summary, of course.  It’s a general inflation of linguistic complexity – using a dozen words where one will do, never using a simple word where a complex one will do, never making your point twice when a rhetorically-pleasing triple is on offer.

I guess this is all done in an attempt to make the application or the text seem as scholarly and intellectually rigorous as possible, and I think students may make similar mistakes.  As an undergraduate I think I went through a deeply regrettable phase of trying to ape the style of academic papers in my essay writing, and probably made myself sound like one of the most pompous nineteen year olds on the planet.

If you find yourself using words like “effectuate”, you might want to think about whether you might be guilty of this.

Sta. Cca. To. Sen. Ten. Ces.

Varying and manipulating sentence length can be done deliberately to produce certain effects. Language has a natural rhythm and pace. Most people probably have some awareness of what that is. They are aware that sentences which are one paced can be very dull. They are aware that there is something tepid about this paragraph. But not everyone can feel the music in language. I think it is a lack of commas that is killing this paragraph. Probably there is a technical term for this.

So… anyone willing to defend “moreover” or “furthermore”? Any particularly irritating habits I’ve missed? Anyone who actually knows any grammar or linguistics willing to provide technical terms for any of these habits?

Meanwhile, over at the ESRC…

There have been a few noteworthy developments at the ESRC over the summer months which I think are probably worth drawing together into a single blog post for those (like me) who’ve made the tactical error of choosing to have some time off over the summer.

1.  The annual report

I’ve been looking forward to this (I know, I know….) to see whether there’s been any substantial change to the huge differences in success rates between different academic disciplines. I wrote a post about this back in October and it’s by some distance the most read article on my blog. Have there been any improvements since 2011/12, when Business and Management had 1 of 68 applications funded and Education 2 of 62, compared to Socio-Legal Studies (39%, 7 of 18) and Social Anthropology (28%, 5 of 18)?

Sadly, we still don’t know, because this information is nowhere to be found in the annual report. We know the expenditure by region and the top 11 (sic) recipients of research expenditure, research and training expenditure, and the two combined.  But we don’t know how this breaks down by subject.  To be fair, that information wasn’t published until October last year, and so presumably it will be forthcoming.  And presumably the picture will be better this year.

That’s not to say that there’s no useful information in the annual report. We learn that the ESRC Knowledge Exchange Scheme has a very healthy success rate of 52%, though I think I’m right in saying that the scheme will have been through a number of variations in the period in question. Historically it’s not been an easy scheme to apply for, partly because of the need for co-funding from research partners, and partly because of a number of very grey areas around costing rules.

For the main Research Grants Scheme, success rates are also up, though by how much is unclear. The text of the report (p. 18) states that

After a period where rates plummeted to as low as 11 per cent, they have now risen to 35 per cent, in part because we have committed additional funding to the scheme [presumably through reallocation, rather than new money] but also because application volume has decreased. This shows the effects of our demand management strategy, with HEIs now systematically quality assuring their applications and filtering out those which are not ready for submission. We would encourage HEIs to continue to develop their demand management strategies as this means academics and administrators in both HEIs and the ESRC have been able to focus efforts on processing and peer-reviewing a smaller number of good quality applications, rather than spending time on poor quality proposals which have no chance of being funded.

Oddly the accompanying table gives a 27% success rate, and unfortunately (at the time of writing) the document with success rates for individual panel meetings hasn’t been updated since April 2012, and the individual panel meeting documents only list funded projects, not success rates. But whatever the success rate is, it does appear to be a sign that “demand management” is working and that institutions are practising restraint in their application habits.  Success rates of between a quarter and a third sound about right to me – enough applications to allow choice, but not so many as to be a criminal waste of time and effort.

The report also contains statistics about the attendance of members at Council and Audit Committee Meetings, but you’ll have to look them up for yourself as I have a strict “no spoilers” policy on this blog.

I very much look forward – and I think the research community does too – to seeing the success rates by academic discipline at a later date.

2. A new Urgency Grants Mechanism

More good news… a means by which research funding decisions can be taken quickly in response to the unexpected and significant. The example given is the riots of summer 2011, and I remember thinking that someone would get a grant out of all this as I watched TV pictures of my former stomping ground of Croydon burning. But presumably less… explosive unexpected opportunities might arise too. All this seems only sensible, and allows a way for urgent requests to be considered in a timely and transparent manner.

3. ESRC Future Research Leaders call

But “sensible” isn’t a word I’d apply to the timing of this latest call.  First you’ve heard of it?  Well, better get your skates on because the deadline is the 24th September. Outline applications?  Expressions of interest?  Nope, a full application.  And in all likelihood, you should probably take your skates off again because chances are that your institution’s internal deadlines for internal peer review have already been and gone.

The call came out on or about the 23rd July, with a deadline of 24th September. Notwithstanding what I’ve said previously about no time of the academic year being a good time to get anything done, it’s very hard to understand why this happened.  Surely the ESRC know that August/September is when a lot of academic staff (and therefore research support) are away from the university on a mixture of annual leave and undertaking research.  Somehow, institutions are expected to cobble together a process of internal review and institutional support, and individuals are expected to find time to write the application.  It’s hard enough for the academics to write the applications, but if we take the demand management agenda seriously, we should be looking at both the track record and the proposed project of potential applicants, thinking seriously about mentoring and support, and having difficult conversations with people we don’t think are ready.  That needs a lot of senior time, and a lot of research management time.

This scheme is a substantial investment – effectively 70 projects worth up to £250k (at 80% fEC). Given that the Small Grants scheme and British Academy Fellowship success rates are tiny, this is really the major opportunity to be PI on a substantial project. This scheme is overtly picking research leaders of the future, but the timetable means that it’s picking those leaders from those who didn’t have holiday booked in the wrong couple of weeks, or who could clear their diaries to write the application, or who don’t have a ton of teaching to prepare for – which is most early career academics, I would imagine.

Now it might be objected that we should have known that the call was coming. Well… yes and no. The timing was similar last year, and it was tight then, but it’s worse this year – it was announced on about the same date, but with a deadline of 4th October, almost two working weeks later. Two working weeks that turn it from a tall order into something nigh on impossible, and which can only favour those with lighter workloads in the run-up to the new academic year. And even knowing that it’s probably coming doesn’t help. Do we really expect people to start making holiday plans around when a particular call might come out? Really? If we must have a September deadline, can we know about it in January? Or even earlier? To be fair, the ESRC has got much better with pre-call announcements of late, at least for very narrow schemes, but this really isn’t good enough.

I also have a recollection (backed up by a quick search through old emails, but not by documentary evidence) that last year the ESRC were talking about changing the scheme for this year, possibly with multiple deadlines or even going open call.  Surely, I remember thinking, this start-of-year madness can only be a one-off.

Apparently not.

News from the ESRC: International co-investigators and the Future Leaders Scheme

"They don't come over here, they take our co-investigator jobs..."

I’m still behind on my blogging – I owe the internet the second part of the impact series, and a book review I really must get round to writing. But I picked up an interesting nugget of information regarding the ESRC and international co-investigators that’s worthy of sharing and commenting upon.

ESRC communications send round an occasional email entitled ‘All the latest from the ESRC’, which is well worth subscribing to, and reading very carefully as often quite big announcements and changes are smuggled out in the small print.  In the latest version, for example, the headline news is the Annual Report (2011-12), while the announcement of the ESRC Future Leaders call for 2012 is only the fifth item down a list of funding opportunities.  To be fair, it was also announced on Twitter and perhaps elsewhere too, and perhaps the email has a wider audience than people like me.  But even so, it’s all a bit low key.

I’ve not got much to add to what I said last year about the Future Leaders Scheme other than to note with interest the lack of an outline stage this year, and the decision to ring fence some of the funding for very early career researchers – current doctoral students and those who have just passed their PhD.  Perhaps the ESRC are now more confident in institutions’ ability to regulate their own submission behaviour, and I can see this scheme being a real test of this.  I know at the University of Nottingham we’re taking all this very seriously indeed, and grant writing is now neither a sprint nor a marathon but more like a steeplechase, and my impression from the ARMA conference is that we’re far from alone in this.  Balancing ‘demand management’ with a desire to encourage applications is a topic for another blog post.  As is the effect of all these calls with early Autumn deadlines – I’d argue it’s much harder to demand manage over the summer months when applicants, reviewers, and research managers are likely to be away on holiday and/or researching.

Something else mentioned in the ESRC email is a light touch review of the ESRC’s international co-investigator policy. One of the findings was that

“…grant applications with international co-investigators are nearly twice as likely to be successful in responsive mode competitions as those without, strengthening the argument that international cooperation delivers better research.”

This is very interesting indeed.  My first reaction is to wonder whether all of that greater success can be explained by higher quality, or whether the extra value for money offered has made a difference.  Outside of the various international co-operation/bilateral schemes, the ESRC would generally expect only to pay directly incurred research costs for ICo-Is, such as travel, subsistence, transcription, and research assistance.  It won’t normally pay for investigator time and will never pay overheads, which represents a substantial saving on naming a UK-based Co-I.

While the added value for money argument will generally go in favour of the application, there are circumstances where it might make it technically ineligible. When the ESRC abolished the small grants scheme and introduced the floor of £200k as the minimum to be applied for through the research grants scheme, the figure of £200k was considered to represent the minimum scale/scope/ambition that they were prepared to entertain. But a project with a UK Co-I may sneak in just over £200k and be eligible, yet an identical project with an ICo-I would not be eligible, as it would not have salary costs or overheads to bump up the cost. I did raise this with the ESRC a while back when I was supporting an application that would be ineligible under the new rules, but we managed to submit it before the final deadline for Small Grants. The issue did not arise for us then, but I’m sure it will arise (and probably already has) for others.

The ESRC has clarified the circumstances under which they will pay overseas co-investigator salary costs:

“….only in circumstances where payment of salaries is absolutely required for the research project to be conducted. For example, where the policy of the International Co-Investigator’s home institution requires researchers to obtain funding for their salaries for time spent on externally-funded research projects.

In instances where the research funding structure of the collaborating country is such that national research funding organisations equivalent to the ESRC do not normally provide salary costs, these costs will not be considered. Alternative arrangements to secure researcher time, such as teaching replacement costs, will be considered where these are required by the co-investigator’s home institution.”

This all seems fairly sensible, and would allow the participation of researchers involved in Institutes where they’re expected to bring in their own salary, and those where there isn’t a substantial research time allocation that could be straightforwardly used for the project.

While it would clearly be inadvisable to add on an ICo-I in the hope of boosting chances of success or for value for money alone, it’s good to know that applications with ICo-Is are doing well with the ESRC even outside of the formal collaborative schemes, and that we shouldn’t shy away from looking abroad for the very best people to work with.   Few would argue with the ESRC’s contention that

[m]any major issues requiring research evidence (eg the global economic crisis, climate change, security etc.) are international in scope, and therefore must be addressed with a global research response.