Troll spotting – a beginner’s guide

Chris reflects on a swelteringly hot workshop in London over the summer helping researchers make the most of social media. Here he takes one aspect of the workshop, dealing with trolls online, and highlights some possible reasons it happens and steps to mitigate it.

black and white picture of a troll emerging from the ground

Last week I was in the virtual oven that was East London delivering a workshop to a lovely group from UEL's Institute of Health and Human Development – a minor miracle that we got through the day considering I wilt like spinach in the heat!

I was down to help them with their use of social media. It's a new workshop designed just for them, but on reflection it would suit any team looking to enhance its use of social media for things like public engagement and collaboration. As such we think it would help research teams, libraries, student welfare teams and so on. I'm not going to go through it in detail, but I've put an example agenda at the end of the post – something we could adapt to your situation.

What I did want to break down was the short section on dealing with online trolls or any form of abusive online behaviour. It wasn’t a comprehensive guide about how to deal with negative online behaviour, more a facilitated discussion to answer two key questions:

  • Is it possible to identify what things are likely to attract trolling?
  • What can you do to prepare to reduce its likelihood and what can you do to respond to mitigate its impact?

What might attract trolling behaviours?

The first thing is to try and address some assumptions. Work out where the line lies between robust academic debate and actual, abusive trolling. It's not always a clear distinction; sometimes it's to do with the intention of the person posting, sometimes with the valid interpretation of the person receiving the communication.

It's also important to recognise that social media isn't a homogeneous safe space. As a straight, white, middle class, affluent male I pretty much tick all the privilege boxes, so social media has never been a truly threatening place for me. If, however, your identity doesn't fall into those categories, life online can be much harder.

That's important when looking at social media strategy and dealing with abuse, because it tends to come in two flavours: abuse about identity and abuse about content. Some groups suffer disproportionately because of their identity; look at the experiences of Gina Miller or Mary Beard as a starter.

There's crossover between abuse over identity and abuse over content. Gina Miller is deeply engaged in political debate, which makes her a public figure, but you only have to look at some of the most egregious abuse she suffers to see that the fact she is a woman of colour is the main reason she has become such a target.

From the content angle, if we'd had more time in the workshop I would have worked with the team to identify the topics and keywords they were likely to be posting about and engaging with online. There's a follow-up exercise to identify which of those topics might get negative responses – not as a way of identifying topics to avoid, but so you can take any necessary steps beforehand to mitigate. One thing I do outside of work is support a local campaign group for pedestrians and cyclists. Cycling especially is a topic that seems to spark furious argument and abuse regardless of the identity of the person it's aimed at.

Preparing for and responding to trolling

I’m not going to share all of what the group said in this exercise as this was very much to do with their context but there are a few general nuggets worth sharing:

  • It makes a big difference if you use an institutional, anonymous account to engage with topics online, particularly if you feel that using your own personal identity leaves you too vulnerable.
  • Set the tone. There’s a limit to how much you can influence behaviour but you can at least model the sort of behaviour you’d like to see, for example using data and reason in discussions and arguments.
  • Make the choices you feel comfortable with. You may have heard the phrase “don’t feed the trolls”, in other words ignore the negative behaviour as responding is likely to generate further ire. This is perfectly valid. There are other approaches though; see for example Reading University’s response to criticism of its scholarships for refugees a while back, which was pretty assertive and generated a lot of support from the community. The point is to choose what you feel most comfortable doing.

  • Consider your own responsibilities. If you feel confident on social media that’s fantastic, but remember that not everyone shares that confidence. If you’re working as a team, either on an institutional account or by combining personal ones, it’s important not to put pressure on people to behave in a way that isn’t authentically them. Likewise, it’s reasonable to push back if you are the one feeling under pressure to behave in a way that’s not you.
  • Don’t work in isolation, particularly if you’re feeling vulnerable. Back each other up, either offline or on. Listen to concerns and offer support when needed. Also, if you’re part of a larger organisation, chances are your comms team, for example, already has guidelines or offers support in this area. Seek it out.

One lesson I learned was that sometimes something that looks like trolling behaviour isn’t. One participant told the story of a member of the public who had been engaging with their research work but had ended up demonstrating behaviours suggestive of emotional reliance on the researchers. We didn’t go too deep into the example (I hope I’ve remembered it right!) and I’m no psychologist, but you can imagine situations where some vulnerable people may seek validation in online interaction and become reliant on it. It may not be abuse, but as an organisation, what would you do in that context to ensure the person is sensitively handled and ends up getting the help they need?

Is that it?

Absolutely not! Dealing with any sort of human behaviour is difficult, and when it’s online it becomes particularly complex, so not all of these suggestions will work in every context.

It would be great to hear examples of where you or your team have experienced this and what you did in response.

To see how this topic fitted into the whole workshop, check out the full session outline. If that looks like something you’d like us to come and do for you, talk to your Jisc account manager or email us directly at consultancy@jisc.ac.uk.

Image CC0 by Tama66 on Pixabay


By Chris Thomson

I'm a Subject Specialist at Jisc focusing on online learning and digital student experience.
