Friday 25 February 2022

Do Not Feed The Trolls

"Don't feed the trolls. Nothing fuels them so much" - Oscar Wilde

The likelihood of Oscar Wilde ever having said that is extremely remote, but it's all over the Internet, which just proves that the Internet is rarely a reliable source of information. Nevertheless, whoever did say it (or make it up) is on to something.

When I woke up this morning and looked at my Facebook feed I saw two shares from left-leaning friends and acquaintances. One was some bullshit tweet from Laurence Fox and the other was a video, which of course I didn't watch, of Nigel Farage bloviating jovially about something or other. Of course, they'd both commented on them to say how ridiculous they are, but it didn't matter. They'd shared Fox and Farage for all their friends and contacts to see.

They'd done their work for them. It made me realise just how topical last night's Skeptics in the Pub - Online talk, 'How ideas clash on social media' with Imran Ahmed, had been. Although the Russian invasion of Ukraine earlier that morning may have been an even more important reminder. Not least bearing in mind that the likes of Trump and Farage, both essentially Internet trolls, have come out in support of Putin and against democracy.

Hosted by the estimable Michael Marshall, Ahmed (the founding CEO of the Center for Countering Digital Hate and a recognised authority on the social and psychological dynamics of social media) began his tour of the murky depths of Internet hate by recounting the six years he spent working at the UK Parliament (Ahmed was born in Manchester but now works in the US - hence Center, not Centre) for Hilary Benn, at that time the Shadow Foreign Secretary.

During this time, Ahmed noticed a huge rise in virulent antisemitism on the (hard) left and an equally large increase in Islamophobia on the (hard) right. If that wasn't worrying enough, on 16th June 2016, one week before the Brexit vote, his colleague Jo Cox, the Labour MP for Batley and Spen, was murdered by the far-right ideologue Thomas Mair.

As Mair shot and stabbed Cox to death, he shouted "Britain First" and "death to traitors". Not long before this, Ahmed had downplayed concerns about the news that the openly fascist Britain First had become the first political party in the UK to pass one million followers on Facebook. Like many of us, Ahmed had not seen how easily online hate bleeds over into real-world actions.

Something that seemed to have reached its apex with the Capitol insurrection in Washington DC on 6th January last year - but, more likely, hasn't yet. On social media, 'facts' become malleable. Amplification and visibility don't come from knowledge, as with other forms of media in the past, but from driving emotional reaction. When you hate-tweet an anti-vaxxer, or call them an idiot or a moron - as I have done in the past - you give them the whip hand.

Like a retraction of a lie printed in The Sun, our comments about these hatemongers are written in much smaller text than the lies and hate they push out there, and it is human nature for us to start giving more credence to things we see, or hear, more often. Trolls, bad-faith actors, and chaos merchants who get off on causing pain realised this pretty quickly - and they gamed it.

Best of all, it cost them nothing to do so. You can tweet for free and it doesn't matter what kind of response you get as long as you get a response. In many ways, being called an idiot, a moron or worse is better than being agreed with, because it drives further emotional responses, pushes them up in the algorithms, and thus raises their profile.

So, with that in mind, best to say something really transgressive. While attending a Klan rally in Portland in 2020 (for work, not as a participant), Ahmed noticed a suggestion going round that Klan members should try to catch the then-new disease of Covid and then go and spit on a black person.

Covid, of course, was a gift to trolls and spreaders of hate. Research carried out by Ahmed and his team revealed that a small group of leading anti-vaxxers had decided to take up that position before any vaccines had even emerged - for a complicated set of reasons, but one of them being that anti-vax would be divisive and would therefore raise their profile.

They would feel heard in a large and confusing world, perhaps, if you want to be sympathetic. The giant social media companies were urged to take action on this, and many other issues, and of course, small tweaks aside, they failed. These companies have proven, consistently, incapable of getting their houses in order.

To take action costs money. To do nothing is free - and even results in increased profits. Imran Ahmed's idea was that social media companies should, instead of having rules, have rights that protect their users, specifically those that are on the receiving end of online abuse. It seems a drastic yet workable solution to a situation where we have created spaces that are unsafe for large sections of society.

Silicon Valley dudes like to present themselves as the good guys and many of us want to believe they are - but, like most other 'hippies', we know they're not really. The Center for Countering Digital Hate (CCDH) ran a little experiment into the algorithms Instagram uses and made some quite disturbing, though depressingly unsurprising, findings.

In the past, after a certain amount of time scrolling Instagram you'd reach a message that read "you've caught up". That wasn't driving more interaction and therefore wasn't driving profits. So it was changed: now you receive 'recommendations', and those 'recommendations' make Instagram an awful lot of money.

But what are they? CCDH set up an Instagram account that expressed an interest in 'wellness'. The recommendations that came up were for anti-vax accounts, the anti-vax accounts led them to antisemitic accounts, and the antisemitic accounts led them to QAnon accounts. It was a downward spiral into the world of conspiracy theories and online hate and though many, of course, can see this immediately and resist, others are pulled in.

Once in a world of conspiracy theories, you'll find there are a lot of things that still don't make sense (no shit!), a lot of holes in the theory. Again, some will pull themselves out but many others will, instead, go deeper into these fringe ideologies and theories, many of them hateful, most of them demonstrably untrue, and some of them completely fucking batshit.

This has always happened but in the past you had to work at getting yourself into these worlds or have a supposed friend lead you there. Now, Instagram does that for you. The social media giants like to say they're a mirror to society but, if they are, it's a highly distorted mirror that belongs more in a fairground or a freak show than anywhere else.

Frances Haugen, the American data engineer and Facebook whistleblower, corroborated CCDH's findings and confirmed to them that if the algorithm was changed to make it safer for users then people would spend less time on the platforms - and, bottom line, the platforms would lose money.


In the 1970s, in the US, many Ford Pinto cars were discovered to have faulty fuel tanks that were causing deadly fires. Initially, the Ford Motor Company began to recall Pintos, before a study found that it was cheaper for them to pay compensation to dead victims' families than to recall ALL the cars - so they did that instead.

While it seems a stretch to say that Twitter, Facebook, Instagram etc. would rather their customers die than see a dip in profits, it's, sadly, not that much of a stretch. Social media algorithms have been weaponised by antisemites, Islamophobes, and, most successfully of all, the Russian state to drive hate and division and to try to roll back the advances we have made in science and in tolerance over the last decades and centuries.

Almost to overthrow the ideas of the Enlightenment. Which makes Internet trolls fellow travellers with ISIS or the Taliban. All of this, of course, is unremittingly bleak but Imran Ahmed didn't want his talk to be of the "we're all going to Hell on a handcart" nature so he did offer what he felt were possible solutions to this problem.

Or at least ways we can change and hope to inspire others to do so as well. Primarily, resist interacting with liars, bad actors, and those we deem 'idiots' but who seem to be able to use the Internet more effectively than us. Stop feeding the trolls.

At the moment our interactions with these people are almost always counterproductive. Fact checking is of limited use because these people don't care about facts. It's so obviously true that the world isn't run by a cabal of lizard people that we shouldn't have to waste time arguing with people over it. The trick of the trolls is to throw so many lies out there that you could never possibly fact check them all.

You simply wouldn't have enough time. There will always be 'polluters' on the Internet. The trick is not to give them a larger platform. Instead, amplify the words, posts, and tweets of the good guys. Share their posts, not the baddies'.

Another thing that would help is if the social media behemoths were forced into having greater transparency. Not just into their regulations but into how those regulations are enforced as well as into their algorithms and their economics.

One last idea is one we could all do with trying: humility. Stop calling other people stupid. Ahmed did a little test and asked what vegetable helps our night vision. Most people would think 'carrot'. But that's not true at all. During World War II, the RAF propagated the myth to explain why their pilots had improved success during night air battles - far better than telling the enemy about recent advances in radar technology.


You may have already known that. You may not. Either way, you're not stupid. If you see someone spreading a lie, ignore it, block them, report them - but don't share it and don't argue with them. Ahmed's talk had been fascinating, and a Q&A that took in the likes of Trump, Putin, RFK Jr, Katie Hopkins, Nick Clegg, Rio Ferdinand, and Sacha Baron Cohen (I think you can work out who the good guys are amongst that lot), as well as YouTube, TikTok, and Telegram, proved both amiable and educational.

One interesting fact that emerged was that, until very recently, Facebook did not employ a single moderator fluent in the Ukrainian language. This sort of oversight, an attempt to cut costs, could lead to all manner of disinformation being spread - and who knows where that might lead us. To war?




