Seems like magic, right? This process of deciding whether an account is a managed troll.

The truth is, it isn’t, and it’s time we talked about the complexity of identifying the trolls in our social networks.


Firstly, a troll is not a bot.

A bot is an automated account programmed to respond to keywords or to retweet. It has no human involvement aside from the programmer, and it can be readily identified by many services, such as Botometer.
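For those who want to run that first check themselves, here is a minimal sketch using the OSoMe botometer Python client. The credentials and the handle are placeholders, and the exact fields in the response vary between Botometer API versions, so treat the score lookup as illustrative rather than definitive.

```python
# Minimal bot-check sketch using the OSoMe "botometer" Python client.
# All credentials and the handle below are placeholders.
import botometer

rapidapi_key = "YOUR_RAPIDAPI_KEY"  # placeholder
twitter_app_auth = {
    "consumer_key": "YOUR_CONSUMER_KEY",          # placeholder
    "consumer_secret": "YOUR_CONSUMER_SECRET",    # placeholder
    "access_token": "YOUR_ACCESS_TOKEN",          # placeholder
    "access_token_secret": "YOUR_ACCESS_TOKEN_SECRET",  # placeholder
}

bom = botometer.Botometer(wait_on_ratelimit=True,
                          rapidapi_key=rapidapi_key,
                          **twitter_app_auth)

# Returns a dictionary of bot-likelihood scores for the account;
# the exact keys depend on the API version, so inspect the whole result.
result = bom.check_account("@some_suspect_account")
print(result)
```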

A troll, however, is an account operated by a human and, consequently, will more than likely pass undetected. Many of the accounts I’ve been investigating over the last few days are troll accounts, not bots. Identifying just one is a laborious and complex business.

Thankfully, a case study popped up in my timeline this morning, responding to yesterday’s thread and article about Russia’s Brexit trolls.

So there I was, having a coffee and minding my own business when this tweet popped up. A really odd mishmash: an EU flag chopped from another image, a number combination in the name, horrific use of English, and a pro-Russian posture.

What struck me, however, was “Wot”. In almost a year of daily interaction with Scottish Twitter users, I can hand on heart say I’d never seen one of them use “Wot”.

This combination of factors prompted me to run a case study experiment and see what came back. Also because people have been asking for an easy way to identify trolls.

So, was this just an unpleasant individual, or was it a fake account?

As with everything troll related, it’s complicated.

First of all, we have no information on this person to go on, barring a Twitter alias. However, even starting with nothing always leads somewhere.

Through a handy piece of analytical software which integrates with Twitter’s API data, that one piece of information soon linked @Didgery77332nd not only to their current Twitter account but also to their previous one, @Didgery7733, and straight away it was possible to establish that they are one and the same through their mutual follows.

In fact, there are 388 Twitter accounts (accounting for followers and follows) linked to Didgery’s two personas in 537 ways.

So, yes, it’s the same person.
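The linking itself was done with commercial analytical software, but the underlying idea is simple enough to sketch: pull the follower and following lists for both personas, then count the accounts and connections they share. The handles below are the ones from the article; everything else, including the placeholder ID sets, is illustrative and assumes the networkx package.

```python
# Sketch of the linking idea: treat each persona's followers and follows as a
# small directed graph, then count the accounts and edges the two share.
# The ID sets are placeholders; in practice they would be exported via the
# Twitter API or an analytics tool.
import networkx as nx

followers_old = {"acct_a", "acct_b", "acct_c"}
follows_old = {"acct_b", "acct_d"}
followers_new = {"acct_b", "acct_c", "acct_e"}
follows_new = {"acct_d", "acct_f"}

def ego_graph(handle, followers, follows):
    """Directed star graph around one persona: followers point in, follows point out."""
    g = nx.DiGraph()
    g.add_edges_from((f, handle) for f in followers)
    g.add_edges_from((handle, f) for f in follows)
    return g

old_g = ego_graph("Didgery7733", followers_old, follows_old)
new_g = ego_graph("Didgery77332nd", followers_new, follows_new)

combined = nx.compose(old_g, new_g)
shared = (set(old_g.nodes) & set(new_g.nodes)) - {"Didgery7733", "Didgery77332nd"}

print(f"{len(shared)} accounts linked to both personas "
      f"across {combined.number_of_edges()} connections")
```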

What’s also interesting is the organic growth of the account over the two time periods. While a number of common connections remain, there have been substantial changes in follow patterns over both iterations.

Taking a closer look at the current persona, it was also quite clear that they tweet at considerable volume compared to their actual following – which is a pre-indicator of a troll account.
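That volume-versus-following observation boils down to a couple of simple ratios. A minimal sketch follows, with made-up numbers and no particular threshold implied; the real counts would come from the account’s profile metadata via the Twitter API.

```python
# Rough heuristic for "tweets at volume relative to following".
# The counts in the example are placeholders; real values come from the
# account's profile metadata (statuses_count, followers_count, created_at).
from datetime import datetime, timezone

def activity_ratios(statuses_count, followers_count, created_at):
    """Return tweets per day and tweets per follower for an account."""
    age_days = max((datetime.now(timezone.utc) - created_at).days, 1)
    tweets_per_day = statuses_count / age_days
    tweets_per_follower = statuses_count / max(followers_count, 1)
    return tweets_per_day, tweets_per_follower

# Example with placeholder numbers:
per_day, per_follower = activity_ratios(
    statuses_count=42_000,
    followers_count=180,
    created_at=datetime(2015, 6, 1, tzinfo=timezone.utc),
)
print(f"{per_day:.1f} tweets/day, {per_follower:.0f} tweets per follower")
```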

So, with the warning signs all there, a sample of the timeline starts to shape a picture of troll behaviour.

Smoo, as the account calls itself, doesn’t really talk about the things many others are discussing in Scotland, and this goes back to 2015, when the account was opened.

In the first iteration of Didgery, all appears more peaceful at first glance.

In fact, they used to be an even more overt retweet account for Russian memes and what we now know to be Russian and Russian-inspired disinformation – during the indyref period and up until they opened their new account.

So, there’s a pretty convincing case Didgery is indeed a Russian troll.

But, like most troll accounts, they are seeded with local information, which makes them very hard to distinguish. So, we can’t just stop there and say for sure that Didgery is a Russian troll.

In addition to “Wot,” however, a further language pattern in the account increases the likelihood that Didgery is not, in fact, a Scot.

Their use of the word “Way” to replace “With” is not Scottish. In fact, having consulted a broad spectrum of Twitter users from across Scotland, I can confirm “Way” would almost certainly not be used. Rather, “With” would be substituted with “Wae” or even “Wi”.

N.B. One of the ways troll farms study native languages is by watching local media to the target area and then applying phonetics when writing subsequent content.
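In practice, the “Wot”/“Way” observation amounts to scanning a timeline for phonetic spellings a genuine Scottish account would be unlikely to use, versus the forms it would. A toy sketch follows; the marker lists and sample tweets are illustrative only, and a word like “way” obviously needs context before it counts as suspicious.

```python
# Toy sketch of the language check described above: count how often a sample
# of tweets uses spellings a genuine Scottish account would be unlikely to use
# ("wot", "way" standing in for "with") versus the forms it would ("wae", "wi").
# Marker lists and the sample timeline are illustrative, not exhaustive.
import re
from collections import Counter

SUSPECT_MARKERS = {"wot", "way"}   # "way" is only suspicious as a stand-in for "with"
LOCAL_MARKERS = {"wae", "wi"}

def marker_counts(tweets):
    counts = Counter()
    for text in tweets:
        for token in re.findall(r"[a-z']+", text.lower()):
            if token in SUSPECT_MARKERS:
                counts["suspect"] += 1
            elif token in LOCAL_MARKERS:
                counts["local"] += 1
    return counts

# Placeholder timeline sample:
sample = ["Wot are you on about?", "Away way ye", "Ah'm no happy wi that"]
print(marker_counts(sample))  # Counter({'suspect': 2, 'local': 1})
```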

However, there was additional, localised material in the Didgery timeline, such as this picture and tweet, dated the 17th of January 2013.

Establishing whether this is legitimate proof of identity is again not straightforward, but rewarding nonetheless: a reverse image search shows the picture was lifted from a football forum, having been posted there several days earlier.
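The reverse image search itself runs through services such as Google Images or TinEye, but the core idea, checking whether a posted photo matches an earlier, independently published image, can be illustrated with a perceptual hash comparison. This is a sketch assuming the Pillow and imagehash packages; the file paths are hypothetical.

```python
# Perceptual-hash comparison as a stand-in for the reverse image search idea:
# a small Hamming distance between hashes suggests the same underlying image,
# even after resizing or recompression. File paths are hypothetical.
from PIL import Image
import imagehash

profile_photo = imagehash.phash(Image.open("didgery_profile_photo.jpg"))
forum_photo = imagehash.phash(Image.open("football_forum_post.jpg"))

distance = profile_photo - forum_photo  # Hamming distance between the hashes
verdict = "likely same image" if distance <= 8 else "probably different"
print(f"Hamming distance: {distance} ({verdict})")
```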

Identifying a troll account is, as you can see, incredibly difficult and often has to be based on the balance of probabilities.

In this case study, the account features a number of classic indicators, along with the right messaging and off-key language, combined with potentially cut-and-paste photographs.

On balance, at the higher end of the probability scale, it was a fair assessment that this account appears to be a foreign-based troll pushing Russian messaging.

But you have to make up your own mind; that’s the thrust of all this. And the danger.

When challenged, their reply was as follows:

The complexity of identifying troll accounts is a long overdue discussion and, as yet, Twitter has no reporting function which is specifically aimed at trolls. For now, they are mostly reported as spam or hacked or both.

As further food for thought, I’ll leave you with the tweet from Sputnik last night which has resulted in my timeline being filled with a number of accounts like Didgery ever since.

In the meantime, and in short: when it comes to trolls, you are going to have to make your own judgment calls until Twitter gets its act together.

https://twitter.com/J_amesp/status/929795968182116352

After this article was published, an alleged owner of the accounts approached a Scottish media outlet, which in turn informed Byline that the person who approached them is Scottish. Ignoring the context of this original article, the story was picked up by Russian state broadcaster RT and even the Russian Embassy as evidence that Russian trolls and troll farms do not exist.

Over the months which followed, thousands of accounts belonging to Russian state-sponsored troll farms, which had spread disinformation during the Trump election and the Brexit vote, were uncovered. A number of Russian nationals involved in these troll farms have since been indicted in the US, and Russia has been accused by multiple nations, including Britain, of engaging in hybrid information warfare.

Known disinformation actors and pro-Kremlin media outlets and commentators continue to misrepresent the content of this article with the aim of discrediting anyone who challenges pro-Kremlin disinformation and downplaying Russia’s ongoing information war.