LoveBomb Part 4 – A Special Investigation

Imagine a war fought in cyberspace. A digital conflict designed to behave like a computer virus, one which attacks the mind.

Malware for humans.

This is not fiction. Not a future dystopia. This is reality. Our reality of today…

This is part four of LoveBomb, a special investigation* into weaponised data and disinformation, social media, and counter offensives. It is recommended you read the previous instalments before this one.

*The format is, in and of itself, experimental and includes multiple formats, which will be refined over the course of the investigation.

Previously:

Part 1.

Part 2.

Part 3.

Part 4:

https://youtu.be/gLAroHJNKkc

One of the things I’ve been looking at closely is the way we fight back, the way we’ve been targeted on social media, and why that targeting is so effective.

We now understand how you can be profiled without big data. We understand how powerful such a portrait could be if all of your personal data was gathered.

But how do trolls work?

Whether they are artificial intelligence or human-managed accounts, trolls function as a signal amplifier, helping to push a disinformation message.

One of the things I’ve discovered is the psychology of using figures who look like you, or like figures you respect. The other aspect is how trolls harness the power of hashtags to spread disinformation.

Troll 101:

The truth is, trolling is an industry and fast becoming an exact science. An economy all of its own.

It was only in March 2017, after it was too late, that the ranking Democratic member of the House Intelligence Committee, Adam Schiff, told CNN the committee was investigating whether the Donald Trump campaign coordinated with the Russians to spread “fake news” through trolls and “bots” online and sway the election.

“We are certainly investigating how the Russians used paid media trolls and bots, how they used their RT propaganda platform to disseminate information, to potentially raise stories, some real some not so real, to the top of people’s social media,” Schiff said.

In many ways, a little historical digging makes sense not only of the bots, but also of a lot of the alternative outlets spewing conspiracy theories.

The Russian state was sponsoring ‘Web Brigades’ as far back as the 1990s, paying around 80 rubles a comment for people to spam the internet with false information, not to convince people but to confuse them. To create distrust in all media. It was also paying high-profile bloggers, which has made me think about sites like Info Wars and Prison Planet in an even darker light.

The troll army is quite real and active.

Trend Micro has also released an incredibly detailed research paper on Fake News and how it can be spread by these deniable assets through the cash-driven assistance of an underground network of privateers.

The report makes no bones about explaining exactly how useful disinformation is, and how it works within the same sphere of influence as the big data and psychometric tactics utilised by companies like Cambridge Analytica. By “manipulating the balance of how a particular topic is reported (whether that concerns politics, foreign affairs, or something more commercial),” the report says, “the views on that topic can be changed. This can be done either with inaccurate facts or with accurate ones twisted to favor a particular view or side.”

According to the analysts at Trend Micro, social media posts also have to attract the target readers of their operation. “To do this,” Trend says, “the fake news posts are crafted to appeal to its readers’ psychological desires—confirming biases, the hierarchy of needs, etc.”

One of their latest research papers details the scale of Russian penetration into Western democracy via cyber attacks, and it leaves little doubt that we are in deep trouble and were caught looking the other way.

We also know, from investigations by The Times and through Russian whistleblowers, that trolls have been deployed masquerading as American and UK citizens, driving the debates around Trump and Brexit.

But what does this look like in practice? A lot like this. Tens of thousands of times over.

https://youtu.be/SEy8mE50yXU

The Hamilton Score:

Using the principles of Network Centrality (see Part 1), trolls spread disinformation across our networks, even where we don’t intend to help them. They don’t even need many followers to function because, aside from existing in their thousands, a message can be spread with a very small “Organic Reach” if it is targeted at the right audience.

Organic Reach is the number of people who would naturally be exposed to whatever message you were putting out, such as this – which also uses some of the marketing and psychographic techniques discussed in Part 2.

https://twitter.com/LibertySTRATCOM/status/915244418319421441

However, when you start to use hashtags, Organic Reach becomes something else entirely.

Hashtags work as an amplifier for trolls by bundling their efforts together in a way which enhances content popularity within the whole network they are targeting.

The true meaning of this is horrifying. It means the actions of a single troll account, which would otherwise be visible only in limited circumstances, are seen across an entire network. In turn, this harnesses the full power of Network Centrality, amplifying the message again.
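To make the mechanic concrete, here is a minimal sketch in Python. Every number in it (the follower count, the size of the hashtag audience, the visibility rate) is hypothetical and chosen purely for illustration; the point is the structural difference between follower-only visibility and hashtag-wide visibility.

```python
# Minimal sketch of hashtag amplification. All figures are hypothetical,
# invented for illustration only.
#
# Without a hashtag, a troll's tweet is visible mainly to its own followers.
# With a trending hashtag, it also surfaces to everyone browsing that hashtag,
# whether or not they follow the account.

followers = 150                # hypothetical follower count of one troll account
hashtag_audience = 40_000      # hypothetical users browsing a trending hashtag
visibility_rate = 0.02         # assumed fraction of that audience who see any one tweet

organic_reach = followers
hashtag_reach = followers + int(hashtag_audience * visibility_rate)

print(f"Organic reach: {organic_reach} impressions")            # 150
print(f"Hashtag reach: {hashtag_reach} impressions")            # 950
print(f"Amplification: x{hashtag_reach / organic_reach:.1f}")   # x6.3
```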

On Facebook, this is the reason five Russian accounts were able to reach millions of people.

The Hamilton Dashboard monitors a selection of a few hundred Russian troll accounts, tracking what they are tweeting about and which hashtags they are using.

By using Hamilton trends in the right way, at the right time, a low-follower account like @LibertySTRATCOM can take an Organic Reach of 387 impressions and 26 engagements and amplify its message so that it reaches 1,238 impressions and 82 engagements.

Repeating this test a few times has allowed a basic calculation to be made. I have called this the Hamilton Score.

Using their real-time analysis of trending hashtags, each troll account multiplies its Organic Reach by at least three.
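The multiplier can be checked directly against the figures reported above. A short sketch, using the @LibertySTRATCOM numbers from this section:

```python
# The Hamilton Score as defined above: the multiplier between an account's
# baseline Organic Reach and its reach when riding a Hamilton-tracked hashtag.
# Figures are the ones reported in the text for @LibertySTRATCOM.

organic_impressions, organic_engagements = 387, 26
boosted_impressions, boosted_engagements = 1238, 82

impression_multiplier = boosted_impressions / organic_impressions   # ~3.20
engagement_multiplier = boosted_engagements / organic_engagements   # ~3.15

print(f"Impressions: x{impression_multiplier:.2f}")
print(f"Engagements: x{engagement_multiplier:.2f}")

# Both multipliers come out a little above three, which is where the
# conservative "multiplies its Organic Reach by at least three" comes from.
```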

https://youtu.be/e8KBvDkfveA

The True Value of Trolls:

Taking France alone as an example, where 30,000 fake Facebook accounts – trolls – were deactivated during the Presidential race, the true value of Russia’s troll army starts to become clear.

If we were to hypothesise that Twitter hosts 30,000 co-ordinated troll accounts, each with an Organic Reach of 100 impressions and 5 engagements per tweet, this gives us a total Organic Reach of 3,000,000 impressions and 150,000 engagements on each single message they push.

Applying a Hamilton Score to this means a troll army of 30,000 can, in fact, potentially reach 9,000,000 impressions and 450,000 engagements with every single message they tweet.

Trolls are not just tweeting once a day, but at least five times an hour.

Over twenty-four hours, an army of trolls 30,000 strong could reach across a network and generate 1.08 billion impressions and 54 million engagements.
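For anyone who wants to verify the arithmetic, the full chain can be laid out as a short sketch, using the hypothetical figures stated above:

```python
# Reproducing the troll-army arithmetic above, step by step.
# All inputs are this article's hypothetical figures.

accounts = 30_000              # hypothesised co-ordinated troll accounts
impressions_per_tweet = 100    # assumed Organic Reach per account, per tweet
engagements_per_tweet = 5
hamilton_score = 3             # amplification from trending hashtags (see above)
tweets_per_day = 5 * 24        # at least five tweets an hour, around the clock

organic_impressions = accounts * impressions_per_tweet         # 3,000,000
organic_engagements = accounts * engagements_per_tweet         # 150,000

amplified_impressions = organic_impressions * hamilton_score   # 9,000,000
amplified_engagements = organic_engagements * hamilton_score   # 450,000

daily_impressions = amplified_impressions * tweets_per_day     # 1,080,000,000
daily_engagements = amplified_engagements * tweets_per_day     # 54,000,000

print(f"Per message: {amplified_impressions:,} impressions, {amplified_engagements:,} engagements")
print(f"Per day:     {daily_impressions:,} impressions, {daily_engagements:,} engagements")
```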

Now imagine this army is 60,000 trolls. 100,000 trolls. 500,000.

Then apply to this our understanding of the power of Network Centrality and Psychographics. 

This is how disinformation is being pushed. This is how the topics trolls spread have entered daily conversation, breaking beyond the boundaries of social networks and into everyday discourse.

This is the digital blitzkrieg.

Even in this, however, there’s hope…

Next In Series:

The next articles in this series look at the design of benign information using the profile information gathered on disinformation actors, and explore how Network Centrality can be used in an ethical campaign against them. The series delves into Paid Reach, with explosive results.

You can support this series and the ongoing work behind it here.