Social Media Engineering

A previous article, the second in last year’s series, spoke about social engineering as a way to manipulate individuals into disclosing information which would benefit an attacker.

Recent events will have been hard to miss, and most readers will be aware of some of the conspiracy theories, misinformation, and polarisation currently spreading. While this is partly down to the way certain social media platforms are designed to amplify division, some of it is down to deliberate recruitment techniques.

These techniques are universal, and are often used to recruit people into personality cults, multi-level marketing schemes, conspiracy theories, and various other radical groups. They are low-cost, low-effort, and incredibly effective – even more so in an environment where most social interaction has moved online and face-to-face contact is discouraged. They are used to help spread conspiracy theories which target, and encourage action against, critical national infrastructure and high-profile individuals.

To take just one example, the conspiracy theories around 5G were quickly re-tailored to incorporate Bill Gates and George Soros as targets, and were effective enough to prompt arson attacks against 5G infrastructure. In other words, a few memes and videos, combined with these techniques (whether applied deliberately or accidentally, though there is strong evidence of deliberate action), were enough to cause homegrown terrorism attacks on UK soil.

While not part of what might normally be considered cyber security, social engineering is usually included, and many of these techniques exploit botnets and catfish accounts (fake social media accounts designed to present an appealing persona in order to infiltrate social networks) to encourage their spread. This technique, and others like it, shows the weaponisation of social media.

Recruitment Steps
Classically, recruitment techniques used by cults and similar groups rely on four basic steps. We’ll do a quick run-through of these before looking at how they can be modified to thrive on social media platforms.

  • Target selection – the right target is often someone who has suffered a crisis. Any identity-threatening crisis will do – grief, a significant life change, a global pandemic – as people often reassess their identity following significant change. Graduates, and recent arrivals at universities, are often favourites, as these are very public, very obvious events which prompt identity reassessment.

  • Love bombing – simple and straightforward positive reinforcement. The recruiting group showers the target with compliments and support, simply being nice. This support is unconditional, with no caveats or constructive criticism, and when it works it positions the group as a supportive best friend.

  • Isolation – after the supportive position has been established, the target is isolated. This is one of the areas that adapts well to social media, and we’ll see why shortly. At this point, with the target free from outside influences, a whole slew of emotionally manipulative techniques can be applied to transition the group from supportive best friend to a core part of the target’s identity.

  • Maintaining control – this step most closely replicates abusive relationships, with the group simultaneously a source of support and terror. The threat of withdrawing support allows much more obvious use of abusive tactics, and by this point, if the earlier steps have been successful, the target will have started to think of themselves as a member of the group, as a core part of their identity. Breaking out, as with any abusive relationship, is difficult, painful, and usually requires outside help. We’ll see how this plays into social media engineering as well.

Translating to Social Media
In person these techniques are clearly recognisable to anyone familiar with this sort of recruitment. Even some less-ethical sales books promote similar methods (multi-level marketing companies are founded on this sort of psychological manipulation). They can be less obvious once they’ve moved to social media, where the steps are somewhat different.

Target Selection
With the breadth of social media, targets often self-select. A few years ago it was established that 64% of people joining extremist groups of all kinds on Facebook came to them through recommendations from the Facebook algorithm. You may have come across the term ‘dog-whistle politics’, meaning phrases or ideas that seem innocuous but carry a greater meaning for a particular audience (if you read the article on steganography, dog-whistle politics is a form of steganography making use of what’s known as a context code).

By interacting with the post, whether responding to the hidden message or not, someone marks themselves out as a potential target. Because these techniques require a much lower investment of time and effort on social media than in person, a much wider net can be cast, and so targets can be selected with less (or no) care beyond clicking ‘Like’.

Love Bombing, Isolation and Maintaining Control
The social media approach combines love bombing and isolation into a single step. Whether it’s through a dog-whistle post of some kind (often a meme), through sharing news from extremist or unreliable sources or groups, or through some other mechanism, the aim is to get the target to post something controversial themselves. At this point, people who are aware of the dog-whistle context and disagree with it will often react in a knee-jerk fashion with shock and anger – not only because they disagree with the view, but because of the psychological principle that everyone considers themselves the centre of a story, so a post that is ideologically opposed to a person feels like a personal betrayal.

Of course that anger leads to a reaction from the poster, who feels attacked because they meant no harm. Compounding this, the group who originally created the innocuous post will jump into the conversation, giving unconditional support and acclaim, while friends and family are more likely to react with negative emotions. This is devastatingly effective: at its most shallow, social media is built for instant, knee-jerk responses rather than nuanced discussion. Emotionally weighted reactions can very quickly alienate a target from their friends and family, while simultaneously giving them a supportive new network within the radical group.

This cycle repeats as the target tends to share ever more controversial posts – both as they are drawn into the group and to upset former friends and family in revenge for the perceived attacks. From here it merges seamlessly into the maintaining control stage: as a person is rejected by their wider networks for extreme views, they are driven further towards those views.

Countermeasures
The best countermeasure is awareness and education on how these techniques work. People do not enjoy the idea that they are being manipulated, and effective education works. Once a target has begun re-identifying as part of one of these groups the problem becomes much more challenging and complex, and while deradicalisation programmes exist they are very much still in their infancy.

By: James Bore

James Bore is a jack of all trades in cyber security by vocation and choice. He has been speaking and writing to build awareness of cyber security for several years, and now provides consultancy to companies looking to mature their own security to better protect their customers and employees. His hobbies include lockpicking, beekeeping, and homebrewing, and in occasional spare time he maintains a blog at

The post Social Media Engineering appeared first on Circuit Magazine.

