Artificial intelligence, profit-hungry algorithms, bot armies, and of course the age-old human tendency to favor fiction over fact – these have intertwined to present humanity with one giant headache, as it strives to discern truth from falsehood.
Social media is now widely regarded as a major culprit when it comes to fueling the spread of disinformation, conspiracy theories, and fake news, with the Maui wildfire disaster in August shedding glaring light on the trend.
But Twitter, Facebook, and other social media platforms are not the only contributing factors. They aren’t even the only big tech players responsible: we live in an age where Google has us believing we’re all experts, as presciently noted by tech blogger Nicholas Carr years ago.
All too often nowadays, genuine experts are dismissed or even derided in favor of far more sinister amateur sleuths and other social media influencers peddling lies, or at best half-truths, for attention, soft power, or money.
Not only that, but now tech proper has entered the game, with bots being used to scale up fake posting on dummy accounts and create the illusion of mass followings and other veneers of legitimacy, while artificial intelligence is increasingly used to generate convincing deepfakes and narratives to underpin conspiracy theories.
And don’t expect much help from Facebook or Google: they are only too happy to keep algorithms that rank fiction above fact ticking over. The reason is painfully obvious: placing popularity over accuracy equals profit, because doing so reflects the desires of internet users and therefore gives Big Tech even more access to their data.
To put it another way, most people buy into conspiracy theories because they find them more thrilling than dull workaday facts, meaning more of their data can be hoovered up and sold as they click eagerly on the latest digital distortion of the truth. Once again the Big Tech model proves a disappointment, as greed trumps knowledge.
This is the smorgasbord of chaos we attempted to digest in our latest instalment of Through A Glass Darkly, where we do our best to unpick complex tech issues. In this episode we discussed the following concepts around the internet and conspiracy theories:
- Which came first, the chicken or the egg: are people drawn to conspiracy theories because they spend more time on social media, or are they predisposed by their own distrust of official sources to seek out conspiracies on platforms like Twitter where they are more widely available?
- Terms like “conspiracy mindset”, “conspiracy thinking”, and “conspiracy mentality” have entered the vocabulary of academics as they try to pinpoint what gives a conspiracy theory, be it the Maui fires and Oprah Winfrey or QAnon and Pizzagate, its persistence and popularity in the modern era.
- The concept of the “illusion of knowledge”: how the Google era has so many of us kidding ourselves into thinking that we are experts when we aren’t, and how this is manna from heaven for the conspiracy theorist or propagandist.
- Bots vs. humans: which is the more effective way of spreading propaganda, conspiracies and other falsehoods, and why? We also touch on the so-called hybrid model, where human trolls are marshalled into so-called factories in other countries, paid minimum wage, and taught how to deploy bot accounts on social media to divide and disrupt people.
- The entry of AI into this sphere: will it eventually replace human disseminators of conspiracy theories altogether? And what can be done to stop it augmenting bot and human efforts to spread disinformation?
- Whether and how to regulate Big Tech to stop this phenomenon wrecking our so-called digital town square (and possibly our democracies too). And more to the point: who precisely gets to do the regulating? Will we be able to elect this body, and how answerable will it be to us?
As one might guess from the above list, this was one of our more unsettling discussions. Trying to debunk conspiracy theories is harder than some might think: simply presenting someone with a fact-based article or video offering, say, a scientific explanation for an optical illusion that made the Maui fire look like a ray gun from space won’t necessarily work.
The reason lies in something called “anchoring bias” or the “primacy effect”: the first piece of information a human being is given about a situation or event becomes the anchor, or foundation, for all their subsequent opinions on the topic. Leaving tech aside briefly, this has already been observed exerting an unwelcome effect on juries in court cases and the like.
Essentially what it means is that when it comes to winning somebody over to your view of events or line of argument, it’s all too often a case of “first come, first served,” regardless of facts or expertise.
We suggest that this also appears to feed into a deeper problem of perception, namely that trusting an expert is increasingly coming to be viewed as being in tension with democracy: in today’s world, everyone’s opinion should count equally, shouldn’t it?
The problem with this feelgood take on equity in knowledge is that it simply doesn’t add up in reality: for instance, as an armchair fan of tennis, I’m likely to know far less about how to play a good match than a professional coach with 20 years of experience under their belt. But to point this out risks being dubbed an elitist and, by implication, one of “them,” serving hidden powers to keep people from the truth.
This disposition of skepticism and mistrust of authority can of course be laudable, even healthy: governments and the so-called establishment can and do lie or cover things up when it suits them to do so. But transplanted to the high-tech world of the 21st century, the phenomenon of conspiracy thinking, we try to argue, paves the way for some far darker masters to take control of our lives.
Putting it another way once again, what we increasingly have in place of experts are self-appointed tech gods like Elon Musk and Mark Zuckerberg, demanding they be left by governments to self-regulate as they pursue their profit-based mission with a vengeance, truth and knowledge be damned.
That’s the top of the paradigm, but follow it down to its base and you’ll see something perhaps even more unpalatable: a legion of self-appointed amateur sleuths, wannabe pundits, and zealous propagandists – who may well be using bot accounts themselves, sometimes referred to as “sticky followers,” on platforms like Twitter to create the illusion of expertise and authenticity. Groupthink leveraged by way of digital manipulation, if you will.
What this adds up to is vanity – remember Al Pacino’s devil character dubbed it his favorite of all sins way back in 1997? – on a mass scale; as scientists, engineers, historians, and other researchers who have quietly worked hard for decades to master their field are overlooked in favor of influencers and other digital blowhards who probably wouldn’t have looked out of place thumping a tub at a rally in Nuremberg in the 1930s.
Alright, so maybe we didn’t get quite that strident during our podcast – but hopefully this latest 50-minute instalment will give folks something to ponder. And cause to think twice before immediately believing the deepest, darkest story that gets fed to them online.
What does “through a glass darkly” mean?
While our primary goal is to maintain objectivity, we acknowledge our inherent humanity as we strive to provide our readers, viewers, and now listeners with a comprehensive understanding of the ever-expanding cyber landscape. This is precisely why we chose the name for our podcast, “Through a Glass Darkly,” drawing inspiration from the biblical expression used by the Apostle Paul, signifying a limited clarity when it comes to envisioning the future.
Our discussions often involve speculation about what lies ahead, eliciting both excitement and trepidation regarding the tech evolution or revolution. As we maintain a strong emphasis on cybersecurity, we find ourselves naturally inclined toward a somewhat "doomsday" perspective, perceiving the world through lenses shaded in darkness rather than rose-tinted hues.