Businesses are attempting to meet the rising expectations of their customers by providing personalized experiences similar to those of Netflix, Spotify, and Amazon. But is everyone jumping on the personalization bandwagon a little too early? There is a growing argument that tech companies have built an entire ecosystem on inaccurate data.
I first became aware of this while waiting for a meeting to begin. Several attendees were comparing the step counts on their fitness trackers. One colleague shared a story of how he went for a long walk but forgot his fitness tracker. Because the walk wasn't recorded, he went out for an additional hour with the tracker to ensure the data was on par with the rest of his week's activities.
My colleague seemed oblivious to the fact that the information recorded on his fitness tracker was still inaccurate. Advertisers, however, are also trusting inaccurate data from social media platforms such as Facebook. It seems that users and businesses are being played by large tech companies in a sinister game that those companies created. But how did we get here?
How social media is corrupting human behavior and misleading advertisers
Imagine someone lurking in the shadows who knows where you are at all times and, using facial recognition, could identify you every time an image of you is posted online. What if that person made you feel worse about yourself, knew when you were sleeping, and sold that information to hundreds of advertising companies? Sounds creepy, right?
Over 2 billion people use Facebook products and access their Facebook account alone around eight times per day on average. That's before they even think about opening Instagram, Messenger, or WhatsApp. Essentially, a single company is harvesting data on everyone's browsing habits and communications.
When B.F. Skinner placed a hungry rat in his infamous Skinner box, he was able to create positive reinforcement by introducing variable rewards. The system has been likened to slot machines, which use a similar method to separate gamblers from their money by providing the occasional win. But in a digital age, time has become the new currency, and it's the only non-renewable resource we have.
You might be able to earn more cash, but you can't get back time. There is a war for our attention, and somewhere along the way, users have become the rats trapped inside a virtual Skinner box. According to a report leaked in 2017, Facebook told advertisers that its algorithms could pinpoint the moment when a 14-year-old needed a confidence boost. The platform claimed the ability to identify emotional states such as insecure, anxious, stressed, and feeling worthless.
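The variable-ratio schedule Skinner identified can be sketched in a few lines. This is a toy illustration with made-up parameters, not any platform's actual code: each action pays out with a fixed probability, so rewards arrive unpredictably, which is the schedule Skinner found most effective at sustaining behavior.

```python
import random

def variable_ratio_rewards(actions, mean_ratio=4, seed=7):
    """Simulate a variable-ratio reward schedule.

    Each action (a lever press, a pull-to-refresh, a new post) pays out
    with probability 1/mean_ratio, so roughly one action in `mean_ratio`
    is rewarded, but the subject can never predict which one.
    """
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(actions)]

outcomes = variable_ratio_rewards(100)
print(f"{sum(outcomes)} rewards across {len(outcomes)} actions")
```

It is the unpredictability of the payout, not its size, that keeps the subject pulling the lever.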
Why marketers shouldn’t trust social media data
The Facebook data model is built on a variable reward system that is changing our behavior. Much like the rats in a Skinner box, every click or swipe users make is unwittingly motivated by the desire to gain more likes or shares. Matt Mayberry of Dopamine Labs appeared to confirm this when he said, "it's common knowledge in the industry that Instagram exploits this craving by strategically withholding 'likes' from certain users."
A quick look at any social media timeline will reveal people desperately attempting to showcase themselves and their lives in the best possible light. Somewhere along the way, many have lost their authentic selves and become their own PR managers, promoting a self-built narrative. But this knowledge should make advertisers question their data.
Most people accept that social platforms are nothing more than a highlight reel of their best bits. For the most part, users will carefully keep the arguments, bad times, and guilty pleasures far away from their public profiles. Why? Because they have been conditioned to post only the type of content that delivers those variable rewards and another hit of dopamine.
The vast quantities of data that Facebook and Instagram sell to marketers and software vendors are not an accurate reflection of how users behave in the real world. Sure, the platform has the power to manipulate its users, but what value is it really providing advertisers? The entire ecosystem is built on the anticipation of variable rewards, delivered by users trying to beat the system rather than by their authentic behavior or interests.
Context is everything
AI-powered algorithms attempt to make sense of our data and recommend what to purchase, listen to, or watch next. But none of this data has any context. For example, if I listen to a cheesy '80s soft-rock classic several times, it's because it reminds me of a night out or a happy time in my life.
However, Spotify cannot compute why I love this dated tune and will assume that I love all music from the genre. This lack of context is the Achilles heel of the experience economy. Amazon doesn't know that I purchased a Frank Sinatra CD for a grandparent's birthday. In reality, that means the recommendation engine will try to upsell me the Songs for Swingin' Lovers album every day for a month.
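To see why context matters, consider a deliberately naive frequency-based recommender. The history below and its "reason" labels are invented for illustration; the point is that the recommender only ever sees the genre column, never the reason.

```python
from collections import Counter

def recommend_genre(events):
    """Recommend the user's most frequent genre.

    A naive frequency-based recommender: every event counts as an
    equally strong signal of taste, with no notion of *why* it
    happened (gift purchase, nostalgia, autoplay left running).
    """
    counts = Counter(genre for genre, _reason in events)
    return counts.most_common(1)[0][0]

# Hypothetical history; the reason column exists in real life
# but is invisible to the recommender.
history = [
    ("indie rock", "daily listening"),
    ("indie rock", "daily listening"),
    ("swing", "gift for a grandparent"),
    ("swing", "played once while wrapping the gift"),
    ("swing", "autoplay left running"),
]

print(recommend_genre(history))  # → 'swing', on the strength of one gift
```

Three context-free events outweigh the user's actual daily habit, which is exactly how a one-off birthday present turns into a month of Sinatra upsells.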
Equally, the moments I accidentally left Netflix or YouTube running, automatically playing additional content, would confuse those platforms' algorithms too. We are increasingly living inside social and cultural bubbles, and they are slowly dragging everyone towards the bland and the safe, ruled by a handful of famous artists and movie franchises.
The serendipitous moments where we discover something outside of our comfort zone are slowly being eradicated. Although users have the illusion of choice, the reality is that algorithms are keeping everyone in a loop of convenience and sameness. For these reasons alone, app developers shouldn’t rely on algorithms to understand user behavior.
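The loop of convenience and sameness is easy to reproduce with a toy popularity-based ranker (a sketch, not any real platform's ranking code): the system only surfaces the highest-scoring items, users can only click what is shown, and every click raises the clicked item's score.

```python
import random

def run_feedback_loop(initial_scores, rounds=50, shortlist=3, seed=42):
    """Simulate a popularity feedback loop.

    Each round, only the `shortlist` highest-scoring items are shown;
    the simulated user clicks one of them at random, raising its score.
    Items that start outside the shortlist are never shown, so they can
    never be discovered, however good they might be.
    """
    rng = random.Random(seed)
    scores = dict(initial_scores)
    for _ in range(rounds):
        shown = sorted(scores, key=scores.get, reverse=True)[:shortlist]
        scores[rng.choice(shown)] += 1
    return scores

start = {"pop hit": 5, "rock staple": 4, "indie single": 3,
         "jazz obscurity": 2, "folk debut": 1}
print(run_feedback_loop(start))
# The bottom two items never gain a single click.
```

Because the rich get richer on every round, the two lowest-ranked items finish exactly where they started: the illusion of choice, implemented in a dozen lines.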
“The consumer didn’t want Jimi Hendrix, but they got him. And he changed the world. And the consumers didn’t want Sgt. Pepper’s, but they got it, and they didn’t want the Sex Pistols, but they got it. And now there’s an attitude in the music business of ‘let’s keep the consumer happy because that’s what makes the music business go ’round.’” – Noel Gallagher
Why personalization shouldn’t be built on inaccurate data
The combination of missing context and users sharing only their best bits can quickly lead to upsetting results. Gillian Brockell of the Washington Post highlighted the seriousness of this when she wrote an open letter to Facebook, Twitter, and Instagram, asking the tech giants to stop bombarding her with ads for baby products after she had a stillbirth.
“If your algorithms are smart enough to realize that I was pregnant, or that I’ve given birth, then surely they can be smart enough to realize that my baby died, and advertise to me accordingly — or maybe, just maybe, not at all” – Gillian Brockell.
Susan Bidel, a senior analyst at Forrester Research, said there is a common belief in the industry that only “50% of this data is accurate”. The more significant problems arise when we look beyond how inaccurate data feeds recommendation engines about what we want to purchase: it can also become discriminatory.
There are many examples of how targeted advertising can be incredibly distressing for those dealing with miscarriage, infertility, or pregnancy problems. It’s not uncommon for users who are mourning the death of a family member or friend to be reminded to wish them a happy birthday.
Equally, targeting someone with Valentine’s Day or honeymoon offers a few days after they call off their wedding will reflect badly on the company placing the ad. Marketers who blindly trust algorithms built on highlight reels are setting themselves up for failure and unwanted negative press.
Why marketers should be wary of an algorithmic worldview
Both users and marketers have been played by the tech giants of Silicon Valley in an experiment based on variable rewards that got out of control. The end result was a reality tunnel in which confirmation bias has become a digital instinct.
When users find a news story that fits their worldview, they instantly hit the like, retweet, or share button. But if they come across something that challenges their viewpoint, it’s easy to discard it and label it fake news to maintain cognitive consistency. This is just one reason why many find it so easy to instantly accept information that aligns with their existing beliefs.
By contrast, information from the opposite side of the spectrum will require many contrary facts before they even think about changing their minds. But how do marketers know which information is correct, or whether their target audience is being manipulated?
Advertisers are in danger of treating data riddled with inaccuracies and bias as fact. The reality is that life is a beautifully subjective experience, and both users and marketers have a responsibility not to compromise their decisions to fit more comfortably inside a digital world run by algorithms and recommendation engines.
Every user has unique little foibles that their favorite platforms will never understand, and there is something quite beautiful about that in an increasingly digital world. If marketers genuinely want to understand their audience, they need to scratch beneath the surface and get to know the real person behind the social media profile. Despite what Facebook will tell you, people are much more complex than algorithms.