AI is an opportunity for humans rather than machines, author insists
The information revolution is actively changing the way we live – and not always for the better. But if you’re rushing to blame artificial intelligence (AI), think twice and look in the mirror: that’s what a business psychology professor is advising in his new book.
The craze over generative AI tools such as ChatGPT has spawned a raft of enthusiastic predictions about how this will change our world for good – and for the better. Everything will be possible and much easier at a single click or prompt, some say.
Others, like songwriter Nick Cave, have expressed concern. If AI is evolving at such breakneck speed, we’ll soon be able to do so many more things – but maybe, just maybe, one day AI will simply take over and we, ordinary humans, will not be part of the revolution.
Not so fast, Tomas Chamorro-Premuzic, professor of business psychology and chief innovation officer at ManpowerGroup, says in his latest book, I, Human: AI, Automation, and the Quest to Reclaim What Makes Us Unique.
There is too much internet and too many algorithms, he acknowledges. In the AI age, we’re becoming more impatient, less curious, insufficiently creative, and too boring and predictable (the algorithms thank us for this, by the way).
We’ve also democratized digital narcissism and, thanks to Google AI, we’re given rapid, superficially clever answers before we’ve actually finished phrasing our questions – even though asking the right questions is probably more important than getting answers.
This may sound depressing but it’s true, according to numerous studies cited in the book. However, Chamorro-Premuzic urges human beings not to lose hope, and fight back by actively rewiring our brains – because “importantly, the key goal is not for AI to replace human expertise, but to enhance it.”
Weapons of mass distraction
Having studied AI in his career, the professor wastes no time in concluding that, so far, its main accomplishment has been to reduce uncertainties in everyday life, thus depriving it of charming unpredictabilities – and lessening the quality of the human experience.
We are the ones doing it to ourselves, though. Each time we spontaneously but predictably react to AI, we advance its predictive accuracy and lose another small battle against creeping automation.
Plus, we spend more and more time online each day – so much so that, for quite some time now, our world has resembled the 1980s otaku subculture in Japan, in which teens escaped the real world to live in a universe of manga and anime characters – an early version of the metaverse.
And of course, today, when we’re spending hours clicking, swiping, posting, sharing, and simply staring at our screens instead of doing something useful, it isn’t hard to conclude that we’re being bombarded with weapons of mass distraction.
Decades ago, the psychologist and Nobel laureate Herbert Simon was already pointing out that humans were struggling with information overload, and that “the wealth of information” we encounter every day creates a “poverty of attention.” It’s true today more than ever.
What does your average working day look like? Be honest. Aren’t you sitting in front of at least a couple of screens with multiple tabs open, wearing headphones that are supposed to be noise-canceling but are in fact playing tracks from Spotify? If it’s a home office, the TV is probably on in the background, too. And, while you’re at it, checking your socials every 10 minutes.
We could go on and on, and Chamorro-Premuzic does just this, in each concise yet informative and thoughtful chapter of his book. At just 166 pages minus the notes and index, it’s not long – but it’s thorough and well thought out.
He describes the current situation as a battle for focus, for even mere seconds of our attention: “It has reached epic levels and has been intensified by data-driven metrics such as clicks, likes, views, and tags, which are critical to improve AI’s ability to understand and influence consumers.”
The algorithms know us too well
The effect of all this on our brains remains to be seen – the most enthusiastic AI consumers won’t grow old and sickly for a while. The signs are worrying, though, Chamorro-Premuzic says.
The book’s author pointedly quotes Nicholas Carr, an author writing about the intersection between technology, commerce, and culture who claims that repeated exposure to online media mostly means exchanging focus and critical thinking for fast autopilot-like processes.
“The internet is an interruption system. It seizes our attention only to scramble it,” Carr wrote in his own book The Shallows: What the Internet Is Doing to Our Brains, published in 2010 and expanded a decade later.
Professor David Meyer, a leading scholar quoted in the book, went further back, to an era when the tobacco industry was booming: “People aren’t aware what’s happening to their mental processes in the same way that people years ago couldn’t look into their lungs and see the residual deposits.”
Levels of productivity are plummeting, too. Yes, there was hope in the late 1990s and early 2000s, when the knowledge economy expanded, but, as the Economist noted, “Since the mid 00’s productivity growth has tumbled, perhaps because the burden of distraction has crossed some critical threshold.”
Some estimates suggest that workers reach for their smartphones for non-work-related activities as often as twice a minute, and that the task recovery time after a typical digital interruption – for example, checking your email, sports results, Facebook, or Twitter while at work – may be as high as 23 minutes.
I’m multitasking, we hear you say. That, however, is a myth – because ‘multitasking’ is in fact nothing more than simply task switching. Chamorro-Premuzic quotes estimates that indicate that multitasking deducts the equivalent of 10 IQ points from an individual worker’s performance.
Our patience has been melting away, too. We can now choose a date or a job very quickly – but giving that date or job a proper shot takes time, and the AI age seems uninterested in our capacity to wait and delay gratification. Patience is now a lost virtue.
Finally, the book reminds us that social media plays – in various ways – to our confirmation biases. The algorithms know what we like and feed us stuff that fits our view of the world, thus narrowing the streets of the internet expanse down to personal dead-end alleys.
“To paraphrase comedian Patton Oswalt, in the Sixties we put people on the moon with computers less powerful than a calculator. Today everyone has a supercomputer in their pocket, and they’re not sure if the world is flat or if vaccines are filled with wizard poison,” Chamorro-Premuzic writes.
Now hold on a minute…
It’s not all bad news. For example, AI can actually help us fight biases in decision-making, because it is by definition neutral, unprejudiced, and objective.
If you think it isn’t, don’t blame AI – blame the humans who get a kick out of making chatbots say racist or sexist things. It’s a case of “don’t shoot the messenger”: the bias reflects the dark side of us, not of AI.
Humans are still very much in the driving seat in this race called life – that is, we really are in control of what AI is or is not used for. Chamorro-Premuzic has some advice on how we can tame it, “escape this world of ubiquitous self-absorption,” and enhance humanity.
First of all, we have to try to understand our limitations and avoid overestimating our talents. In the real world, humility translates into likability – nobody likes narcissists, and confidence is definitely not a sign of competence.
Second, we should simply do more. If AI frees up so much time and takes over the most boring or trivial tasks, we could spend this extra time in higher-level intellectual or creative activities – physical workouts are also fine, of course.
Obviously, it’s not easy, as AI tries to automate even our thinking and decision-making, neutralizing our curiosity at the same time. But just try harder, Chamorro-Premuzic urges, and be less predictable.
“If my model of you is that of a human who will spend their days looking at various screens and clicking away, tapping in, scrolling down different pages in ever more repetitive fashion, even a computer will be able to understand who you are. Acting like a robot makes us more familiar to robots,” the author says.
In other words, consider wresting your free will back from the claws of automation. If Amazon chooses things for you to buy, if Netflix shows you what to watch – how much free will do you really have?
“We may well be the sum of everything we do, but the mere sum of what we do is insufficient to explain who we really are or why we do what we do. In that sense, AI’s curiosity is rather limited, at least compared with humans. Even when we are unable to predict what we or others do, we have the capacity to wonder,” Chamorro-Premuzic reminds us all.
One final piece of advice – just slow down, even if only for a minute. If you’re running very quickly in the wrong direction, you will only end up further away from where you need to be, fooled by the illusion of progress and activity.
As Lewis Carroll’s Cheshire Cat tells Alice, if you don’t much care where you want to go, then it doesn’t matter which way you go.