Book review: “The Secret Life of Data” is fascinating and disturbing

We’ve essentially given up, resigned to the fact that some corporation or government agency is constantly gathering our personal data. However, a new book, The Secret Life of Data, argues that this is still frightening and even threatens open societies as we know them.

Most of us smartphone owners with a bit of spare cash and an unhealthy addiction to social media don’t even stop to think about what we give up when we arm ourselves with smartwatches, home internet appliances, apps, games, and streaming devices.

On the surface, not much, really. We buy, consume, get bored, and look for a new shiny gadget. Even when we know that our data goes somewhere – is sold, traded, crunched, or whatever – we sort of hope for the best and presume nothing too bad can happen.

Well, Aram Sinnreich and Jesse Gilbert have some bad news. In their book, “The Secret Life of Data: Navigating Hype and Uncertainty in the Age of Algorithmic Surveillance,” they convincingly demonstrate that someone is always going to use our data for one purpose or another.

Sadly, most of the examples the authors give are of the nefarious kind. Still, they do a good job of making the realization that we are constantly for sale feel neither dystopian nor utopian. That’s because it’s still up to us whether the data-driven social order will change for the better.

Our data is not ours anymore

The topics themselves – data, digital surveillance, modern forensics, and Gen AI – are indeed quite heavy. The authors clearly attempted – and succeeded – to liven up this information-heavy book with metaphors, historical comparisons, and popular culture references.

Interviewing dozens of experts – and regular people whose data was misused – obviously helped them explore the scenarios and contexts, too.

That’s perhaps why a reader more or less grasps the difficult concepts that Sinnreich and Gilbert explain and analyze, most of them related to how others use us and our data, both now and in the future.

They’ve also made the reader curious by building the book around the premise that data – no matter its form or purpose – will always have a secret life. We all like secrets, don’t we? Then again, what if the secret is you?

“Billions of us now spend our days and nights enmeshed in webs of digital sensors, machine learning algorithms, and overlapping information networks, all designed to reduce the minutiae of our lives into discrete data points,” the authors write.

Sure, these new data systems may have improved our lives and infused new knowledge into our brains, but fresh conflicts and controversies have also arisen.


In fact, even though there are now so many sensitive dilemmas over the use of our data, most governments have been lagging behind and failing to introduce regulations on even the most basic concepts, such as data rights.

To be fair, the vast majority of us also fail to understand what’s going on. But if that’s what those directing our data to their secret premises want, this book can serve as a timely reminder: our data is not ours anymore.

“There is no limit to the amount and variety of data – and, ultimately, knowledge – that may be produced from an object, event, or interaction, given enough time, distance, and computational power,” explain the authors, helpfully adding that the latter is growing every day.

“It means that whatever we think we’re sharing when we upload a selfie, write an email, shop online, stream a video, look up driving directions, track our sleep, ‘like’ a post, write a book, or spit into a test tube, that’s only the tip of the proverbial iceberg.”

Sharp questions, depressing answers

The Secret Life of Data focuses primarily on the long-term consequences of humanity’s rush towards digitizing, storing, and analyzing every little piece of data about ourselves and our world. But what makes this tale a rather enjoyable one is all the little questions that most of us ask from time to time.

For instance, how can a judge issue a warrant allowing police to investigate an individual suspect in a database for a specific crime without exposing all of the other people listed in the database to unwarranted surveillance? Is predictive policing ok?

Or, “How can a technology user confidently wear a smartwatch or install a data-invasive health-tracking app when a future court decision might turn their digital medical records into evidence against them?”

Is the reason Gen AI models are so good – and getting better each day – at fooling human observers the fact that they’re programmed to fool themselves? If so, what does that mean for our future as humans?

The authors correctly point out that there’s so much data in, around, and about us that even missing data becomes a form of data – sure to catch the attention of bad actors or Big Brother types in the government.

At times, it’s easy to immerse yourself in this Big Data Blues and get swept off your feet whilst reading this book. Yes, data is everywhere. Our photos are data, our purchases are data, our sleeping habits are data, our dates are data. And they all have secret lives.

“Then our data spawn more data, and those data are analyzed and recombined to become other people’s data. Small data are aggregated into big data. Even the absence of data becomes a form of data, and real data can be used to generate fake data,” write the authors.

“No wonder so many of us have given up hope of ever finding our way back to dry land.”

But maybe we shouldn’t? Smartphones are great for everything, right? We do our tasks on our laptops, and smart devices, especially in healthcare, can help save lives. Do we really care that our data, as minuscule as it is, is stored on some random company’s hard drive?


Unfortunately, Sinnreich and Gilbert are “in the know” and understand the power of data, so yes, we should care – enough, for instance, to ask how the integration of ChatGPT into Apple’s AI will handle our data. That’s because how our data is (mis)used makes all the difference in the world.

“Those data points are used by powerful governments and corporations – as well as everyday people – to make political, financial, and cultural decisions that affect all life on earth,” they write.

“A single entry in a spreadsheet somewhere can mean the difference between life and death for a hospital patient, between bail and jail for an accused criminal, or between a despotic regime and a democratic one for an entire nation.”

Everything’s not lost

The irony is that these data points simply cannot be a mirror image of the world and the individuals they supposedly represent, because injustice and inequity are programmed into algorithms and datasets.

They are easily abused factoids, shaped by the assumptions of the interested party and the inherent biases of the creators of the AI systems tasked with building a profile. Yes, data has feelings – because the data collector has feelings.

Sinnreich and Gilbert offer little hope even to those trying to resist data collection systems, at least to some extent, by, say, blocking website cookies, wearing a face mask in public places, or advocating for laws limiting the use of ad trackers and facial recognition surveillance cameras.

They call it a losing battle because we cannot really know about all the data we produce or how it’s analyzed. Sure, living like a hermit in a cave somewhere might help, but it isn’t really possible these days.

Still, according to the authors, it’s not all doom and gloom. News organizations and activists regularly report on tech firms' various excesses, authors contemplate these enormous technosocial changes more often, and some governments do indeed act.

Under Joe Biden’s administration, the US seems more eager to regulate the big tech giants aggressively, and the European Union has adopted the AI Act, the world’s first comprehensive regulation of AI.

Most importantly, the authors do a good job of acknowledging and stressing that magic, quick solutions aren’t possible.

Without a doubt, some would argue that the hypocritical data-capitalist industries – always pledging to self-regulate and be good, yet silencing anyone who tries to blow the whistle from within (Frances Haugen did go public, and we should thank her) – deserve to be punished here and now.

But the tech giants are simply too big now, so slow fixes are the answer, say Sinnreich and Gilbert, urging the world, primarily the US, to at least create new laws and regulations, establish data rights, and properly define humanity’s relationship to data.

“There are a lot of smart, ethical people working hard to chart a path that will help information technology benefit the human condition while minimizing some of its most troubling consequences,” write the authors.

“And while there might not be a single quick fix, there are myriad potential slow fixes, each of which can play a role in promoting human agency, justice, and equity in a data-rich society.”

Yes, data societies are surveillance societies by definition. But we would do well to remember that even if they look like they’re about tech, they’re not about tech only. They’re about civil rights.