Deepfakes, digital forensics and the battle against AI crime
When wandering the dark streets after a night out, most people reading this will rely on their streetwise instincts to keep them safe from harm. But when going online, many are guilty of lowering their guard. Here in 2020, it's all too easy to hit the "install security update later" button for days or even weeks. Meanwhile, ransomware and phishing attempts threaten both your identity and your wallet or purse.
There is a new wave of digital crime coming our way that requires everyone to up their game by adopting a much tougher cyber streetwise attitude.
Edgar Allan Poe once said: “Believe nothing you hear, and only one half that you see.”
But advances in technology and the rise of deepfake technology in this digital age should be enough to make you question everything that you see.
Rewriting history with deepfakes
Deepfakes are AI-generated videos that enable tech enthusiasts, and often nefarious characters, to produce content in which real people do and say fictional things. Predictably, the experimental phase of this tech involved tinkering with classic movies such as The Shining and Back to the Future by swapping out the cast, with eerie results.
However, things quickly took a dark turn as privacy advocates warned of the dangers ahead. To highlight the scale of the problem, The MIT Center for Advanced Virtuality project used the technology to share a video of President Nixon making a fake moon landing disaster speech. The video served as a warning on the threat that deepfake technology could bring to democracy, and how easy it could be to rewrite history.
As the digital threat landscape continues to evolve, we need to prepare for crimes far more sophisticated than ransomware and phishing. Deepfakes could be used not only to manipulate the opinions of the global community but also to threaten individuals and organizations. Open-source repositories such as GitHub are also making it easy for cybercriminals to produce deepfakes with minimal technical expertise.
Many mistakenly believe that it's only politicians or the rich and famous who are at risk. But fast forward a few years, when this technology is widely available, and it is likely to be your friend or your CEO, rather than a movie star, who gets deepfaked.
In a future where phishing attempts will be deemed primitive, we can safely predict that cybercriminals will upgrade their tools and attack methods.
Imagine your boss sending out an urgent video message asking staff to update their banking details two days before payday. Worse still, imagine a CEO tweeting and updating their LinkedIn status with a video announcing massive redundancies that would immediately impact the stock market and value of the company.
Of course, these are just hypothetical scenarios, but we need to prepare for the dangers waiting in the murky uncharted waters ahead.
Mark Cuban once famously declared that data is the new oil. Tech companies quickly jumped on the bandwagon, hoovering up every click, swipe, credit card transaction, and digital interaction from every device we own. The rise of surveillance capitalism and the myriad techniques used to track our personal data is well documented. But could the DNA that makes us unique be the next frontier for personal data?
Increasing interest in genealogy prompted many families to send a small saliva sample to ancestry sites to find links to long-lost relatives. Despite the warnings from privacy advocates on the dangers of sharing your DNA with corporations, it quickly became a big business and a favorite gift for those who are notoriously difficult to buy presents for.
Sites such as Ancestry assured customers that police investigating unsolved crimes would not be given access to their data. But could that change in the future? The recent news that a private equity firm acquired Ancestry, gaining access to its wealth of DNA records for $4.7 billion, set off a few alarm bells.
In the 1997 movie Gattaca, we were given a glimpse of a future built on eugenics. That alternative reality served as a warning of what could happen if we allowed our DNA to determine our life course. The prospect of our dating and employment opportunities being decided by our DNA seemed ludicrous at the time. Not so much now.
Could uploading your DNA to a private company threaten the privacy of everyone in your family forever? And will it be deemed by future generations as the moment we unwittingly embraced voluntary eugenics?
Google's proposed acquisition of Fitbit appeared on the radar of regulators, who are increasingly concerned about the amount of personal data the tech behemoth can access. But it's no longer just smartphones and watches that gather data. I have already explored how your smart home can betray you by continuously gathering lifestyle data from your household appliances.
Criminal investigations are turning to digital forensics to solve crimes
Many are blissfully unaware that the devices we carry outside the home record and store data such as our heart rate and GPS movements. Digital forensics enables law enforcement to build a narrative of events around any crime.
When a suspect attempted to use a smart washing machine as an alibi, the police turned to Cranfield University to determine whether he really had been at home washing his clothes. Investigators were able to establish that the suspect had used his mobile app to activate the washing machine while the phone itself was located next to the crime scene. Incriminated by his own smart devices, the defendant was later found guilty.
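The forensic reasoning here is essentially a correlation exercise: pair each timestamped appliance event with the phone's nearest-in-time location fix, and check whether that fix falls inside a geofence around a point of interest. The sketch below is a hypothetical illustration of that idea, not the actual tooling used by investigators; the function names, thresholds, and data layout are all assumptions for the example.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in metres."""
    r = 6371000  # mean Earth radius in metres
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def correlate(events, fixes, radius_m=200, window_s=300):
    """Flag appliance events whose nearest-in-time GPS fix lies within
    radius_m of the event's point of interest.

    events: list of (timestamp, (poi_lat, poi_lon)) appliance activations
    fixes:  list of (timestamp, lat, lon) phone location records
    Returns a list of (event_timestamp, distance_in_metres) hits.
    """
    hits = []
    for t_event, (poi_lat, poi_lon) in events:
        # nearest phone fix in time to this appliance event
        t_fix, lat, lon = min(fixes, key=lambda f: abs((f[0] - t_event).total_seconds()))
        close_in_time = abs((t_fix - t_event).total_seconds()) <= window_s
        dist = haversine_m(lat, lon, poi_lat, poi_lon)
        if close_in_time and dist <= radius_m:
            hits.append((t_event, round(dist)))
    return hits
```

An event that fires while the phone was logged a few hundred metres from home, but within the geofence of a crime scene, would surface as a hit; real investigations would of course weigh many more signals than a single timestamp-and-distance match.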
Both the good guys and the bad guys are leveraging technology in a constant game of cat and mouse. But online criminals do not have to comply with rules and regulations. The bigger question is how authorities can level the playing field and protect their citizens without infringing on their privacy. With crime migrating from the streets to our online world, maybe we should all think about taking digital self-defense classes to prepare for a new range of cyber threats.