
Deepfakes could facilitate real estate fraud, experts warn


Algorithms capable of creating hyper-realistic images have been available for several years. Now, AI-powered image generators are also easily accessible to all. Experts warn that it is only a matter of time before the technology is used to commit real estate fraud.

A villa in Bali or a cabin in the Norwegian woodlands? Perhaps a New York penthouse? If your dream house is a see-through, possibly wooden modernist structure set amid idyllic nature or a sweeping urban landscape – then This House Does Not Exist has it all.

Except for the actual house, as the website's name heavily implies. An AI algorithm created these images from text prompts written by a Twitter user named Levelsio. A new picture pops up each time the page is refreshed.

It appears that Levelsio sought to have AI generate images in the style architecture firms use to sell their projects. While not on par with professionally done renderings, not to mention the real deal, some are quite realistic. Others have flying palms, so it is still a work in progress.
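How little effort such images now take is easy to demonstrate. Below is a minimal sketch of text-to-image generation using the open-source Hugging Face diffusers library; the model checkpoint, prompt, and output filename are illustrative assumptions, not the actual setup behind This House Does Not Exist.

```python
# Minimal text-to-image sketch with the open-source diffusers library.
# The checkpoint and prompt below are illustrative assumptions only.
import torch
from diffusers import StableDiffusionPipeline

# Load a publicly available Stable Diffusion checkpoint (assumed here).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # a consumer GPU is enough for this sketch

# A prompt in the style of an architectural rendering.
prompt = (
    "modern glass-and-wood villa in a pine forest, "
    "architectural rendering, golden hour, photorealistic"
)

# Generate one image and save it to disk.
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("house_that_does_not_exist.png")
```

On hardware like this, a single prompt yields a new, never-built "listing photo" in a matter of seconds.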

The website that it seems to have drawn inspiration from – This Person Does Not Exist – demonstrates what the technology is really capable of. Instead of houses, it generates an endless stream of highly realistic faces, created using GANs, or generative adversarial networks, the same technology behind deepfake videos.
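The adversarial idea behind GANs can be compressed into a short training loop: a generator learns to turn random noise into images, while a discriminator learns to tell those images apart from real photos, and each network improves by trying to beat the other. The PyTorch sketch below uses toy, flattened images and tiny networks purely for illustration – it is not the architecture behind This Person Does Not Exist.

```python
# A compressed sketch of GAN training. Sizes and data are toy placeholders.
import torch
import torch.nn as nn

LATENT = 64     # size of the random noise vector the generator starts from
IMG = 28 * 28   # flattened toy image size (assumption for illustration)

generator = nn.Sequential(
    nn.Linear(LATENT, 256), nn.ReLU(),
    nn.Linear(256, IMG), nn.Tanh(),     # fake image with pixels in [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),                  # real-vs-fake logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def training_step(real_images: torch.Tensor) -> None:
    """One adversarial step on a batch of flattened real images."""
    batch = real_images.size(0)
    noise = torch.randn(batch, LATENT)
    fake_images = generator(noise)

    # 1) Train the discriminator to separate real images from generated ones.
    d_loss = (loss_fn(discriminator(real_images), torch.ones(batch, 1)) +
              loss_fn(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator into predicting "real".
    g_loss = loss_fn(discriminator(fake_images), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Demo call with random stand-in "real" data in [-1, 1].
training_step(torch.rand(32, IMG) * 2 - 1)
```

Repeated over millions of real photos, this tug-of-war is what produces faces realistic enough to pass for genuine people.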

Both static facial fakes and deepfake videos have been used for malicious purposes, whether in Chinese or Russian disinformation campaigns or to scam people by impersonating legitimate businesses. As the pandemic-driven rise in cybercrime continues, how long before the technology is used for real estate scams?

[Image: Collage of AI-generated renderings. Images from This House Does Not Exist.]

Scams big and small

"It is only a matter of time before we see AI-generated images used to commit real-estate fraud," Joshua Rogala, a Winnipeg-based criminal defense lawyer with experience in online property fraud, said.

While a competitive market and the industry's embrace of doing business online could facilitate the use of AI in real estate fraud, working with a trusted realtor goes a long way toward reducing that risk. However, some industry sectors are more vulnerable than others.

"Where I can see this becoming a major issue is off-market," Samantha Brandon, a former realtor, said, pointing at private sellers, social media, and rental markets.

People could still lose money on deposits and application fees. An elaborate phishing campaign involving deepfake invoices or bills could compromise their credit card or checking account information. There are also subtler forms of deceit.

"Real estate buyers should look out for malicious uses of AI, such as stretching out a room's dimensions in impossible ways. At the end of the day, the onus is always on the buyer," Danny Pan, VP of data science at SetSail, said.

[Image: Some AI-generated images can look quite realistic...]

Expanding frontiers

Some industry insiders believe that perpetrators can use AI to deceive and victimize people more effectively than any other technology. David Zhao, managing director at Coda Strategy, sees hyper-realistic deepfake videos based on GANs as posing the most significant risk.

"In the future, as GANs continue to develop, a malicious agent could create deepfakes of security camera footage. They could create a realistic video of an AI-generated fake family, living inside an AI-generated fake home, to offer convincing social proof that an imaginary home exists," he said.

Scammers could look to the metaverse to stage a virtual showing where every attendee sees what they want to see based on what AI learned about their interests and preferences.

"While each viewer may think they are seeing the same version of the property, in reality, each could be seeing a slightly different version," said Nir Kshetri, professor at Bryan School of Business and Economics, University of North Carolina-Greensboro.

As AI-generated content increasingly looks, sounds, and behaves like the real thing, he said, consumers will find it hard to tell actual assets from virtual ones.

[Image: ...others have flying palms. All images from This House Does Not Exist.]

Erosion of consumer confidence

Synthetic images could erode consumer trust in digital content if no safeguards are put in place. Experts warn that fake pictures could significantly impair people's ability to make informed choices: buyers who can no longer tell what is authentic may decide not to purchase at all.

"Will we ever again be certain that anything from an Airbnb to a hoodie from a new streetwear brand from Shanghai is 'authentic,' as we've previously understood it?" Aron Solomon, chief legal analyst at Esquire Digital, said.

While the technology is still emerging, it can no longer be described as a novel threat. What is new is that it is now widely available. According to Daniel Wu, a researcher at the Stanford AI Lab, the technology offers not so much an increase in capability as a decrease in the "barrier to entry."

"A skilled and committed individual can spend a couple of hours to make a realistic photo in Photoshop, but now anyone can do the same in a couple of seconds," Wu noted.

Trust your realtor

Sean O'Brien, of Yale Law School Privacy Lab, noted that existing forensic tools can often detect synthetic images by identifying traces of the software used to create them. "Humans, on the other hand, are much easier to fool," he added.

Working with a reputable realtor or broker could significantly reduce the chance of falling victim to real estate scams – of any kind.

"AI-generated art will be utilized for both good and malicious intents, just like everything else in this world. It's imperative always to exercise caution and perform your due diligence," Brandon, the former realtor, said.

Keeping up with technological advancements is a challenge for policy-makers, but it is also crucial, as is public awareness.

"Deterrence can only come from our ability to identify and educate against malicious usage and codifying appropriate applications of artificial intelligence," Pan, the data scientist, said.


