© 2021 CyberNews - Latest tech news,
product reviews, and analyses.


Don’t trust this stranger: they’re a fake


Images generated by algorithms are reportedly used for malicious purposes, such as misinformation or harassment. But the same technology can also be used to hide your identity when signing up for iffy websites. Data engineer George Paw created a fake person generator “out of boredom and curiosity” and explained to CyberNews how his web app differs from others.

Generative adversarial networks (GANs) are a machine learning technique that, among other things, lets you create fake human faces; the graphics card maker Nvidia built its StyleGAN image generator on top of them. As with any other innovation, the technology was quickly adopted by bad actors as well. The Financial Times reported that GAN-generated faces were used in campaigns linked to China pushing pro-Beijing talking points, and by Russia to create fictional editors.

In February 2019, The Verge reported on the website ThisPersonDoesNotExist.com. Philip Wang, a software engineer at Uber, used Nvidia’s technology to create an endless stream of fake portraits. Just hit the refresh button, and each time you’ll see another fake stranger staring at you from your screen.

It seems that it really was, as The Verge put it, just a polite introduction to the technology.

CyberNews came across yet another fakes generator. During the COVID-19 pandemic, data engineer George Paw, currently working in Australia, created a fake person generator, Fakes.io. It differs from similar projects, such as ThisPersonDoesNotExist.com, mainly in that it also generates a fake person profile for each face, including the fake’s full name, gender, date of birth, height, weight, and favorite color.

“Fakes.io generates brand new, never seen photos on demand. There are currently millions of image combinations that can be generated, and no two photos will look the same,” George Paw said.

CyberNews sat down with him to discuss his product and how it came to be.

George Paw


A child of “boredom and curiosity”

Like many others locked at home by the pandemic, George Paw didn’t have much to do.

“I created Fakes.io mainly out of boredom and curiosity,” he said. During the COVID-19 pandemic, he was asked to go on annual leave for three weeks. Yet, because of all the restrictions, he couldn’t travel or even leave his house.

“So I came across a GitHub project called StyleGAN, created by the graphics card maker Nvidia. StyleGAN was able to generate extremely human-like profile pictures, but these humans have never existed. It was completely generated by a type of Artificial Intelligence framework called Generative Adversarial Network,” he explained.
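The sampling side of such a network can be sketched in a few lines. The toy "generator" below is just a fixed random linear map followed by a tanh, standing in for StyleGAN's deep network; the latent dimension matches StyleGAN's 512, but everything else is a simplification, not Nvidia's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 512           # StyleGAN also uses a 512-dim latent space
IMG_PIXELS = 16 * 16       # tiny stand-in for a full-resolution portrait

# "Weights" that a real GAN would learn during adversarial training;
# random here, since this is only an illustration of the sampling step.
W = rng.normal(size=(LATENT_DIM, IMG_PIXELS))

def generate(z: np.ndarray) -> np.ndarray:
    """Map a latent vector to a fake 16x16 grayscale 'image' in [-1, 1]."""
    return np.tanh(z @ W / np.sqrt(LATENT_DIM)).reshape(16, 16)

# Each fresh latent sample yields a different, never-before-seen image.
img_a = generate(rng.normal(size=LATENT_DIM))
img_b = generate(rng.normal(size=LATENT_DIM))
```

The key idea is that the "person" is nothing but a random point in latent space: draw a new point, get a new face.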

The pictures, he added, looked great but lacked flair.

“I decided to attach names to these faces. Then birthdays, heights, weights, even their favourite colours. Suddenly these nameless faces have some semblance of personalities,” Paw said.

So he decided to build a web app that lets users generate new profile pictures on demand.
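Attaching a profile to a generated face, as Paw describes, amounts to sampling a handful of random attributes. The sketch below is purely hypothetical: the field names, name lists, and value ranges are guesses for illustration, not Fakes.io's actual implementation.

```python
import random
from datetime import date, timedelta

# Small illustrative pools; a real generator would use much larger lists.
FIRST_NAMES = ["Alex", "Sam", "Jordan", "Taylor", "Morgan"]
LAST_NAMES = ["Smith", "Nguyen", "Garcia", "Patel", "Kim"]
COLORS = ["red", "blue", "green", "purple", "orange"]

def fake_profile(rng: random.Random) -> dict:
    """Generate a random fake-person profile (hypothetical schema)."""
    birth = date(1960, 1, 1) + timedelta(days=rng.randrange(45 * 365))
    return {
        "full_name": f"{rng.choice(FIRST_NAMES)} {rng.choice(LAST_NAMES)}",
        "gender": rng.choice(["female", "male"]),
        "date_of_birth": birth.isoformat(),
        "height_cm": rng.randint(150, 200),
        "weight_kg": rng.randint(45, 110),
        "favorite_color": rng.choice(COLORS),
    }

profile = fake_profile(random.Random(42))
```

Seeding the generator, as above, makes a profile reproducible, which is handy for the software-testing use case Paw mentions.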

“I had in mind that this could be used for software testing purposes (for example, to test signup frameworks), or for users who want to maintain anonymity when signing up for iffy websites where the signup users do not trust the website's privacy policies,” he said.

How is his fake person generator different from other similar projects? Paw admitted that ThisPersonDoesNotExist.com partially inspired the creation of Fakes.io.

“It shared some common underlying AI framework & libraries to generate the hyper-realistic images. The main difference is that thispersondoesnotexist.com displays a certain number of images from a cache (approximately 100,000 images). Fakes.io generates brand new, never seen photos on demand. There are currently millions of image combinations that can be generated, and no two photos will look the same,” he explained.
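The two serving strategies Paw contrasts can be sketched side by side. Here `render_image` is a placeholder for a full GAN forward pass; the cache size mirrors the roughly 100,000 images he attributes to thispersondoesnotexist.com, but the code is an assumption-laden illustration, not either site's real backend.

```python
import random

def render_image(seed: int) -> str:
    """Stand-in for an expensive GAN forward pass; returns a fake 'image'."""
    return f"image-{seed}"

# Strategy 1: pre-render a fixed cache and serve from it
# (thispersondoesnotexist.com-style, per Paw's description).
CACHE = [render_image(i) for i in range(100_000)]

def serve_cached(rng: random.Random) -> str:
    # Repeats are possible once you refresh more times than the cache size.
    return rng.choice(CACHE)

# Strategy 2: generate a brand-new image per request (Fakes.io-style).
def serve_on_demand(rng: random.Random) -> str:
    # A fresh 64-bit seed makes a repeat astronomically unlikely.
    return render_image(rng.getrandbits(64))
```

The trade-off is the usual one: the cache answers instantly but can repeat, while on-demand generation never repeats but pays the rendering cost on every request.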

Most machine learning models require an existing dataset to train on. The Fakes.io model was trained on over 200,000 pre-existing images to generate the best results.

“Once the model has been created, it never uses the pre-existing images again, and the machine learning model will generate a brand new image every time,” Paw said.

Can you tell apart the fakes?

George Paw primarily created Fakes.io for software development and machine learning showcase purposes. But it’s accessible to anyone: just hit the refresh button to generate brand-new fake persons and use them, for example, to sign up to websites where you want to remain anonymous.

“We do not condone, support, or encourage illegal activity of any kind and will fully cooperate with law enforcement organizations if required,” emphasized Paw.

If you have a closer look, you might be able to spot a fake.

“While the generated pictures are hyper-realistic, the AI is far from perfect. Sometimes the AI will generate an image that has a distorted background which looks very strange. In some photos, you can see artifacts (things that do not look natural), for example, the ear is slightly warped, or the teeth have been misaligned,” Paw said.

Recently, The New York Times published a detailed analysis of how you can spot such fakes, in case you want to dive deeper into the topic.
