Stock photo companies are banning AI-generated images

AI-generated images are often perceived as a 21st-century miracle. But one place you won’t encounter them in the future? Stock photo sites.

The power of artificial intelligence to conjure up an image from a text prompt has captured the public’s imagination in the last few months. From DALL-E to Midjourney and Stable Diffusion, the ability of text-to-image models to create images out of nothing but internet-scraped training data has been seen as modern magic.

Yet some of the world’s biggest providers of stock photography and images have decided to take a stand against artificial intelligence-generated content, forbidding it on their platforms. Among those who have come down against the use of AI imagery are Getty Images – an American visual media company with an extensive library of content – and the various platforms it owns, including iStock and Unsplash.

Tackling AI

“Getty Images recently announced our decision to not accept AI-generated content across Getty Images, iStock, and Unsplash,” a spokesperson told Cybernews. “There are open questions with respect to the copyright of outputs from these models, and there are unaddressed rights issues with respect to the underlying imagery and metadata used to train these models. It is important that we make content available to our customers that is free of these concerns and potential liabilities.”

Getty’s statement highlights one of the key issues behind AI-generated images: while the produced work may theoretically be new, it is often seen as a blend or composite of pre-existing images contained within the model’s training set rather than wholly new work. There are also unanswered questions about the ownership of AI-generated images and who holds the copyright to them.

Image: "Théâtre D'opéra Spatial" by Jason Allen, generated with the image synthesizer Midjourney, won first place at the Colorado State Fair.

Critics of AI-generated works claim that the models do little more than hoover up often unclassified data that exists on the internet, in breach of copyright – a photograph of a Michelangelo painting, for instance, that gets uploaded to a website – and then crudely copy its style and format.

Ownership issues

This is a legal minefield, and one that has yet to be tested in court. One recent grassroots effort aims to give artists more ownership over how their work is used to train AI models, including a way to identify whether a given piece has been fed into a model’s training at some point in the past.

The eventual goal of that campaign is to give artists more control over how their work is used, including the right to deny its use as training data for a model. Without that control, a model could mimic an artist’s style closely enough to let anyone generate all-new pieces “by” that artist for free.

That fear – coupled with a desire to benefit from AI only where it can be done responsibly – appears to be behind the Getty group’s decision to ban AI-generated images from its websites.

“We believe AI has the potential to further unlock and enhance creativity,” the Getty spokesperson said. “We will continue to support creatives who use tools, including those that may leverage AI, to enhance their original concepts and visual work in line with our acceptance policies.”

But the company will only do so if it feels confident that the work follows the letter of the law.

“We stand ready to work with those who want to advance AI in a socially responsible manner and one that respects personal and intellectual property rights,” the spokesperson added.

