Is AI normalising creative theft?

Randy Ginsburg
June 9, 2023

Generative AI tools are democratising creativity. But who truly owns the work they make? We navigate the blurred line between inspiration and industrialised theft.

In less than a year, artificial intelligence has unlocked previously unthinkable possibilities for art and culture, with tools like DALL·E 2, Stable Diffusion, and ChatGPT stretching the boundaries of human creativity.

But regardless of the model, one thing is true of every AI tool: the output is only as good as the data the model is trained on. Not that these models are short of material. OpenAI’s DALL·E 2 was trained on 650 million images, according to the company’s release last year. Competing image generator Stable Diffusion draws on a dataset of 2.3 billion image-text pairs.

A robot artist at work, generated by DALL·E 2.

All this has given creators a deep well of inspiration. And not just artists and writers: these AI tools let anybody become a creative, making it easy for an everyday user to spin up high-quality output in minutes. For professionals, they free up time to focus on other aspects of the work, like developing a concept and telling the story, and to experiment in new ways.

But as creatives across the globe turn to AI for content creation, the issue of attribution looms large. Stable Diffusion reports over 10 million daily active users, meaning tens of millions of new images are generated every day. Yet much of that output draws on the artistry embedded in the training data, so who really owns it? The growing tsunami of AI content has begun to raise important questions around copyright, ownership, and the role of humans in the creative process.

Earlier this year, a class action lawsuit was filed against Stability AI (the maker of Stable Diffusion) and fellow text-to-image providers DeviantArt and Midjourney for copyright infringement, accusing the companies of using artists’ copyrighted works in their training data. Stability AI has also drawn the attention of Getty Images, which sued the company for copying at least 12 million of its images without permission. OpenAI hasn’t escaped legal action either; it is currently in court defending a complaint that its AI programming tools rely on “software piracy on an unprecedented scale.”

An AI-generated image featuring the art of Frida Kahlo.

Last month, the government of Japan reaffirmed that it would not enforce copyright on data, such as images, used for AI training. The decision by the world’s third-largest economy is one of the first major moves in the battle over AI and copyright, and has been seen as an attempt to help the nation catch up with the West in developing cutting-edge AI models. But at what point does AI cross the line from inspiration to industrialised theft?

“I like to know where my work is from,” says Wise, a widely respected digital art collector and Community Manager of the NFT marketplace Objkt. The Australian argues that understanding the provenance of digital art is no less important than understanding the provenance of physical art. “I want to know the original source because, even though the end image was created by the artist, there were still different images by different artists used to create that work,” she explains. “The same applies to whether the art was created using AI.”

“I like to know where my work is from.”

Wise, Community Manager, Objkt

Wise emphasises how generative AI tools would be nothing without learning from the works of other creators. As such, she believes the output should be attributed to the artists who fuel it, with each AI-generated work providing a breakdown of the different works that inspired its creation.

Others emphasise how the intermingling of artistic ideas is a natural part of any creative discipline. “People learn to draw from copying the styles of others,” notes Pixlosopher, a pseudonymous AI artist active in the discipline since 2020. “AI is just a different tool that allows access to more people. In my case, it’s helped me enhance my style and expand my abilities to bring concepts to life.”

At what point does AI cross the line from inspiration to industrialised theft?

This perspective is common on the more technical side of the AI ecosystem. “I don’t mind when someone uses my work to inspire another research paper,” says Fatir Malik, an enterprise data scientist. “This is already happening. A content writer pulls research and ideas from multiple sources to create their own, for example. I don’t see training models using my work as unethical.”

“Generative AI tools don’t work by themselves.”

— Pixlosopher, AI artist

When it comes to attribution, Pixlosopher views AI as a tool. And like all tools, it needs human creativity to flourish. For that reason, he asserts that attribution should go to the final creator of the work.

“Generative AI tools don’t work by themselves. They need prompting, post-production, iteration, and often the final product may be a combination of multiple AI outputs,” he says. “It’s similar to how people once questioned if tools like CorelDraw, Photoshop, or CGI were creative or artistic.”

At present, there is no standard method for crediting the artists whose work has influenced AI-generated output. That gap underscores how little society understands the AI technologies now spreading rapidly across the world. These are questions society needs to discuss, and fast.

Written by
Randy Ginsburg

Randy is the founder of Digital Fashion Daily and Third Wall Creative, a web3 marketing agency. Straddling the worlds of retail and emerging technology, Randy has worked with many companies including nft now, Shopify, and Touchcast to create compelling and educational web3 content. Previously, Randy worked at Bombas, developing the most comfortable socks in the history of feet.
