Along with the celebration of artificial intelligence (AI) over the last six months or so, a great many concerns have also been raised – from jobs being replaced all the way to killer robots. Leave it to the 24-hour news cycle to create a clickbait craze.
One group feeling especially nervy is creatives, particularly those in the visual art space. Just last week, German photographer Boris Eldagsen refused a Sony World Photography Award, revealing that his entry was entirely the work of artificial intelligence – a move intended to spark a debate about the use of AI in the industry.
A crucial component in AI's development is training data, and one noteworthy source of training data is human creativity. With those nerves in mind, it's worth examining the ethical concerns that arise from using people's creative output to train AI tools like the newly released ChatGPT Code Interpreter, or image generators like Stable Diffusion and Midjourney.
A primary ethical concern in using human creativity for AI training is, of course, the potential exploitation of creative professionals. Writers, artists, musicians and more depend on their creative skills for income. Using their work as training data for AI, however, might result in inadequate or no compensation for their contributions, leading to a serious devaluation of creative work.
“One group feeling especially nervy is creatives, particularly those in the visual art space.”
The hard reality is that all professionals will need to compete with AI in the job market in the coming years. Let's also note that users interacting with these AIs may be unaware that their inputs or conversations contribute to training the model, raising questions about informed consent and data privacy.
So is it a lost cause? Not so fast. Blockchain technology, a distributed ledger that duplicates and distributes transactions across a network of participating computers, presents a potential solution to these ethical concerns. By employing blockchain, a decentralised platform can be established where creative professionals upload their work and receive fair compensation for its use in training AI models.
Adding smart contracts, self-executing programs that automate the actions required in an agreement, can ensure that creators are compensated automatically, without intermediaries. Waterlily, recently launched on the Filecoin Virtual Machine (Figure 1), is a new project enabling artists to be compensated for uploading their work and helping train the underlying AI model, under the tag 'Ethical generative AI-art'. While there is still a long way to go, it is becoming increasingly clear that the data (particularly when it is created by human creativity) is the valuable part of this value chain, and that data therefore needs to be protected.
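To make the mechanism concrete, here is a minimal sketch in Python of how such a smart contract might behave: each artwork is registered against its creator, and every recorded use automatically credits a micro-royalty, with no intermediary deciding whether to pay. The class, identifiers and royalty rate below are all hypothetical, chosen purely for illustration – a real implementation would be an on-chain contract, not a Python object.

```python
from dataclasses import dataclass, field

@dataclass
class RoyaltyLedger:
    """Toy stand-in for an on-chain royalty contract (hypothetical)."""
    rate: float = 0.001                           # royalty per use, in dollars
    owners: dict = field(default_factory=dict)    # artwork_id -> creator
    balances: dict = field(default_factory=dict)  # creator -> accrued royalties

    def register(self, artwork_id: str, creator: str) -> None:
        """Record who owns a given piece of creative work."""
        self.owners[artwork_id] = creator

    def record_use(self, artwork_id: str) -> None:
        """Called whenever an AI model is trained on the work;
        credits the owner automatically, with no intermediary."""
        creator = self.owners[artwork_id]
        self.balances[creator] = self.balances.get(creator, 0.0) + self.rate

ledger = RoyaltyLedger()
ledger.register("cid-abc123", "alice")
for _ in range(10_000):          # 10,000 training uses of Alice's image
    ledger.record_use("cid-abc123")
print(round(ledger.balances["alice"], 2))  # → 10.0
```

The point of the sketch is the automation: once the rule is encoded, payment follows use by construction rather than by a platform's goodwill.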
Figure 1: The Waterlily AI home page
That is not to say a Web 2.0 company can't offer ethical generative art. Microsoft had a project running through its Bing search engine that was doing just this, but there is very little transparency offered by these big companies working in silos. Blockchains, or Web 3.0, can offer complete transparency and traceability across all internet transactions, including the use of creative works for AI training.
Filecoin, the decentralised storage network, can give each individual piece of a creative's data a unique identifier, and smart contracts can then ensure that royalties (even if only a fraction of a cent) are paid to the owner of that data whenever it is used by an AI. An incredibly popular image tool may have millions or even billions of interactions per day. Creatives aren't known for their math skills, but it's safe to say that it might add up.
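Some back-of-the-envelope arithmetic shows why it adds up. The figures below are assumptions, not data about any real tool – they're chosen only to illustrate the order of magnitude when a tiny per-use royalty meets internet scale.

```python
# Hypothetical figures: a popular image tool at internet scale.
daily_interactions = 1_000_000_000   # 1 billion interactions per day (assumed)
works_per_interaction = 1            # each interaction draws on one work (assumed)
royalty_per_use = 0.0001             # a hundredth of a cent, in dollars (assumed)

daily_payout = daily_interactions * works_per_interaction * royalty_per_use
print(f"${daily_payout:,.0f} per day")         # → $100,000 per day
print(f"${daily_payout * 365:,.0f} per year")  # → $36,500,000 per year
```

Even at a hundredth of a cent per use, the pool of royalties flowing back to creators would be substantial – the question is plumbing, not economics.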
Most importantly, AI development is moving fast, and these questions need to be asked now. It's easy to feel powerless against the tide of innovation, especially as a creative, but the more time we spend looking at how these tools operate, the more we realise how incredibly important the underlying data is. It can be protected, and it should be: the infrastructure to maintain the AI/human balance is being built. More royalties, fewer killer robots.
Disclaimer: This Article has been prepared by Holon Global Investments Limited ABN 60 129 237 592. Holon Global Innovations Pty Ltd (“HGI”) is a wholly owned subsidiary of Holon Global Investments Limited (together “Holon”). HGI is a Filecoin (FIL) Storage Provider and is positioned as a major player in the FIL decentralised data storage arena for Asia Pacific. FIL Storage Providers are rewarded in FIL for the provision of data storage capacity. Holon, its officers, employees and agents believe that the information in this material and the sources on which the information is based (which may be sourced from third parties) are correct as at the date of publication. While every care has been taken in the preparation of this material, no warranty of accuracy or reliability is given and no responsibility for this information is accepted by Holon, its officers, employees or agents. Except where contrary to law, Holon excludes all liability for this information.