Artificial intelligence is capable of accomplishing some truly incredible feats. Given that technology tends to progress exponentially, those feats are becoming more numerous by the day – I mean, Microsoft just dished out $19.7 billion to bolster its AI lineup with Nuance Communications.
There are some shadier aspects of this progress, though. It’s not necessarily a situation where either side is inherently wrong, but I reckon the waters become pretty murky once you use a tool to exploit a professional musician or actor’s voice for your own commercial gain. This is totally new territory, which makes legislating against it fairly difficult – but surely there needs to be some kind of digital rights regulation that protects and preserves a person’s voice as their own property.
While this definitely has an impact on games – for example, a Witcher modder recently used AI to replicate Doug Cockle’s Geralt voice in a fanmade quest – it has far wider-reaching implications as well. I’m thinking of that Nirvana song that came out last week, which is not even close to being a Nirvana song. Kurt Cobain died in 1994 – the idea that someone could profit off a staunchly anti-capitalist artist’s “music” 27 years after his death via a voice generation tool makes me feel pretty uneasy. It’s like feeding a bot tons of Shakespeare, giving it time to become decent at more than “wherefore art thou Romeo?”, and passing off its work for your own material gain.
For what it’s worth, I know how the public domain works. In the UK, where the rule is the author’s life plus 70 years, JRR Tolkien’s The Lord of the Rings will enter the public domain in 2044, meaning it will belong to the public as opposed to the individual author who wrote it. The above point about Shakespeare is obviously covered by this too – Shakespeare’s entire oeuvre now belongs to the people. But Kurt Cobain has been dead for less than 30 years. Geralt’s voice actor is only 50 years old and is still actively performing. To not only borrow their voices, but use them for creative endeavours they are not and cannot possibly be aware of is frankly absurd, especially where money is concerned.
That’s not to say AI generation is inherently bad. Last year I spoke to a Russian AI team called Mind Simulation, who were using the technology to create a fully responsive Geralt of Rivia. They fed it all of the information from the Witcher games and books, which allowed the AI to understand things about itself and the fictional world around it. This wasn’t done for commercial gain, though – it was an exercise in improving the technology behind voice acting, audio design, and cutting-edge AI. You still need an actor with this tech; it just improves the fictional character’s ability to interact with whoever outside the fiction is engaging with them, and makes it easier for their lines to be converted into other languages. If the AI knows everything about Geralt and also knows Russian, English inputs can be instantly translated by default – it improves how accessible games are for people in different countries, and is valuable in a whole other way to the harmful, cash-grab applications of the same technology.
It’s a strange and unprecedented time we live in. Cutting-edge technology allows for action that has never needed to be legislated before this precise moment, which means there is a natural grace period during which people – typically already rich ones – can abuse the lack of measures preventing exploitative behaviour. The least we can do is talk about it – have the conversation now instead of waiting for new statutes to come into play later. When I saw that Witcher mod, I thought, “Hey, more Witcher!” That’s cool in and of itself, and as far as I can tell it’s a non-commercial project, but the simple fact that it exists means a fully commercial one could readily have been made in its place. This technology is already here and is being iterated on daily by who knows how many people all over the world. The time to tackle new-age digital law capable of setting distinct boundaries around the use of AI-generated voices is now, especially when those voices are explicitly being developed for the sake of real-life replication.
I don’t know about you, but the idea of a Nirvana song coming out in 2021 and being passed off as “what Kurt Cobain would have written today” doesn’t quite sit right with me. I wouldn’t be happy about a game studio refusing to call their actor back because they can just use AI to doctor their voice after a handful of voiceover sessions either. I’m all for using this technology for good – improving accessibility, efficiency, and the technological backbone of a realistic character driven by competent AI – but the waters are far too murky for comfort right now.
Cian Maher is the Lead Features Editor at TheGamer. He’s also had work published in The Guardian, The Washington Post, The Verge, Vice, Wired, and more. You can find him on Twitter @cianmaher0.