The internet has done a commendable job of mocking NFTs to death, or at least into hiding – big game developers like Ubisoft who initially showed interest have mercifully stopped bringing them up – and some are now hoping the same "make them so uncool no one will touch them" tactic can be used to undermine another trend: the rapidly advancing AI-powered image generators that spit out uncanny, seductive pictures of our friends and stills from imaginary David Lynch Warhammer fantasy films.
I think they will be disappointed. The “art” of AI isn’t going anywhere.
In one sense, NFTs and AI art are opposites: NFTs promise that every piece of digital artwork can be a unique and valuable commodity, while AI art promises to eradicate the value of digital art by flooding the internet with an endless supply of it. If Jimmy Fallon stopped showing off his dumb NFT monkey pictures, I don't think most people would notice, but the fast, cheap generation of AI images has made it hard not to see more and more of them. If you've used social media over the past year, you've seen images generated by artificial intelligence.
I highly doubt it's a temporary fad. Where blockchain investing is criticized as pointless waste generation, AI art is lamented because it threatens illustrators' jobs – which is to say, everyone can see the value of a machine that turns words into pictures. It's hard to resist trying one, even if you object to it in principle. If someone tells you they have a machine that can make a picture of anything, how could you not want to test the claim at least once?
The way we interact with these machine learning algorithms reminds me of the way people tease babies, delighting in their responses to novel stimuli and seizing on anything that could be taken as a sign that they understand us. When an image generator seems to "get" what we asked for, a strange and amusing feeling ensues – it's hard to believe that a computer program has succeeded in translating a prompt as complex as "John Oliver looking lovingly at his cabbage after realizing he's falling in love" into an image, but there it is, undeniably, on the screen in front of us.
And that, I think, is what makes AI art offensive to so many. It isn't just the automation of work, but the automation of creative work, which strikes people as obscene. Something regarded as deeply human has been turned into a party trick.
The good news or bad news for humanity is that the sleight of hand is easy to spot: image generators can do nothing unless they're trained on heaps of human-made artwork and photographs, in some cases without the consent of the artists whose work was used. In fact, the wildly popular Lensa AI avatar maker reproduces garbled signatures: the mutilated remains of the real artists' work it was fed.
One early attempt to rescue AI art from this criticism is easily dismissed, if you ask me. The claim goes that by scraping artists' online portfolios for training material, AI art generators are "doing just what human artists do" by "learning" from existing artwork. Sure, humans learn in part by imitating and building on the work of others, but this casual anthropomorphism – treating algorithms that crawl millions of images as if they were living beings off to art school – isn't a position I accept lightly. It feels far too early to grant human nature to silicon chips just because they can now spit out pictures of cats on demand, even if those pictures sometimes look like a person made them.
"I'm cropping it out for privacy reasons/because I'm not trying to call anyone out. These are all portraits from Lensa where the distorted remains of an artist's signature are still visible – the signature remnants of one of the many artists whose work was taken from them." – December 6, 2022
Beyond flattering images
The interesting thing about AI-generated images, to me, is that they usually don't look man-made. One way machine learning's inhumanity manifests is in its lack of self-awareness. AI art generators don't tear up their failures, get bored, or grow frustrated with their inability to depict hands that could exist in Euclidean space. They can't judge their own work, at least not in any way a human would recognize, and that fearlessness leads to surprising images: images we've never seen before, which some artists are using as inspiration.
Rick and Morty creator Justin Roiland played with AI art generation in the making of High on Life, for example, telling Sky News that it helped the development team "come up with weird and funny ideas" and "make the world feel like a strange alternate universe to our own."
Image generation is just one of the ways machine learning can be used in games, which are already full of procedural systems such as level generators and dynamic animations. A young company called Anything World, for example, uses machine learning to animate 3D animals and other models on the fly. What might a game like No Man's Sky – whose procedurally generated planets and wildlife stop feeling novel after enough starship jumps – look like after another decade of machine learning research? What would it be like to play games in which NPCs could behave in truly unexpected ways, say, by "writing" unique songs about our adventures? I think we'll probably find out. After all, our favorite RPG of 2021 was a "procedural storytelling" game.
"I don't want Epic to be a company that stifles innovation. We've been on the wrong side of that many times: Apple says 'you can't make a payment system' and 'you can't make a browser engine.' I don't want to be a 'you can't use AI' company or a 'you can't make AI' company." – December 25, 2022
As valid as the moral objections may be, the expansion of machine learning into art – and into everything else people do – currently looks about as stoppable as the cruise ship plowing into the island at the end of Speed 2: Cruise Control.
Users of ArtStation, the art portfolio host recently acquired by Unreal Engine and Fortnite maker Epic Games, protested the unauthorized use of their work to train AI algorithms, and Epic added a "NoAI" tag that artists can use to explicitly disallow the use of their content by AI systems. That doesn't mean Epic is opposed to AI art in general, though. According to Epic Games CEO Tim Sweeney, some of its artists consider the technology "revolutionary" in the same way Photoshop was.
"I don't want to be a 'you can't use AI' company or a 'you can't make AI' company," Sweeney said on Twitter. "A lot of Epic artists are experimenting with AI tools in their hobby projects and see it as revolutionary in the same way that previous things like Photoshop, ZBrush, Substance, and Nanite were. We hope the industry can find a clearer role for it in supporting artists."
It is, of course, possible to train these algorithms without devouring other people's artwork sans permission. Perhaps there's a world in which artists are paid to produce training material for machine learning models, though I don't know how many artists would consider that much better. And other anxieties arise from the widespread use of AI regardless of how it's trained: What biases might common algorithms carry, and how might they shape our perception of the world? How will schools and competitions cope with AI-laundered plagiarism?
Machine learning is being put to use in all sorts of other fields, from graphics technologies such as Nvidia DLSS to self-driving cars and nuclear fusion research, and it only gets stronger from here. Unlike the blockchain revolution we keep rolling our eyes at, machine learning represents a real change in how we understand and interact with computers. The moral, legal, and philosophical quagmire is only beginning to open up: it will get deeper and deeper from here. And our friends' selfies will keep getting more flattering.