Hey there, sometimes I see people say that AI art is stealing real artists’ work, but I also saw someone say that AI doesn’t steal anything. Does anyone know for sure? Also, here’s a Twitter thread by Marxist Twitter user ‘Professional hog groomer’ talking about AI art: https://x.com/bidetmarxman/status/1905354832774324356

  • amemorablename@lemmygrad.ml · 12 days ago

    It’s a multifaceted thing. I’m going to refer to it as image generation, or image gen, cause I find that’s more technically accurate than “art” and doesn’t imply a connotation of artistic merit that isn’t earned.

    Is it “stealing”? Image gen models have typically been trained on a huge amount of image data, in order for the model to learn concepts and be able to generalize. Whether that’s because of the logistics of getting permission, a lack of desire to ask, or a fear that permission would be refused and the projects would never get off the ground, I don’t know; but many AI models, image and text, have been trained in part on copyrighted material that they didn’t get permission to train on. This is usually where the accusation of stealing comes in, especially in cases where, for example, an image gen model can almost identically reproduce an artist’s style from start to finish.

    On a technical level, the model is generally not going to reproduce a training image exactly, and it doesn’t have any human-readable internal record of one, like you might find in a text file. It can imitate, and if overtrained on something it might produce output so similar that it seems like a copy, but some people get confused and think this means models have a “database” of images in them (they don’t).
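    The “no database” point can be sketched with a toy model (purely an illustration I’m adding, not how any real image generator works; those have billions of parameters, but the principle is the same): after training, what gets stored is a set of learned parameters, not the training examples themselves.

```python
import numpy as np

# Toy "training set": 1000 noisy observations of an underlying trend.
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=1000)
y = 3.0 * x + 1.0 + rng.normal(0, 2.0, size=1000)

# "Training": a least-squares fit. Everything learned ends up in 2 numbers.
A = np.stack([x, np.ones_like(x)], axis=1)
weights, *_ = np.linalg.lstsq(A, y, rcond=None)

print(weights.size)   # 2 learned parameters, versus 2000 training values

# The "model" can reproduce the general trend, but not the individual
# noisy points it was trained on; that information was never stored.
reconstruction_error = np.abs(A @ weights - y).mean()
print(reconstruction_error)  # stays around the noise scale, not 0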

    Now whether this changes anything as to “stealing” or not, I’m not taking a strong stance on here. If you consider it as something where makers of AI should be getting permission first, then obviously some are violating that. If you only consider it theft when an artist’s style can be reproduced to the extent that the artist is no longer needed to make things highly similar to their own work, some models are also going to be a problem in that way. But this is also getting into…

    What’s it really about? I cannot speak concretely by the numbers, but my analysis is that a lot of it boils down to anxiety over being replaced existentially and anxiety over being replaced economically. The second one largely seems to be a capitalism problem and didn’t start with AI, but has arguably been supercharged by it. Where image gen is different is that it’s focused on generating an entire image from start to finish. This is different from tools like the shape tools in an illustration program, which help with components of the drawing while you still do most of the work. It means someone who understands little to nothing about the craft can prompt a model to make something roughly like what they want (if the model is good enough).

    Naturally, this is a concern from the standpoint of ventures trying either to drastically reduce the number of artists they employ, or to replace them entirely.

    Then there is the existential part, and this I think is a deeper question about generative AI that has no easy answer, but once again, it is something art has been contending with for some time because of capitalism and now has to confront much more drastically in the face of AI. Art can be propaganda, it can be culture and the passing down of stories (Hula dance), or, as is commonly said in the western context in my experience, it can be a form of self-expression. Capitalism has long been watering down “art” into as much of a money-making formula as possible, not caring about the “emotive” stuff that matters to people. Generative AI is, so far, the peak of that trajectory. That’s not to say the only purpose of generative AI is to degrade or devalue art, but it seems to enable about as thorough a “meaningless content mill” as capitalism has managed so far.

    It enables, in other words, the production of “content” that is increasingly removed from any authentic human experience or messaging. What implications this can have, I’m not offering a concluding answer on. One concern I’ve had, and that I’ve seen some others voice, is the cyclical nature of AI: because a model can only generalize so far beyond its dataset, it reproduces a snapshot of a culture at a particular point in time, which might make capitalism’s cultural feedback loops worse.
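    That feedback-loop worry has a simple statistical analogue, sometimes called “model collapse” in the research literature (this is a toy sketch I’m adding, not a claim about any particular product): if each “generation” of a model is fit only to samples produced by the previous generation, variation tends to drain away.

```python
import numpy as np

# Toy feedback loop: each generation fits a Gaussian to a small batch of
# samples drawn from the previous generation's Gaussian, then becomes
# the new "model" that the next generation learns from.
rng = np.random.default_rng(0)
mean, std = 0.0, 1.0          # generation 0: the "real" distribution
initial_std = std

for generation in range(500):
    samples = rng.normal(mean, std, size=5)   # small sample each round
    mean, std = samples.mean(), samples.std(ddof=1)

# Diversity (the std) collapses toward repeating one narrow snapshot.
print(initial_std, std)
```

    Each refit loses a little of the tails it never sampled, and with nothing outside the loop to restore them, the spread shrinks multiplicatively over generations.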

    But I’ll leave it at that for people to think about. It’s a subject I’ve been over a lot with a number of people, and I think it’s worth considering with nuance.