• Aggravationstation@feddit.uk

    The year is 1997. A young boy is about to watch a porn video for the first time on a grainy VHS tape. An older version of himself from the far-off year of 2025 appears.

    Me: “You know, in the future, you’ll make your own porn videos.”

    90s me: “Wow, you mean I’ll get to have lots of hot sex!?!?”

    Me: “Ha! No. So Nvidia will release this system called CUDA…”

    • CodexArcanum@lemmy.dbzer0.com

      I thought this was going to go Watchmen for a moment. Like…

      It is 1997, I am a young boy, I am jerking off to a grainy porno playing over stolen Cinemax.

      It is 2007, I am in my dorm, I am jerking off to a forum thread full of hi-res porno.

      It is 2027, I am jerking off to an AI porno stream that mutates to my desires in real time. I am about to nut so hard that it shatters my perception of time.

      • Zron@lemmy.world

        The AI camera will still zoom in on the guy’s nuts as you’re about to bust tho.

      • The Pantser@lemmy.world

        Oops, the stream hallucinated and mutated into a horror show of women with nipples that are mouths and dicks with eyeballs.

        • Bgugi@lemmy.world

          I cannot for the life of me find it, but there was an article, before the modern explosion of ML into the public consciousness, about a guy who found that one of the big search engines offered an API to “rate” NSFW images on a numeric scale. The idea was that site admins could use it to automatically filter content with a bit more granularity.

          So naturally, he trained a naive image-generating model with the NSFW score as the sole metric of success, and got results much like you describe. Results were pretty much “what if Dalí and Cronenberg had a baby with Slaanesh?”

        • CodexArcanum@lemmy.dbzer0.com

          I figured the rule of threes meant it was funnier to leave it out. 2017 would have been sad gooning to Pornhub during the first Trump nightmare.

          Then 2027 could be sad gooning to AI hyperporn during the second Trump nightmare.

          Maybe I should have used 20-year jumps, but “2037, I am jerking off because there’s no food, and the internet is nothing but AI porn” didn’t seem as funny a point for the “time shattering” bit.

    • turnip@lemm.ee

      Then another company called DeepSeek will release a system called low-level programming that replaces CUDA.

  • Jo Miran@lemmy.ml

    I am going to make this statement openly on the Internet. Feel free to make AI generated porn of me as long as it involves adults. Nobody is going to believe that a video of me getting railed by a pink wolf furry is real. Everyone knows I’m not that lucky.

    • Technus@lemmy.zip

      Fortunately, most of my family is so tech illiterate that even if a real video got out, I could just tell them it’s a deepfake and they’d probably believe me.

    • sunzu2@thebrainbin.org

      Yeah, after a certain point this shit doesn’t really do anything… But younger people and mentally vulnerable people are gonna be rocked by this tech lol

  • Dr. Moose@lemmy.world

    Good.

    Hot take, but I think AI porn will be revolutionary, and mostly in a good way. The sex industry is an extremely wasteful and inefficient use of our collective time that also often involves a lot of abuse and dark business practices. It’s just somehow taboo to even mention this.

      • UltraGiGaGigantic@lemmy.ml

        Same with Photoshop. I wish people didn’t do bad things with tools either, but we can’t let the worst of us define the world we live in.

    • JackFrostNCola@lemmy.world

      Sometimes you come across a video and you’re like, ‘oh, this. I need more of THIS.’
      And then you start tailoring searches to try to find more of the same, but you keep getting generic or repeated results because of the lack of well-described/defined content and the overuse of video tags (obviously to try to get more views by casting a large net rather than being specific).
      But would I watch AI content if I could feed it a description of what I want? Hell yeah!

      I mean there are only so many videos of girls giving a blowjob while he eats tacos and watches old Black & White documentaries about the advancements of mechanical production processes.

      • DUMBASS@leminal.space

        I hate it when you find a good video that does the job really well, and a few days later you’re like, yeah, let’s watch that one again. You type in every single description you can about the video and find everything else but the video you want, and none of it is anywhere near as good.

        Hell I’d take an AI porn search engine for now, let me describe in detail what the video I’m looking for is so I can finally see it again.

        • Scrollone@feddit.it

          Always, always download your favourite videos. Today they’re available; tomorrow, you don’t know. (Remember the great PornHub purge? Pepperidge Farm remembers.)

      • Appoxo@lemmy.dbzer0.com

        Not to mention the users who may have a specific interest in some topic/action where basically all potential sources are locked behind paywalls.

        • Eugene V. Debs' Ghost@lemmy.dbzer0.com

          Can confirm with my fetish. There are some great artists and live actors who do it, but 90% of the content for it online is bad MS-Paint-level edits and horrid acting with props. That 10%? God tier. The community showers them in praise and commissions, and they only stop when they want to, unless a payment service like Visa or Patreon censors them and their livelihood as consenting adults.

    • ZILtoid1991@lemmy.world

      Yes, but it will also hurt those very workers’ bottom line, and with some clever workarounds it’ll be used to fabricate defamatory material that other methods were not good at producing.

      • SoftestSapphic@lemmy.world

        Good. The sooner this system fails to provide for more people, the sooner we form mobs and rid ourselves of it.

        The human race needs to progress to its next evolutionary step: a post-scarcity approach to world economies.

        • ZILtoid1991@lemmy.world

          AI won’t bring us a post-scarcity world, but one with the utmost surveillance and “art” no longer made by artists.

          • SoftestSapphic@lemmy.world

            We already live in a post-scarcity world: we produce more than enough goods to meet every living person’s needs, we just throw away more than half of all food and clothing produced instead of giving it to the hungry, because it isn’t profitable.

  • TachyonTele@lemm.ee

    Who are the girls in the picture? We can do this, team. Left to right, starting at the top.

    1. Gwen Stacy

    2. ??

    3. Bayonetta

    4. Little Mermaid

    5. ??

    6. ??

    7. Jinx

    8. ??

    9. Rei

    10. Rei

    11. lol Rei

    12. Aerith

        • MeekerThanBeaker@lemmy.world

          You shouldn’t judge someone for having a grass fetish. They have wants and needs like anyone else.

          When the lawn gets cut, which activates its distress signal with that sweet fresh fragrance… OP can’t help but get off on that.

        • Dsklnsadog@lemmy.dbzer0.com

          You must be fun because you love pornstars! I don’t, so I’m boring! I’m going for some fun grass now. Have a good one.

        • Dsklnsadog@lemmy.dbzer0.com

          I usually don’t rely on pornstars, and if I do, I don’t repeat or remember their faces. But I guess there is a big market of pro-fappers on Lemmy.

            • L0rdMathias@sh.itjust.works

              Based on their response habits, it’s likely a poorly made AI or a 13-year-old kid. Not worth interacting with, because either it is incapable of caring, or they really aren’t supposed to be here, and we shouldn’t welcome children into adult spaces by letting them into the conversation. If they don’t wanna have a discussion, then why would they contribute to the conversation when spoken to?

            • Dsklnsadog@lemmy.dbzer0.com

              Oh, I’m sorry, I didn’t know. That perfectly proves my point that some people need to touch some grass, but anyway, I’m not here to judge anyone’s life; it was just a little piece of advice. That’s all from me. Have a good one.

              • TachyonTele@lemm.ee

                Let me guess: your parents don’t let you watch movies or play video games, so you have issues with adults who have done so their entire lives. That’s too bad.

                • Dsklnsadog@lemmy.dbzer0.com

                  Oh, you wanna talk about me? My parents bought a chipped PS1 when I was a kid and I loved it. Now I have a PS5 (I used to play AC, GTA, and FIFA, but I got bored, so it became a “YouTube frontend”). I had cable TV and a PC with Windows 95/98/XP/Vista (according to the era) and later Ubuntu. Great childhood, but I also had friends; I used to play football every single weekday (the one with the foot and the ball). Now I only play once a week on my local team (but I do it really badly, so I play defense). Anything more you wanna know? I’m all for sharing! =) I know, I know, too much about me. Have a great day. And next time just ignore or take the advice. If you feel I’m trolling, just don’t feed me. Bye!!

  • Ulrich@feddit.org

    Oh my God! That’s disgusting! AI porn online!? Where!? Where do they post those!? There’s so many of them, though. Which one?

    • Regrettable_incident@lemmy.world

      Yeah, but I’m sure it’ll improve in time. I can’t really see what’s the point of AI porn though, unless it’s to make images of someone who doesn’t consent, which is shit. For everything else, regular porn has it covered and looks better too.

      • rabber@lemmy.ca

        The porn industry destroys women’s lives, so if AI becomes indistinguishable, then women won’t need to sell their bodies in that way anymore.

          • rabber@lemmy.ca

            You want me to find a citation explaining why women selling their bodies is bad?

            • bigb@lemmy.world

              Somehow you’re both partially wrong. There are people who have been badly abused by the porn industry. There’s always a need to make sure people are safe. But there’s nothing wrong if someone wants to willingly perform in pornography.

              • Vatowine@lemmy.world

                I would argue that in terms of being worn out (joints and stuff), construction is definitely harder on the body. How many regret that they can barely walk in their 50s and 60s?

              • rabber@lemmy.ca

                There aren’t very many pornstars who don’t regret it. You can find countless examples of regret.

                • Nalivai@lemmy.world

                  But that’s mostly because of you people. You make their lives miserable with pointless moralisation. You are the reason the industry is full of shady monsters; you made it that way with your constant religious fervor.

            • win95@lemmy.zip

              I mean, they still have their body after shooting porn, so they’re not really selling their body any more than, let’s say, a construction worker.

              • rabber@lemmy.ca

                Except those jobs don’t ruin your dignity, reputation, etc.

            • gamer@lemm.ee

              Why are you assuming that it is? Maybe it’s because I’m not a religious person, but I don’t see anything morally wrong with sex work. Whether someone is doing it against their will is a separate issue, but that’s not an assumption I’d make without other evidence.

              If you really are coming at this issue from a religious point of view, then there’s no point getting into a discussion here since I’m not going to change your mind on that (nor do I care to; believe what you want). Otherwise, I’m curious what your actual arguments might be.

      • Ibaudia@lemmy.world

        It’s pleasing to the eye, but it lacks the soul and passion of a real, human gooner who just wants to make people cum.

  • MoonlightFox@lemmy.world

    First off, I am sex positive, pro-porn, pro-sex-work, and don’t believe sex work should be shameful; I think there is nothing inherently wrong with buying intimacy from a willing seller.

    That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest one.

    I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people’s work and take people’s jobs.

    I think another major point to consider going forward is whether it’s problematic that people can generate all sorts of illegal stuff. If it’s AI generated, it’s a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of those things being legal, but I can’t logically argue for making something illegal without a victim.

    • TheGrandNagus@lemmy.world

      I think another major point to consider going forward is whether it’s problematic that people can generate all sorts of illegal stuff. If it’s AI generated, it’s a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of those things being legal, but I can’t logically argue for making something illegal without a victim.

      I’ve been thinking about this recently too, and I have similar feelings.

      I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?

      More importantly, what should it be?

      It probably goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn’t?

      If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour, or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour.

      And to know that, we’d need extensive and extremely controversial studies. Beyond that, even if allowing this stuff to be generated is an overall positive (and I don’t know whether it would be), will many politicians actually call for it to be allowed? It seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten-foot pole.

      • Nighed@feddit.uk

        I think the concern is that although it’s victimless, if it’s legal it could… normalise (within certain circles) the practice. This might make users more confident about doing something that does create a victim.

        Additionally, how do you tell if it’s real or generated? And as AI gets better, how will you tell?

      • General_Effort@lemmy.world

        what is the law’s position on AI-generated child porn?

        Pretend (simulated) underage porn is illegal in the EU and some other countries. In the US, I believe, it is protected by the First Amendment.

        Mind that when people talk about child porn or CSAM, that means anything underage as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There have been some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.

      • WeirdGoesPro@lemmy.dbzer0.com

        It’s so much simpler than that: it can be created now, so it will be. People will use narrative twists to post it on the clearnet, just like they do with anime (she’s really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to be happening.

        The only question is whether politicians will stop mudslinging long enough to have an adult conversation, or whether we’ll just shove everything into the more obscure parts of the internet and let it police itself.

      • michaelmrose@lemmy.world

        Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true.

        1. Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.

        2. Bob can’t claim actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.

        A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who can, even if they haven’t done any harm yet, be profitably prosecuted to society’s ultimate benefit, so long as you value innocent kids more than perverts.

        • shalafi@lemmy.world

          Am I reading this right? You’re for prosecuting people who have broken no laws?

          I’ll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?

          This sounds like some Minority Report hellscape society.

          • Clent@lemmy.dbzer0.com

            Correct. This quickly approaches thoughtcrime.

            What about an AI gen of a violent rape and murder? Shouldn’t that also be illegal?

            But we have movies that have portrayed that sort of thing, graphically, for years. Do those then become illegal after the fact?

            And we also have movies of children being victimized, so do those likewise become illegal?

            We already have studies showing that watching violence does not make one violent, and while some refuse to accept that, it is well-established science.

            There is no reason to believe the same isn’t true for watching sexual assault. There have been many, many movies that contain such scenes.

            But ultimately the issue will become that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient, but it’s possible, and efficiency will increase.

            The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication; even then, I’m sure workarounds would occur.

            Prohibition requires society to sacrifice freedoms, and we have to decide what we’re willing to sacrifice here, because as we’ve seen with other prohibitions, once we unleash the law on one, it can be impossible to undo.

            • michaelmrose@lemmy.world

              Ok, watch adult porn, then watch a movie in which women or children are abused. Note how the abuse is in no way sexualized, exactly the opposite of porn. It often takes place off screen, and when rape appears on screen at all, little to no nudity co-occurs. For children it basically always happens off screen.

              Simulated child abuse has been federally illegal for ~20 years in the US, and we appear to have very little trouble telling the difference between prosecuting pedos and cinema, even while we have struggled enough with sexuality in general.

              But ultimately the issue will become that there is no way to prevent it.

              This argument works well enough for actual child porn. We certainly don’t catch it all, but every prosecution takes one more pedo off the streets. The net effect is positive. We don’t catch most car thieves either, and nobody suggests we legalize car theft.

          • michaelmrose@lemmy.world

            Am I reading this right? You’re for prosecuting people who have broken no laws?

            No, I’m for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal.

            • Petter1@lemm.ee

              Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind it?

              Demand doesn’t really drop when something is illegal (same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable (which attracts shady money-grabbers who hate regulation, don’t give a shit about law enforcement, and therefore do illegal stuff for money), and means you have to pay a shitton of government money maintaining all the prisons.

              • michaelmrose@lemmy.world

                Basically every pedo in prison is one who isn’t abusing kids. Every pedo on a list is one who won’t be left alone with a young family member. Reducing AI CP doesn’t, by itself, actually do anything.

                • AwesomeLowlander@sh.itjust.works

                  Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies show, that dealing with minor-attracted people before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation to be had. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.

        • MoonlightFox@lemmy.world

          Good arguments. I think I am convinced that both cases should be illegal.

          If the pictures are real, they probably increase demand, which is harmful. If the person knew, then the action should result in jail and forced therapy.

          If the pictures are not, forced therapy is probably the best option.

          So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in one case it is “victimless”. If professionals don’t find them plausibly rehabilitated, then jail time for them.

          I would assume (but don’t really know) most pedophiles don’t truly want to act on it, and don’t want to have those urges. And would voluntarily go to therapy.

          Which is why I am convinced prevention is the way to go, not sacrificing privacy. In Norway we have anonymous ways for pedophiles to seek help. There were posters and ads for it in a lot of places a year or so back. I have not researched how it works in practice, though.

          Edit: I don’t think the therapy we have in Norway is conversion therapy. It’s about minimizing risk and helping deal with the underlying causes, medication, childhood trauma etc. I am not necessarily convinced that conversion therapy works.

          • foggenbooty@lemmy.world

            Therapy is well and good and I think we need far more avenues available for people to get help (for all issues). That said, sexuality and attraction are complicated.

            Let me start by saying I am not trying to claim a 1:1 equivalence; this is just a comparison. But we have long abandoned conversion therapy for homosexuals because we’ve found these preferences are core to them and not easily overwritten. The same is true for me as a straight person: I don’t think therapy would help me find men attractive. I have to imagine the same is true for pedophiles.

            The question is, if AI can produce pornography that can satisfy the urges of someone with pedophilia without harming any minors, is that a net positive? Remember the attraction is not the crime, it’s the actions that harm others that are. Therapy should always be on the table.

            This is a tricky subject because we don’t want to become thought police, so all our laws are built in that manner. However, there are big exceptions for sexual crimes due to the gravity of their impact on society. It’s very hard to “stand up” for pedophilia because, if acted upon, it has monstrous effects, but AI is making us open a can of worms that I don’t believe we ever really thought through beyond criminalizing and demonizing (which, it could be argued, was the correct approach with the technology at the time).

            • ubergeek@lemmy.today
              2 months ago

              That said, sexuality and attraction are complicated.

              There’s nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.

    • KillingTimeItself@lemmy.dbzer0.com

      I have no problem with AI porn assuming it’s not based on any real identities; I think that should be considered identity theft or impersonation or something.

      Outside of that, it’s more complicated, but I don’t think it’s a net negative. People will still thrive in the porn industry; it’s been around for as long as it’s been possible, and I don’t see why it wouldn’t continue.

      • Dr. Moose@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 months ago

        Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?

        • KillingTimeItself@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          2 months ago

          revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and i would argue stealing someone’s identity/impersonation.

          To be clear, your example is a sketch of Johnny Depp; i’m talking about an entirely manufactured video of a person that resembles the likeness of another person. Those are fundamentally two different things.

            • KillingTimeItself@lemmy.dbzer0.com
              link
              fedilink
              English
              arrow-up
              0
              ·
              2 months ago

              sort of. There are arguments that private ownership of these videos is also weird and shitty; however, i think impersonation and identity theft are going to be the two most broadly applicable instances of relevant law here. Otherwise i can see issues cropping up.

              Other people do not have any inherent rights to your likeness; you should not simply be able to pretend to be someone else. That’s considered identity theft/fraud when done with legally identifying papers, and i think it’s a similar case here.

              • Dr. Moose@lemmy.world
                link
                fedilink
                English
                arrow-up
                0
                ·
                2 months ago

                But the thing is, it’s not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI were not involved? Sure, it can be creepy and weird and whatnot, but it’s not inherently harmful, or at least it’s not obvious how it would be.

                • KillingTimeItself@lemmy.dbzer0.com
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  2 months ago

                  the only conceivable reason to create these videos is either private consumption, in which case, who gives a fuck, or public distribution, otherwise you wouldn’t create them. And you’d have to be a bit of a weird breed to create AI porn of specific people for private consumption.

                  If AI isn’t involved, the same general principles would apply, except it might include more people now.

      • ubergeek@lemmy.today
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 months ago

        i have no problem with ai porn assuming it’s not based on any real identities

        With any model in use, currently, that is impossible to meet. All models are trained on real images.

        • KillingTimeItself@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          0
          ·
          2 months ago

          With any model in use, currently, that is impossible to meet. All models are trained on real images.

          yes but if i go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person close enough to perceptibly be them?

          You are literally using the schizo argument right now: “If an artist creates a piece depicting no specific person, but his understanding of persons is based inherently on the facial structures of other people that he knows and recognizes, therefore he must be stealing their likeness.”

          • ubergeek@lemmy.today
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 months ago

            No, the problem is a lack of consent of the person being used.

            And now, being used to generate depictions of rape and CSAM.

            • KillingTimeItself@lemmy.dbzer0.com
              link
              fedilink
              English
              arrow-up
              1
              ·
              2 months ago

              yeah, but legally, is this even a valid argument? Sure, there is technically probably like 0.0001% of the average person being used in any given result of an AI-generated image. I don’t think that gives anyone explicit rights to that portion, however.

              That’s like arguing that a photographer who captured you in a random photo in public that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.

              You can argue about consent all you want, but at the end of the day, if you’re posting images of yourself online, you are consenting to other people looking at them, at a minimum. You are arguably implicitly consenting to other people being able to use those images (because you can’t stop them from doing that, except via copyright, and that’s not very strict in most cases).

              And now, being used to generate depictions of rape and CSAM.

              i don’t see how this is even relevant unless the person in question is a minor, a victim, or becoming a victim; otherwise it’s no different than me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But i don’t know of any laws that prevent you from doing it, unless it explicitly involves something like blackmail, extortion, or harassment.

    • frezik@midwest.social
      link
      fedilink
      English
      arrow-up
      1
      ·
      2 months ago

      I’ve found that there’s a lot of things on the Internet that went wrong because it was ad supported for “free”. Porn is one of them.

      There is ethically produced porn out there, but you’re going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it’s mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.

      Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.

    • froggycar360@slrpnk.net
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 months ago

      What’s illegal in real porn should be illegal in AI porn, since eventually we won’t know whether it’s AI.

      • jacksilver@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 months ago

        That’s the same as saying we shouldn’t be able to make videos with murder in them because there is no way to tell if they’re real or not.

          • jacksilver@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 months ago

            I mean, a lot of TV has murders in it. There is a huge market for showing realistic murder.

            But I get the feeling you’re saying that there isn’t a huge market for showing real people dying realistically without their permission. But that’s more of a technicality. The question is whether the content or the production of the content is illegal. If it’s not a real person, who is the victim of the crime?

            • froggycar360@slrpnk.net
              link
              fedilink
              English
              arrow-up
              1
              ·
              2 months ago

              Yeah, the latter. Also, murder in films is for the most part storytelling. It’s not murder simulations for serial killers to get off to, you know what I mean?

      • MoonlightFox@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 months ago

        It can generate combinations of things that it is not trained on, so not necessarily a victim. But of course there might be something in there, I won’t deny that.

        However, the act of generating something does not create a new victim, unless someone’s likeness is used and it is shared? Or is there something ethical here that I am missing?

        (Yes, all current AI is basically collective piracy of everyone’s IP, but setting that aside.)

        • surewhynotlem@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          arrow-down
          1
          ·
          2 months ago

          Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.

          So take that video and modify it a bit. Color correct or something. That’s still abuse, right?

          So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?

          That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?

          I can’t make that call. And because I can’t make that call, I can’t support the concept.

          • KairuByte@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 months ago

            I mean, there’s another side to this.

            Assume you have exacting control of training data. You give it consensual sexual play, including rough play, bdsm play, and cnc play. We are 100% certain the content is consensual in this hypothetical.

            Is the output a grey area, even if it seems like real rape?

            Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?

            Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?

            We can build on that further. What if they take the time to animate this scene? Is that a grey area?

            When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?

            Or is it none of them?

            • Clent@lemmy.dbzer0.com
              link
              fedilink
              English
              arrow-up
              1
              ·
              2 months ago

              We already allow simulated rape in tv and movies. AI simply allows a more graphical portrayal.

          • dubyakay@lemmy.ca
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 months ago

            It’s not just AI that can create content like that though. 3d artists have been making victimless rape slop of your vidya waifu for well over a decade now.

            • surewhynotlem@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              arrow-down
              1
              ·
              2 months ago

              Yeah, I’m ok with that.

              AI doesn’t create, it modifies. You might argue that humans are the same, but I think that’d be a dismal view of human creativity. But then we’re getting weirdly philosophical.

          • MoonlightFox@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 months ago

            I see the issue of how much crime in the training data is enough to matter, and the gray area. I can’t make that call either, but I kinda disagree with the black-and-white conclusion. I don’t need something to be perfectly ethical; few things are. I do, however, want to act in an ethical manner, and strive to be better.

            Where do you draw the line? It sounds like you mean no AI can be used in any case unless all the training material has been carefully vetted?

            I highly doubt any of big tech’s AI models, whatever their size, are free of illegal content.

            I am not sure where I draw the line, but I do want to use AI services, just not for porn.