• EldritchFeminity@lemmy.blahaj.zone · 11 months ago

    See, I agree with pretty much everything you say here, because my problem (and that of the artists who oppose AI) is with Capitalism, full stop. People have to monetize their skills in order to survive if they want to spend their time doing something they love. Even hobbies have now become “side hustles.” Many indie game studios start out as a hobby before the people working on the game dip into their savings to develop full-time, so they can spend more time on their passion. Then the game doesn’t turn a profit and they go back to making smaller projects as hobbies while doing something else for work. That’s where the fear is: artists love making art, but if you do it professionally, AI that mimics your art is basically on the same level as knock-off products imitating name-brand designs.

    My issue with MidJourney, for example, wouldn’t be an issue if the concern over taking business away from artists were moot. You say that a professional career can’t be reduced down to a style, but then what is MidJourney doing, and what is the difference? Because 4,000 of the 5,000 “style” prompts that you can feed MidJourney are artists’ names, and that list is apparently growing, according to the Discord logs - somebody mentioned having a list of 15,000 new artists’ names to add to the prompts after their art is scraped. You can say “make me an impressionist landscape,” but you can also say “make me a landscape by Sara Winters.” Would it be okay to have MidJourney make you paintings by a specific artist and then sell them? Is that just a style, or is it copyright infringement? I can easily see the case where that could be considered damaging to Sara’s (in this example) “market” as a professional, even if you aren’t selling the paintings you make, because MidJourney is explicitly providing the tools to create work that mimics her art with the intent of cutting her out of the equation. At that point, have we crossed the line into forgery?

    We unfortunately live in a capitalist society, and we have to keep that in mind. People need to market their skills as a job in order to afford the essentials to live. Beyond a certain point, the time investment in something demands that you make money doing it. AI as a tool has the capability to be monumentally helpful; we could even see a fundamental paradigm shift in how we think of the act of creativity. But it also has the potential to be monstrously harmful, as we’ve already seen with fake nudes of underage teens and false endorsements for products and political campaigns. Somebody took a screenshot of a picture an artist was working on during a Twitch stream, supposedly ran it through some sort of image generator, and then tried to threaten the artist by claiming they held the copyright to it. There was even a DA whom somebody tried to scam using an AI-generated copy of his son’s voice, claiming the son was in prison. Letting it go unregulated is incredibly risky, and that goes for corporate AI uses as well. We need to be able to protect ourselves from them as much as we need to protect ourselves from bad actors, and part of that is saying what is and isn’t an acceptable use of AI and of the data that goes into training it. Otherwise, people are going to use tools like Nightshade to try to protect their livelihoods from a threat that may or may not be imagined.

    • Even_Adder@lemmy.dbzer0.com · 11 months ago

      > My issue with MidJourney, for example, wouldn’t be an issue if the concern over taking business away from artists were moot. You say that a professional career can’t be reduced down to a style, but then what is MidJourney doing, and what is the difference? Because 4,000 of the 5,000 “style” prompts that you can feed MidJourney are artists’ names, and that list is apparently growing, according to the Discord logs - somebody mentioned having a list of 15,000 new artists’ names to add to the prompts after their art is scraped. You can say “make me an impressionist landscape,” but you can also say “make me a landscape by Sara Winters.” Would it be okay to have MidJourney make you paintings by a specific artist and then sell them? Is that just a style, or is it copyright infringement? I can easily see the case where that could be considered damaging to Sara’s (in this example) “market” as a professional, even if you aren’t selling the paintings you make, because MidJourney is explicitly providing the tools to create work that mimics her art with the intent of cutting her out of the equation. At that point, have we crossed the line into forgery?

      You should read the article I linked earlier. There’s no problem as long as you’re not using their name to sell your works. Styles belong to everyone; no one person can lay claim to them.

      Specific expressions deserve protection, but wanting to limit others from expressing the same ideas differently is both selfish and harmful, especially when they aren’t directly copying or undermining your work.

      > We unfortunately live in a capitalist society, and we have to keep that in mind. People need to market their skills as a job in order to afford the essentials to live. Beyond a certain point, the time investment in something demands that you make money doing it. AI as a tool has the capability to be monumentally helpful; we could even see a fundamental paradigm shift in how we think of the act of creativity. But it also has the potential to be monstrously harmful, as we’ve already seen with fake nudes of underage teens and false endorsements for products and political campaigns. Somebody took a screenshot of a picture an artist was working on during a Twitch stream, supposedly ran it through some sort of image generator, and then tried to threaten the artist by claiming they held the copyright to it. There was even a DA whom somebody tried to scam using an AI-generated copy of his son’s voice, claiming the son was in prison. Letting it go unregulated is incredibly risky, and that goes for corporate AI uses as well. We need to be able to protect ourselves from them as much as we need to protect ourselves from bad actors, and part of that is saying what is and isn’t an acceptable use of AI and of the data that goes into training it. Otherwise, people are going to use tools like Nightshade to try to protect their livelihoods from a threat that may or may not be imagined.

      We already have countless laws against the misuse of computer systems, and they adequately cover these cases. I’m confident we’ll be able to deal with all of that and reap the benefits.

      Open-source AI development offers critical solutions. By making AI accessible, we maximize public participation and understanding, foster responsible development, and prevent harmful attempts at control. Their AI will never work for us, and look at just who is trying their hand at regulatory capture. I believe John Carmack put it best.