GenAI tools ‘could not exist’ if firms are made to pay copyright

  • hedgehog@ttrpg.network · 11 months ago

    “My interpretation is that people go by gut feeling and never think of the consequences.”

    Often, yes.

    “The question is, why does their gut give them a far-right answer?”

    The political right exploits fear, and the fear of AI hits close to home. Many people have been impacted, could be impacted, or know someone who could be impacted, whether by AI itself or by something that has been enabled by AI or blamed on it.

    When you’re afraid or operating from a vulnerable position, it’s a lot easier to jump on the anti-AI bandwagon. That’s especially true when the counter-arguments address your flawed reasoning rather than the actual problems. You need something that fixes the problem, not a sound argument about why a particular attempt to fix it is flawed. And when that problem is staring you in the face, the broader implications of the position just aren’t that important to you.

    People are losing income because of AI, and our society does not have enough safety nets in place to make that less terrifying. If you swap “AI” for “offshore outsourcing,” it’s the same thing.

    The people arguing in favor of AI don’t have good answers for these folks about what needs to happen to “fix the problem.” The people arguing against AI don’t need sound arguments to win them over; their arguments just need to sound like they could “fix the problem.” “If they win this lawsuit against OpenAI, ChatGPT and all the other LLMs will be shut down and companies will have to hire real people again. Anthropic even said so, see!”

    UBI would solve a lot of the problems, but it doesn’t have the political support of our elected officials in either party, and the effort it would take to completely upend the makeup of Congress is so high that it’s obviously not a short-term solution.

    Unions are a better short-term option, but that’s still not enough.

    One feasible solution would be legislation restricting or taxing corporate use of AI, particularly when that use displaces human workers. If those taxes were then used to support the displaced workers, that would both encourage corporations to hire real people and lessen the sting of getting laid off.

    I think another big part of this is the feeling of helplessness to do anything about the situation. If you can root for the folks behind the lawsuit, that’s at least something. And it’s empowering to see that people like you - other writers, artists, etc. - are the ones spearheading this, rather than legislators.

    But yes, the more people’s fear is exploited and the more they’re misdirected away from actual solutions, the worse things will get.

    • General_Effort@lemmy.world · 11 months ago

      The fear angle makes a lot of sense, but I wonder how many people are really so immediately threatened that it would cloud their judgment.

      • hedgehog@ttrpg.network · 11 months ago

        Well, when you consider that more than 60% of Americans are living paycheck to paycheck, I’d say a lot of them.