• 0 Posts
  • 4 Comments
Joined 2 years ago
Cake day: June 12th, 2023

  • It’s been a minute since I saw it, but the root reason was that they were taking these young bodies to more or less extend their lives, maybe indefinitely. They kidnap black people because missing-person reports for them are given lower priority than those for white people, which is shown in the scene where Andre’s TSA friend (Rod?) goes to the police and they dismiss his report about the missing black man from the beginning of the movie. Granted, his story and theory about the abduction are hard to believe, but there should’ve been some response to the guy being seen beyond “he looks found to me.”

    And don’t get it twisted, the white people are racist, but they weren’t skinheaded, card-carrying, hood-wearing, hard-r racists. They were a modern affluent family you could find in many suburbs. Neoliberal-type racism.

    And the behavior of the people who had already been taken over? They wouldn’t normally be acting like servants. That’s why they seemed so strange: they were putting on an act to get Chris taken over, and the only way they knew how to play a black person was as a servant.



  • Well first, AI won’t end the world… in its current state. There’s plenty of sci-fi covering that exact doomsday scenario: a highly advanced artificial general intelligence (AGI), for one reason or another, decides to eradicate or drastically alter humanity as we know it, usually because it sees humans/humanity as a blight or threat (see: Skynet from Terminator, the Geth from Mass Effect), as a resource to be used (see: the machines from The Matrix, the Reapers from Mass Effect), or as a twisted form of protection (Ultron from Marvel comics/MCU, AUTO from WALL-E). Will something like this happen? Hopefully not, but definitely not with the “AI” we have now.

    The impact of AI now is primarily social, the tip of the iceberg being its use in academia (students using ChatGPT to write essays, professors using “AI detectors” that also flag legitimate essays as AI-generated) and the issue of art generation. The biggest impact that I think will become a major issue soon is deepfakes. We’ve seen some of this come up already, with certain female online personalities having AI-generated or deepfaked nudes produced, or the fad we had for a bit with AI-produced audio of US presidents hanging out, making tier lists, and playing video games. The political theater, particularly in the US, already sees a lot of misleading/out-of-context sound bites and general misinformation, and voice synthesis tech can drastically amplify this: inserting a damning line in the middle of a platform speech, creating a completely fabricated “leaked phone call”… or doing the opposite and gaslighting about what was really said, claiming a genuine recording was actually faked. The proliferation of voice synthesis, whether or not it actually gets used, will negatively impact the public’s political literacy.

    Going back to the arts, we are also seeing this issue come up (at least partially) with the recent WGA/SAG-AFTRA strikes and in art communities, where a large language model or art generator is being used to “save money” by not hiring a human artist. Think of all the money a company can save by eliminating the need for writers, artists, or even background extras and replacing them with generative models.

    We may even see this have greater impacts on a personal cultural level, such as an AI that will be your friend or romantic companion.

    All that to say: I don’t think AI, as it is now, is all bad, but the potential downsides we face with just the basic “AI” we have now vastly outweigh the benefits of a text bot that writes in a way that mostly looks like it should make sense, or of one-off generated art pieces. There’s a lot of bad, and the good is pretty nebulous.