How do we have people wasting their time arguing about software having feelings when we haven’t even managed to convince the majority of people that fish and crabs and stuff can feel pain, even though they don’t make a frowny face when you hurt them?
That’s easy, it’s because LLM output is a reasonable simulation of sounding like a person. Fooling people’s consciousness detector is just about their whole thing at this point.
Crabs should look into learning to recite the pledge of allegiance in the style of Lady Gaga.
Hoo boy. The person being reposted goes on, in their original thread, to say they believe we cannot be certain that genAI does not have feelings.
Just complete the delusional circuit and tell them you can’t be sure *they* aren’t an AI, and ask them how they would prove they aren’t.