Emotion artificial intelligence uses biological signals such as vocal tone, facial expressions, and data from wearable devices, along with text and patterns of computer use, to detect and predict how someone is feeling. It can be used in the workplace, in hiring, and elsewhere. Loss of privacy is just the beginning: workers worry about biased AI and the pressure to perform the 'right' expressions and body language for the algorithms.
Did you use an LLM to write this? Kinda ironic, don’t you think?
I spent the better part of 45 minutes writing and revising my comment. So thank you sincerely for the praise, since English is not my first language.
If you wrote this yourself, that's even more ironic, because you used the same format that ChatGPT likes to spit out. Humans influence ChatGPT → ChatGPT influences humans. Everything's come full circle.
I ask, though, because your profile shows you've used ChatGPT to write comments before.