ooli@lemmy.world to science@lemmy.world English · 2 days ago
Scientists Have Confirmed the Existence of a Third Form of Magnetism
www.popularmechanics.com
Num10ck@lemmy.world · 1 day ago
ChatGPT can ELI5 summarize it. Who knows if it's accurate enough.
Windex007@lemmy.world · 1 day ago
What value is a summary when you fully acknowledge that you cannot trust it for accuracy?
Num10ck@lemmy.world · 1 day ago
I agree, but what can you trust for accuracy in these times?
Windex007@lemmy.world · 1 day ago
People who are experts in the subject. Propagandists thrive by trying to convince people that they can't trust anyone, because it makes foolish people believe that every voice carries equal merit.
7toed@midwest.social · 1 day ago
"How many Rs are in orange?"
"There are no Rs in orange."
Yeah, I'll be using ChatGPT for education, what could go wrong.
snooggums@lemmy.world · edited · 1 day ago
ChatGPT can bullshit about anything, but odds are anything complex will be wrong. Even simple things are probably wrong.
DarkThoughts@fedia.io · 1 day ago
I would not even trust it to summarize nuanced details in a lengthy article, let alone something science related (especially about new discoveries).