• Excrubulent@slrpnk.net
    2 months ago

    The phrase “synthesised expert knowledge” is the problem here, because apparently you don’t understand that this machine has no meaningful ability to synthesise anything. It has zero fidelity.

    You’re not exposing people to expert knowledge, you’re exposing them to expert-sounding words that cannot be made accurate. Sometimes they’re right by accident, but that is not the same thing as accuracy.

    You mistook what the LLM is doing for synthesis, which is something loads of people will do, and that will just lend more undue credibility to its bullshit.