• Umbrias@beehaw.org · 11 months ago

    In all reality, tokenization and LLM language processing are useful for all sorts of things that can be mathematically modeled in ways similar to language. Using them for shitty web searches is not ideal. A toy sketch of that idea follows below.
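
    To make that concrete, here is a minimal, hypothetical sketch (the protein fragment, merge count, and function name are all made up for illustration): the same BPE-style pair-merging used to tokenize text applies to any symbol sequence with language-like statistics, such as protein motifs.

    ```python
    from collections import Counter

    def bpe_merges(sequence: str, num_merges: int = 5) -> list[str]:
        """Greedily merge the most frequent adjacent token pair, BPE-style."""
        tokens = list(sequence)                     # start from single symbols
        for _ in range(num_merges):
            pairs = Counter(zip(tokens, tokens[1:]))
            if not pairs:
                break
            (a, b), _count = pairs.most_common(1)[0]
            merged, i = [], 0
            while i < len(tokens):
                if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
                    merged.append(a + b)            # fuse the frequent pair into one token
                    i += 2
                else:
                    merged.append(tokens[i])
                    i += 1
            tokens = merged
        return tokens

    # Protein fragments instead of words: the repeated motif collapses into larger tokens.
    print(bpe_merges("MKTAYIAKQRMKTAYIAKQRMKTAYI", num_merges=6))
    ```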