…and why would you ask an LLM something you already know about? :)

  • themeatbridge@lemmy.world · 20 hours ago

    I ask for summaries and examples for things I understand well but struggle to explain. Sometimes it’s very helpful, and sometimes it’s just deranged nonsense.

    That’s why I’m less likely to ask it about something I don’t already know. How would I know if the answer is accurate or coherent? At least with something like Wikipedia, I can track down a source and look for foundational truth, even if it is hidden under layers of bias.

    • A_Union_of_Kobolds@lemmy.world · 19 hours ago

      First things I asked Ollama after installing it were questions about permaculture and BattleTech.

      The BT responses weren’t great and I’m not a botanist…

  • tory@lemmy.world · 15 hours ago

    I’ve begun asking, “Did you just make that up?” before I share anything. A fair amount of the time it’s like: “You’re right to be skeptical; this doesn’t seem correct. Let’s reevaluate.” Or whatever.

    • deadcade@lemmy.deadca.de · 8 hours ago

      It’s still an LLM, not a “truth machine”. Replying with “did you make that up” will just cause it to respond with the next most likely tokens.

      Try this: when you already know it’s saying something factual, use your technique anyway. It will likely “correct” itself by slightly rephrasing. Enough rephrasing can change the meaning of the sentence, but nothing checks whether it’s factual before or after.

      I’ve had some LLMs become extremely stubborn and deny that they’re wrong about basic facts like the release year of certain media.
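The point above, that a model only continues with statistically likely tokens and never consults the facts, can be sketched with a toy next-token model. Everything here (the tiny corpus, the function names) is invented for illustration; real LLMs are vastly larger, but the greedy "most likely continuation" step is the same idea:

```python
from collections import Counter, defaultdict

# Toy next-token model: count bigrams from a tiny corpus, then always
# emit the most frequent continuation. Nothing here checks facts; the
# "answer" to any prompt is just the statistically likely sequence.
corpus = (
    "the movie was released in 1999 . "
    "the movie was released in 1999 . "
    "the movie was released in 2001 ."
).split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def continue_text(prompt, steps=3):
    tokens = prompt.split()
    for _ in range(steps):
        counts = bigrams.get(tokens[-1])
        if not counts:
            break
        # Greedy pick: the most likely token, whether it's true or not.
        tokens.append(counts.most_common(1)[0][0])
    return " ".join(tokens)

print(continue_text("the movie was"))
# → the movie was released in 1999
```

"1999" wins only because it appears more often in the training text, not because anyone verified the release year; asking this model "did you make that up" would just trigger another pass of the same likelihood-driven continuation.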