Any AI/service that can translate legal writing into a more understandable version?

  • hotdogcharmer@lemmy.zip · 7 hours ago

    A slightly wrong translation is bad, for sure, but it doesn’t (often) invent new facts. I still wouldn’t trust one for a legal document, personally.

    When I wrote my earlier comment, I did originally ask what your point was, but I couldn’t phrase it in a way that didn’t feel hostile - which I hope I’m not coming across as. I just couldn’t quite grasp the point you were trying to make, and I think that’s because we disagree on a fundamental level here.

    > Yes, if they’re signing a contract, absolutely get a lawyer if you don’t understand what you’re signing, but occasionally you just need to look up a law or accept a EULA, and it would be nice to have some help reading it, even if it’s from an imperfect tool.

    I agree with the first part about signing a contract, but totally disagree with the second. If I need to look up a law, or anything at all, I would never run it through a machine that regularly invents “facts” from whole cloth, or misinterprets what it reads, all while confidently agreeing with whatever implications I give it. LLMs are inherently untrustworthy, in my opinion - partly because they’re programmed to be “yes-men” that engage the user constantly in order to sell them a service, and partly because they don’t “know” anything: they essentially scrape the web, uncritically mash whatever they find together, and return it in convincing natural language.

    I think they are dangerous to engage with at any level.

    • bjornsno@programming.dev · edited 6 hours ago

      Totally fair disagreement to have there. I’m extremely critical of LLMs for many of the same reasons as you, plus environmental and economic concerns. Having said that, for summarization, simplification, and rephrasing they don’t tend to hallucinate. Exactly as you say, they don’t know anything, but they’re instructed to always answer, so when something comes along that doesn’t match their training data, they hallucinate. Rephrasing is different because the answer stays anchored to text that’s sitting right there in the prompt, which is why a specialized LLM is actually a pretty good fit - a rough sketch of what I mean is below. As long as such a tool is used responsibly and carefully, I don’t see why it couldn’t exist and be moderately helpful.
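      Here’s a minimal sketch of what a constrained rephrasing call could look like, assuming the OpenAI Python client (pip install openai); the model name, file name, and prompt wording are illustrative placeholders, not an endorsement of any particular service:

      ```python
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      # Placeholder input file - any EULA or statute text you need to read.
      legal_text = open("eula.txt", encoding="utf-8").read()

      response = client.chat.completions.create(
          model="gpt-4o-mini",  # placeholder model name
          temperature=0,        # deterministic output, less room for invention
          messages=[
              {
                  "role": "system",
                  "content": (
                      "Rewrite the user's text in plain English. Only rephrase "
                      "what is actually there. If a clause is ambiguous, say so "
                      "rather than guessing. Do not add facts, advice, or "
                      "legal interpretation."
                  ),
              },
              {"role": "user", "content": legal_text},
          ],
      )

      print(response.choices[0].message.content)
      ```

      The system prompt is doing the “responsibly and carefully” part: the model is told to stay inside the provided text and to flag ambiguity instead of papering over it, and temperature 0 takes sampling randomness out of the picture. None of that makes the output trustworthy enough to sign something on, but for “help me get the gist of this EULA” it’s a very different risk profile from open-ended question answering.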