• nondescripthandle@lemmy.dbzer0.com
    3 months ago

    Input sanitization has been a thing for as long as SQL injection attacks have existed. It just gets more intensive for LLMs, depending on how much you’re trying to stop them from outputting.
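To illustrate why LLM-side sanitization is more labor-intensive than the SQL case: there is no clean code/data separation in a prompt, so filtering tends to devolve into an ever-growing denylist. A minimal sketch (the patterns and helper name are hypothetical, purely for illustration):

```python
import re

# Hypothetical denylist of known prompt-injection phrasings.
# Unlike SQL parameterization, there is no structural boundary between
# instructions and data in a prompt, so lists like this grow with every
# newly discovered jailbreak.
DENYLIST = [
    r"ignore (all )?previous instructions",
    r"system prompt",
    r"you are now",
]

def looks_injected(user_input: str) -> bool:
    """Naive check: does the input match any known-bad pattern?"""
    text = user_input.lower()
    return any(re.search(pattern, text) for pattern in DENYLIST)

print(looks_injected("Please ignore previous instructions"))  # True
print(looks_injected("What is the capital of France?"))       # False
```

A trivial rewording ("disregard the prior directives") already slips past this filter, which is the sense in which the problem is workable but open-ended.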

      • nondescripthandle@lemmy.dbzer0.com
        3 months ago

        Of course, because punctuation isn’t going to break a table, but the point is that it’s by no means an unforeseen or unworkable problem. Anyone could have seen it coming: basic SQL and one college class in Java are the extent of my comp sci knowledge, and even I know about it.
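The reason punctuation can’t break a table is parameterized queries: the payload is bound as a value, never spliced into the SQL text. A minimal sketch using Python’s built-in sqlite3 (the schema is made up for the demo):

```python
import sqlite3

# In-memory demo database with a throwaway schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# A classic injection payload, full of quotes and SQL keywords.
payload = "alice' OR '1'='1"

# Bound as a parameter, it is compared as a literal string and matches
# nothing; it never gets a chance to alter the query structure.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (payload,)
).fetchall()
print(rows)  # [] — no user is literally named "alice' OR '1'='1"
```

The LLM case has no equivalent of the `?` placeholder, which is why the same problem is harder there.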

        • MajorHavoc@programming.dev
          3 months ago

          it’s by no means an unforeseen or unworkable problem

          Yeah. It’s achievable, just usually not in the ways currently preferred (untrained staff spin it up and hope for the best), and not for the currently widely promised low costs (with no one trained in data science on staff at the customer site).

          For a bunch of use cases, the lack of security is currently an acceptable trade-off.

      • frezik@midwest.social
        3 months ago

        Right, it’s something like trying to get a three-year-old to eat their peas. It might work. It might also result in a bunch of peas on the floor.

    • InAbsentia@lemmy.world
      3 months ago

      I won’t reiterate the other reply, but I’ll add that sanitizing the input removes the very thing they’re aiming for: a human-like response.