It just feels too good to be true.

I’m currently using it for formatting technical texts and it’s amazing. It can’t generate them properly from scratch, but if I give it the bulk of the info it makes it pretty af.

Also just talking and asking for advice on the most random kinds of issues. It gives seriously good advice. But it makes me worry about whether I’m volunteering my personal problems and innermost thoughts to a company that will misuse them.

Are these concerns valid?

  • flathead@quex.cc · 1 year ago

    Given that they know exactly who you are, I wouldn’t get too personal with anything, but it is amazing for many otherwise time-consuming problems like programming. It’s also quite good at explaining concepts in math and physics, and is capable of reviewing and critiquing student solutions. The development of this tool is not miraculous or anything - it uses the same basic foundation that all machine learning does - but it’s a defining moment in terms of expanding the capabilities of computer systems for regular users.

    But yeah, I wouldn’t treat it like a personal therapist, if only because it’s not really designed for that, even though it can do a credible job of interacting. The original chatbot, ELIZA, simulated a “non-directive” therapist, and it was kind of amazing how people could be drawn into intimate conversations even though it was nothing like ChatGPT in terms of sophistication - it just parroted back what you said in a way that made it sound empathetic. https://en.wikipedia.org/wiki/ELIZA

    [screenshot: ELIZA conversation with a “simulated therapist”]
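    The trick ELIZA used is surprisingly simple: match a keyword pattern in the user’s sentence, swap the pronouns, and echo it back inside a canned therapist prompt. A minimal sketch in Python (the pattern and reflection rules here are illustrative, not Weizenbaum’s original script):

    ```python
    import re

    # Illustrative pronoun swaps - real ELIZA scripts had many more rules.
    REFLECTIONS = {
        "i": "you", "me": "you", "my": "your",
        "am": "are", "you": "I", "your": "my",
    }

    def reflect(text: str) -> str:
        # Swap first-person words for second-person ones (and vice versa).
        return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

    def respond(statement: str) -> str:
        # Match a simple "I feel ..." pattern and parrot it back as a question.
        m = re.match(r"i feel (.*)", statement.lower().rstrip("."))
        if m:
            return f"Why do you feel {reflect(m.group(1))}?"
        # Fallback: reflect the whole statement.
        return f"Tell me more about why you say: {reflect(statement)}"

    print(respond("I feel sad about my job."))
    # → Why do you feel sad about your job?
    ```

    No understanding anywhere - just string substitution - yet it was enough to pull people into long, intimate conversations.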

    • BCsven@lemmy.ca · 1 year ago

      Ha, I spent way too much time typing stuff into the ELIZA prompt. It was amazing for the late ’80s.