• 7rokhym@lemmy.ca
    14 hours ago

    I think OpenAI’s recent sycophancy issue has caused a new spike in these stories. One thing I noticed was these models running on my PC saying it’s rare for a person to think and do things that I do.

    The problem is that this is a model running on my GPU. It has never talked to another person. I hate insincere compliments, let alone overt flattery, so I was annoyed, but it did make me think that this kind of talk would be crack for a conspiracy nut or a mentally unwell person. It’s a whole risk area I hadn’t been aware of.

    https://www.msn.com/en-us/news/technology/openai-says-its-identified-why-chatgpt-became-a-groveling-sycophant/ar-AA1E4LaV

    • morrowind@lemmy.ml
      7 hours ago

      saying it’s rare for a person to think and do things that I do.

      Probably one of the most common lines of flattery I see. I’ve tried lots of models, on-device and larger cloud ones. It happens during normal conversation, technical conversation, roleplay, general testing… you name it.

      Though it makes me think… these models are trained on internet text and the like, none of which really shows how much most people think privately, or what they say when they feel like they can talk openly.

    • tehn00bi@lemmy.world
      14 hours ago

      Humans are always looking for a god in a machine, in a bush, in a cave, in the sky, in a tree… the ability to rationalize and see through difficult-to-explain situations has never been a human strong point.