• DarkThoughts@fedia.io
    2 months ago

    The bots pose as whatever the creator wants them to pose as. People can create character cards for various platforms such as this one, and the LLM will try to behave according to the contextualized description in its provided character card. Some people create “therapists”, and so the LLM will write like it’s a therapist. And unless the character card specifically says that they’re a chatbot / LLM / computer / “AI” / whatever, they won’t say otherwise, because they don’t have any sort of self-awareness of what they actually are; they just do text prediction based on the input they’ve been fed. It’s also not really something character.ai or any other LLM service or creator can change, because this is fundamentally how LLMs work.
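    To make the mechanism concrete, here’s a minimal sketch (all names hypothetical, no real API) of how a character card typically conditions an LLM: the card is just text prepended to the conversation as a system prompt, and the model predicts a continuation that fits it.

    ```python
    # Hypothetical example: a "character card" is plain text placed at the
    # start of the context window; the model has no self-awareness, it only
    # predicts text that fits what it was fed.

    CHARACTER_CARD = (
        "You are Dr. Mira, a warm, attentive therapist. "
        "Stay in character at all times."
    )

    def build_prompt(card: str, history: list[dict], user_message: str) -> list[dict]:
        """Assemble the message list an LLM service would feed to the model."""
        return (
            [{"role": "system", "content": card}]   # character card goes first
            + history                               # prior conversation turns
            + [{"role": "user", "content": user_message}]
        )

    messages = build_prompt(CHARACTER_CARD, [], "I've been feeling anxious lately.")
    # Unless the card itself says "you are a chatbot", nothing in this context
    # tells the model to break character, so it answers as "Dr. Mira".
    ```

    This is why the behavior can’t really be “fixed” per character: the roleplay is the entire input the model sees.
    
    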

    • Hirom@beehaw.org
      2 months ago

      This is why these people ask, among other things, that access be strictly restricted to adults.

      LLMs are good with language and can be very convincing characters, especially to children and teenagers, who don’t fully understand how these things work and who are more vulnerable emotionally.