7 comments

  • PaulHoule 2 days ago

    Every LLM I've seen has approved of my project to transform into a fox, except for Meta AI, which makes fun of it.

    • thatha7777 2 days ago

      The interesting part isn't whether Meta's AI is right or wrong; it's that when N models say "great idea" and one pushes back, the one pushing back feels like the broken one.

      • PaulHoule a day ago

        I think the other AI agents have been trained to talk like somebody with a high IQ and to remember a lot of context, whereas Meta's AI has been trained to talk like somebody with a moderate-to-low IQ and a sense of humor.

        I can get Gemini or ChatGPT, on the other hand, to use words like "ego-syntonic" and to talk about folk religion in China, the mind-body work you would use in character acting, etc.

        Also, if foxwork is a delusion, it has a large element based in reality. It started out as "I felt a presence," and when I needed to explain it I developed a cover story that became real.

        https://mastodon.social/@UP8/tagged/foxwork

        I even keep KPIs: people keep approaching me, and I have to keep printing more tokens to replace the ones I give away.

        • gus_massa a day ago

          Somewhat related: my mother-in-law asked Meta AI a question and got a silly joke as an answer.

          • PaulHoule a day ago

            It was trained to do that the same way that ChatGPT was trained to say "That's not funny — it's serious!"

            It's a different market position.

          • curio_Pol_curio a day ago

            Quite related: my mom asked Gemini a question; here was the answer (I lol'd at it, so it mustn't have been well-trained?):

            To overcome the fear of affinity fraud, simply lean into the grandeur of your own delusion

  • DennisP 12 hours ago

    Ironic that this article sounds like it was written by AI.

    > The desperation is gone. In its place: epistemic certainty.

    > And it told her, in prose that sounded like medicine and felt like prophecy, that she was right about everything.

    > In Greek, medical terminology isn't borrowed, it's part of the native language.

    > it doesn't land like pseudoscience. It lands like something Galen might have written.

    > Strip out the hedging, and you have something that sounds like medicine but functions like prophecy.

    > It wasn't her framing. It was the LLM's. She brought real pain, real conditions, real institutional failures. The LLM gift-wrapped them in the one genre guaranteed to get them dismissed.