1 comment

  • nuwansam_87 18 hours ago

    LLMs are deceptively human-like, which often leads us to make the mistake of prompting them as if they were human.

    However, there is a key difference in how they perceive world state as agents: unlike us, they live in snapshots. Putting yourself in their shoes helps you prompt them for significantly better results.
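
    One way to act on this (a minimal sketch; the function and field names here are illustrative, not from any particular framework): since the model only "sees" what is in the prompt at call time, re-serialize the full current state into every request rather than assuming it remembers earlier turns.

    ```python
    import json

    def build_prompt(world_state: dict, instruction: str) -> str:
        """Embed a complete snapshot of the current state in each prompt.

        The model has no persistent perception between calls, so each
        prompt carries the whole picture as of this exact moment.
        """
        snapshot = json.dumps(world_state, indent=2, sort_keys=True)
        return (
            "Below is the CURRENT state of the world as of this moment; "
            "any earlier state no longer applies.\n\n"
            f"State snapshot:\n{snapshot}\n\n"
            f"Task: {instruction}"
        )

    # Hypothetical agent state for a coding task
    state = {"open_files": ["main.py"], "tests_passing": False, "branch": "fix-bug"}
    print(build_prompt(state, "Make the tests pass."))
    ```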