1 comment

  • JohnKnopf 10 hours ago

    Every AI forgets you when you close the tab. Cecil is a protocol that fixes that — persistent memory, pattern detection, and identity that evolves over time. Three layers: vector store (Qdrant, local), an observer that runs post-session and detects drift between baseline and reality, and a meta agent that assembles a compressed identity window before each conversation. No LLM calls during chat. Local-first, bring your own model. I fed it 44 hours of my podcast transcripts and it learned how I think better than any profile page could. Apache 2.0.
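    The "compressed identity window" step can be sketched in a few lines. This is an illustrative mock, not Cecil's actual API: the record type, function names, and character-budget scoring are all assumptions; in the real system the relevance scores would come from the Qdrant vector store.

    ```python
    # Hypothetical sketch of the pre-conversation step: the meta agent packs
    # the highest-relevance stored memories into a fixed-size identity window
    # string, so nothing needs an LLM call during the chat itself.
    # All names and the scoring scheme here are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Memory:
        text: str
        relevance: float  # e.g. cosine similarity returned by the vector store

    def assemble_identity_window(memories: list[Memory], char_budget: int = 400) -> str:
        """Greedily pack the most relevant memories into a budgeted preamble."""
        window: list[str] = []
        used = 0
        for mem in sorted(memories, key=lambda m: m.relevance, reverse=True):
            line = f"- {mem.text}"
            if used + len(line) > char_budget:
                break  # stay inside the window's size budget
            window.append(line)
            used += len(line)
        return "Known about this user:\n" + "\n".join(window)

    memories = [
        Memory("Prefers concise, technical answers", 0.91),
        Memory("Hosts a weekly podcast about AI tooling", 0.84),
        Memory("Runs models locally via LM Studio", 0.78),
    ]
    print(assemble_identity_window(memories))
    ```

    The point of the budget is that the window can be prepended to any model's system prompt without growing unboundedly as memories accumulate.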

      r/LocalLLaMA:
    
      Cecil — open source memory and identity protocol for AI (local-first, any model)
    
      Built a protocol that gives any LLM persistent memory. Not a wrapper — infrastructure. Qdrant runs locally for vector storage,
      FastEmbed for zero-cost embeddings, and you bring whatever model you want (LM Studio, Ollama, Claude, whatever).
    
      The interesting part: it has an observer that runs after sessions and detects drift. It compares what was configured against
      what the patterns actually show. Works the same whether it's observing a person, an agent, or itself.
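      One plausible shape for that drift check, sketched with stand-in vectors (the function names, threshold, and centroid comparison are assumptions, not Cecil's implementation): embed the configured baseline, embed what recent sessions actually show, and flag drift when the cosine similarity drops.

      ```python
      # Illustrative drift check: compare the configured baseline embedding
      # against the centroid of embeddings observed in recent sessions.
      # Low cosine similarity means configuration and reality have diverged.
      import math

      def cosine(a: list[float], b: list[float]) -> float:
          dot = sum(x * y for x, y in zip(a, b))
          na = math.sqrt(sum(x * x for x in a))
          nb = math.sqrt(sum(x * x for x in b))
          return dot / (na * nb)

      def centroid(vectors: list[list[float]]) -> list[float]:
          n = len(vectors)
          return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

      def detect_drift(baseline: list[float],
                       observed: list[list[float]],
                       threshold: float = 0.8) -> tuple[bool, float]:
          """Return (drifted?, similarity) for baseline vs. observed sessions."""
          sim = cosine(baseline, centroid(observed))
          return sim < threshold, sim

      baseline = [1.0, 0.0, 0.0]                           # what was configured
      observed = [[0.9, 0.1, 0.0], [0.2, 0.9, 0.1]]        # trending off-baseline
      drifted, sim = detect_drift(baseline, observed)
      print(drifted, round(sim, 3))
      ```

      The same check applies regardless of what the baseline describes, which is why the observer can watch a person, an agent, or itself with one mechanism.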
    
      I ran 44 hours of podcast transcripts through it on an RTX 4090 with faster-whisper. The synthesis it produced was genuinely
      surprising.
    
      Apache 2.0: https://github.com/johnkf5-ops/cecil-protocol
    
      r/selfhosted:
    
      Cecil — self-hosted AI memory protocol (Qdrant + any LLM, no cloud)
    
      Open source protocol that gives AI persistent memory and identity. Everything runs locally — Qdrant in Docker for vectors,
      FastEmbed for embeddings (zero API cost), any OpenAI-compatible LLM. No data leaves your machine.
    
      It has an observer layer that detects patterns and drift over time, and a human-readable markdown mirror of everything in the
      vector database — inspect, edit, delete anything.
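      The mirror idea can be shown with a minimal round-trip sketch. Everything here is hypothetical (the record shape, heading convention, and function names are mine, not the project's): each vector-store record also lives in a markdown file a human can read and edit, and the file can be parsed back so edits flow into the store.

      ```python
      # Hedged sketch of a human-readable markdown mirror of the vector store.
      # Record shape and heading convention are illustrative assumptions.

      def to_markdown(records: list[dict]) -> str:
          """Render store records as a markdown file a human can inspect."""
          lines = ["# Cecil memory mirror", ""]
          for rec in records:
              lines.append(f"## {rec['id']}")
              lines.append(rec["text"])
              lines.append("")
          return "\n".join(lines)

      def from_markdown(doc: str) -> list[dict]:
          """Parse the mirror back into records, so manual edits re-sync."""
          records: list[dict] = []
          current = None
          for line in doc.splitlines():
              if line.startswith("## "):
                  current = {"id": line[3:], "text": ""}
                  records.append(current)
              elif current is not None and line and not line.startswith("#"):
                  current["text"] = line
          return records

      recs = [{"id": "mem-001", "text": "Prefers local models over cloud APIs"}]
      assert from_markdown(to_markdown(recs)) == recs  # lossless round trip
      print(to_markdown(recs))
      ```

      Keeping the plain-text copy authoritative is what makes "inspect, edit, delete anything" possible without touching the database directly.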
    
      https://github.com/johnkf5-ops/cecil-protocol