A collection of example applications and notebooks demonstrating how to integrate and use Hindsight.
Full-featured example applications demonstrating Hindsight integration patterns:
- chat-memory - Conversational AI with per-user memory
- deliveryman-demo - Interactive delivery agent with memory-based navigation
- hindsight-litellm-demo - Side-by-side comparison of memory approaches
- hindsight-tool-learning-demo - Learning tool selection through memory
- openai-fitness-coach - Fitness coach with OpenAI Agents and Hindsight memory
- sanity-blog-memory - Syncing Sanity CMS content to Hindsight
- stancetracker - Political stance tracking with AI-powered memory
Interactive Jupyter notebooks demonstrating Hindsight features:
- 01-quickstart.ipynb - Basic operations: retain, recall, and reflect
- 02-per-user-memory.ipynb - Pattern for per-user memory banks
- 03-support-agent-shared-knowledge.ipynb - Multi-bank architecture for support agents
- 04-litellm-memory-demo.ipynb - Automatic memory with LiteLLM callbacks
- 05-tool-learning-demo.ipynb - Learning tool selection through memory
- fitness_tracker.ipynb - Fitness coach with workout and diet memory
- healthcare_assistant.ipynb - Health chatbot demo
- movie_recommendation.ipynb - Personalized movie recommender
- personal_assistant.ipynb - General-purpose assistant with long-term memory
- personalized_search.ipynb - Context-aware search agent
- study_buddy.ipynb - Study assistant with spaced repetition
Start Hindsight using Docker:

```shell
export OPENAI_API_KEY=your-key
docker run --rm -it --pull always -p 8888:8888 -p 9999:9999 \
  -e HINDSIGHT_API_LLM_API_KEY=$OPENAI_API_KEY \
  -e HINDSIGHT_API_LLM_MODEL=o3-mini \
  -v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
  ghcr.io/vectorize-io/hindsight:latest
```

See the Hindsight documentation for other LLM providers.
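The container may take a moment to start before the ports published above (8888 and 9999) accept connections. A minimal sketch for waiting on a port before running the notebooks — the host, port, and timing values mirror the `docker run` command and are not part of Hindsight itself:

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    """Poll a TCP port until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True  # something is listening on the port
        except OSError:
            time.sleep(1.0)  # not up yet; retry until the deadline
    return False


if __name__ == "__main__":
    # Wait for the first port exposed by the docker run command above.
    ready = wait_for_port("localhost", 8888)
    print("ready" if ready else "timed out")
```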
Each notebook can be run independently. Install dependencies:

```shell
cd notebooks
pip install -r requirements.txt
jupyter notebook
```

Each application has its own setup instructions in its README.
Contributions are welcome! Please open an issue or submit a pull request.
Licensed under the MIT License.