A solution for real-time streaming output from Temporal workflows using Redis PubSub. This addresses the limitation that Temporal doesn't natively support streaming responses from an LLM.
This project demonstrates how to implement real-time streaming from Temporal workflows by using Redis PubSub as a side channel. The demo focuses on streaming LLM output back to the user in the CLI, while the full result is still persisted through the Temporal activity.
- A Temporal workflow is defined that executes an activity
- The activity streams its output to a Redis PubSub channel character by character (see the sketch after this list)
- A client subscribes to the channel and displays the stream in real time in the terminal
- The workflow completes once all streaming is finished
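Below is a minimal sketch of how the activity and workflow could be wired up. The module names (`activities.py`, `workflows.py`), the channel name `llm-stream`, the `<END>` sentinel, the model, and the timeout are illustrative assumptions, not the project's actual identifiers. The Redis publishing lives in the activity because Temporal workflow code must stay deterministic and cannot perform I/O directly.

```python
# activities.py -- illustrative module name, not necessarily the project's layout
import redis.asyncio as aioredis
from anthropic import AsyncAnthropic
from temporalio import activity

STREAM_CHANNEL = "llm-stream"   # assumed channel name
END_SENTINEL = "<END>"          # assumed end-of-stream marker


@activity.defn
async def stream_llm_response(prompt: str) -> str:
    """Call the LLM, publish every character to Redis as it arrives, return the full text."""
    redis_client = aioredis.Redis()          # localhost:6379 by default
    anthropic_client = AsyncAnthropic()      # reads ANTHROPIC_API_KEY from the environment
    collected: list[str] = []
    async with anthropic_client.messages.stream(
        model="claude-3-5-haiku-latest",     # assumed model
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    ) as stream:
        async for chunk in stream.text_stream:
            for char in chunk:
                await redis_client.publish(STREAM_CHANNEL, char)  # real-time fan-out
            collected.append(chunk)
    await redis_client.publish(STREAM_CHANNEL, END_SENTINEL)  # tell subscribers we're done
    await redis_client.aclose()              # redis-py 5+
    return "".join(collected)                # the full response is persisted in Temporal's history
```

```python
# workflows.py -- illustrative module name
from datetime import timedelta

from temporalio import workflow

with workflow.unsafe.imports_passed_through():
    from activities import stream_llm_response


@workflow.defn
class LLMStreamingWorkflow:
    @workflow.run
    async def run(self, prompt: str) -> str:
        # The workflow itself never streams; it just awaits the activity's final result.
        return await workflow.execute_activity(
            stream_llm_response,
            prompt,
            start_to_close_timeout=timedelta(minutes=5),
        )
```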
- Anthropic API key
- uv (Python project manager)
- Redis server (e.g. `brew install redis`)
- Temporal server
- Install the project dependencies: `uv sync`
- Start a Redis server in a separate terminal: `redis-server` (validate it is running from another terminal with `redis-cli ping`; the response should be `PONG`)
- Start a Temporal server in a separate terminal: `temporal server start-dev` (the Temporal UI is available at localhost:8233)
- Start the worker from the original virtual environment: `uv run python -m worker` (a sketch of the worker and starter modules follows these steps)
- Start the workflow from a new terminal with the virtual environment activated: `uv run python -m starter`
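For orientation, here is a minimal sketch of what `worker.py` and `starter.py` might contain. The module names, task queue, workflow name, channel, and sentinel are assumptions carried over from the sketch above, not necessarily the project's actual code.

```python
# worker.py -- illustrative sketch: registers the workflow and activity on a task queue
import asyncio

from temporalio.client import Client
from temporalio.worker import Worker

from activities import stream_llm_response      # assumed module layout
from workflows import LLMStreamingWorkflow

TASK_QUEUE = "llm-streaming-task-queue"          # assumed task queue name


async def main() -> None:
    client = await Client.connect("localhost:7233")
    worker = Worker(
        client,
        task_queue=TASK_QUEUE,
        workflows=[LLMStreamingWorkflow],
        activities=[stream_llm_response],
    )
    await worker.run()


if __name__ == "__main__":
    asyncio.run(main())
```

```python
# starter.py -- illustrative sketch of the CLI client
import asyncio

import redis.asyncio as aioredis
from temporalio.client import Client

STREAM_CHANNEL = "llm-stream"                    # must match the activity's channel
END_SENTINEL = "<END>"
TASK_QUEUE = "llm-streaming-task-queue"


async def main() -> None:
    temporal_client = await Client.connect("localhost:7233")

    # Subscribe before starting the workflow: Redis PubSub does not buffer
    # messages for subscribers that join late.
    redis_client = aioredis.Redis(decode_responses=True)
    pubsub = redis_client.pubsub()
    await pubsub.subscribe(STREAM_CHANNEL)

    handle = await temporal_client.start_workflow(
        "LLMStreamingWorkflow",                  # workflow name as registered on the worker
        "Tell me a short story.",                # example prompt
        id="llm-streaming-demo",
        task_queue=TASK_QUEUE,
    )

    # Print each character the moment it arrives.
    async for message in pubsub.listen():
        if message["type"] != "message":
            continue                             # skip subscribe confirmations
        if message["data"] == END_SENTINEL:
            break
        print(message["data"], end="", flush=True)

    result = await handle.result()               # full response, persisted by Temporal
    print(f"\n\nWorkflow finished; Temporal stored {len(result)} characters.")

    await pubsub.unsubscribe(STREAM_CHANNEL)
    await redis_client.aclose()                  # redis-py 5+


if __name__ == "__main__":
    asyncio.run(main())
```

The client subscribes before starting the workflow because Redis PubSub is fire-and-forget: anything published before the subscription is established would be lost.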
Contributions are welcome! Please feel free to submit a Pull Request.