Add async cache support #6736
Conversation
Extract common execution logic into `_prepare_call_execution()` and `_finalize_cache_update()` helper methods. This reduces duplication and prepares the codebase for async cache support.

- Add `_prepare_call_execution()` to build the execution context
- Add `_finalize_cache_update()` to save cache results
- Refactor `__call__()` to use the new helpers
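The helper split can be illustrated with a minimal sketch. The method names mirror the commit message, but the bodies and the `CachedFunction` class are assumptions for illustration, not marimo's actual implementation:

```python
class CachedFunction:
    """Illustrative cache wrapper showing the two extracted helpers."""

    def __init__(self, fn):
        self.fn = fn
        self.store = {}

    def _prepare_call_execution(self, args, kwargs):
        # Build the execution context: a cache key plus a hit flag.
        key = (args, tuple(sorted(kwargs.items())))
        return key, key in self.store

    def _finalize_cache_update(self, key, result):
        # Save the computed result so later calls are cache hits.
        self.store[key] = result
        return result

    def __call__(self, *args, **kwargs):
        key, hit = self._prepare_call_execution(args, kwargs)
        if hit:
            return self.store[key]
        return self._finalize_cache_update(key, self.fn(*args, **kwargs))
```

Because both helpers are free of sync-only assumptions, an async call path can reuse them and only swap out the middle step (awaiting the function instead of calling it).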
Add support for async/await functions with the `@cache`, `@lru_cache`, and `@persistent_cache` decorators. Implements task deduplication to prevent race conditions when multiple concurrent calls are made with the same arguments.

Implementation:
- Use `type(self)` instead of `_cache_call()` in `__get__()` for proper subclass dispatch
- Detect async functions and dispatch to the `_cache_call_async` variant
- Implement task deduplication using `asyncio.Task` caching with `WeakKeyDictionary`
- Prevent concurrent duplicate executions via the `_pending_executions` dict
- Release the lock before awaiting tasks to avoid deadlocks

Testing:
- Add 15 comprehensive async cache tests
- Test concurrent deduplication (5 concurrent calls → 1 execution)
- All 115 tests passing (100 sync + 15 async)
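The deduplication scheme this commit describes can be sketched roughly as follows. This is a hedged, standalone sketch: the names `_results` and `_pending` are illustrative, and the real implementation additionally handles kwargs, weak references, and cache eviction:

```python
import asyncio


class AsyncDedupCache:
    """Cache an async function; concurrent calls with the same args
    share a single asyncio.Task instead of executing twice."""

    def __init__(self, fn):
        self._fn = fn
        self._results = {}   # finished results, keyed by args
        self._pending = {}   # in-flight tasks, keyed by args
        self._lock = asyncio.Lock()

    async def __call__(self, *args):
        async with self._lock:
            if args in self._results:
                return self._results[args]
            task = self._pending.get(args)
            if task is None:
                # Wrap the coroutine in a Task so every concurrent
                # caller can await the same execution.
                task = asyncio.ensure_future(self._fn(*args))
                self._pending[args] = task
        # Release the lock before awaiting, to avoid deadlocks.
        result = await task
        async with self._lock:
            self._results[args] = result
            self._pending.pop(args, None)
        return result


executions = 0


async def slow_double(x):
    global executions
    executions += 1
    await asyncio.sleep(0.01)
    return x * 2


async def demo():
    cached = AsyncDedupCache(slow_double)
    # Five concurrent calls with the same argument share one task.
    return await asyncio.gather(*(cached(3) for _ in range(5)))


results = asyncio.run(demo())
```

Releasing the lock before `await task` is the key deadlock-avoidance step: if a caller held the lock while awaiting, other callers could never reach the point where they discover and await the shared task.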
Add documentation for async/await support in cache decorators:

- Add async examples for the `@cache` and `@persistent_cache` decorators
- Document task deduplication behavior for concurrent async calls
- Update the comparison table to show the async support advantage over `functools.cache`
All contributors have signed the CLA ✍️ ✅
Super cool! Thanks for the PR. We will give this a review this week.
I have read the CLA Document and I hereby sign the CLA
Comments are non-blocking. This looks great! Thanks!
```python
try:
    if attempt.hit:
        attempt.restore(scope)
        return attempt.meta["return"]
```
Suggested change:

```diff
- return attempt.meta["return"]
+ return attempt.meta.get("return")
```
| Tracks closed-over variables? | ✅ | ❌ |
| Allows unhashable arguments? | ✅ | ❌ |
| Allows Array-like arguments? | ✅ | ❌ |
| Supports async functions? | ✅ | ❌ |
🥳
Don't worry about the failing playwright tests (that is now fixed on main).
Thanks for the quick merge!
📝 Summary
Adds full async/await support to marimo's cache decorators (`mo.cache`, `mo.lru_cache`, `mo.persistent_cache`) with automatic task deduplication to prevent race conditions and duplicate work.
🔍 Description of Changes
What's New
All marimo cache decorators now work seamlessly with both synchronous and asynchronous functions:
Key Features
Task Deduplication: When multiple concurrent calls are made to a cached async function with the same arguments, only one execution occurs—the rest await the result. This prevents race conditions and duplicate work.
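The "N concurrent calls → 1 execution" guarantee above can be seen in a stripped-down form. This is a standalone sketch: `pending` stands in for the decorator's internal bookkeeping and is not marimo's API:

```python
import asyncio

executions = 0
pending = {}  # maps call arguments to the in-flight task


async def expensive(x):
    """Simulated slow computation; counts how often it actually runs."""
    global executions
    executions += 1
    await asyncio.sleep(0.01)
    return x * x


async def call_dedup(x):
    task = pending.get(x)
    if task is None:
        # First caller starts the computation as a shared task.
        task = asyncio.ensure_future(expensive(x))
        pending[x] = task
    # Every caller awaits the same task, so the body runs once.
    return await task


async def main():
    # Five concurrent calls with the same argument.
    return await asyncio.gather(*(call_dedup(4) for _ in range(5)))


results = asyncio.run(main())
```

All five callers receive the same result, while `expensive()` executes exactly once.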
Implementation Details
Commit 1: Refactoring (cf47fb0)
Commit 2: Async Support (e250061)
Commit 3: Documentation (52b6e2c)
Technical Decisions

- Strong references to pending tasks keep them alive while being awaited.
- The sync execution path is left unchanged, since sync code doesn't have concurrent execution issues.
Testing
All existing tests continue to pass (100 sync cache tests), plus 15 new async cache tests covering:
📋 Checklist