
Conversation


@sonic182 sonic182 commented Nov 3, 2025

What is this Python project?

llm_async is an async-first, lightweight Python library for interacting with modern Large Language Model (LLM) providers such as OpenAI, Google Gemini, Anthropic Claude, and OpenRouter.

It provides a unified async API for chat completions, streaming responses, tool/agent execution, and JSON-schema-validated structured outputs — all built with asyncio and designed for production integration.

Key Features

  • Async-first design — built entirely around asyncio, no blocking I/O
  • Unified provider interface — same API for OpenAI, Gemini, Claude, and OpenRouter
  • Automatic tool execution — define tools once and use them across providers
  • Pub/Sub events — real-time event emission for tool execution (start/complete/error)
  • Structured outputs — enforce JSON Schema validation across supported models
  • Extensible architecture — easily add new providers by inheriting from BaseProvider
  • Streaming support — async iterator interface for live model responses

GitHub: https://github.com/sonic182/llm-async
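The unified provider interface described above can be sketched in a few lines. Note this is an illustrative sketch only: `BaseProvider` is the class name the project mentions, but the `chat` method, message format, and the `EchoProvider` stand-in are assumptions for demonstration, not the library's actual API.

```python
import asyncio
from abc import ABC, abstractmethod


class BaseProvider(ABC):
    """Hypothetical base class; only the name comes from the project docs."""

    @abstractmethod
    async def chat(self, messages: list[dict]) -> str:
        """Return a completion for an OpenAI-style message list (assumed format)."""


class EchoProvider(BaseProvider):
    """Stand-in provider that echoes the last user message, for illustration."""

    async def chat(self, messages: list[dict]) -> str:
        await asyncio.sleep(0)  # simulate a non-blocking network call
        return messages[-1]["content"]


async def main() -> str:
    provider = EchoProvider()
    return await provider.chat([{"role": "user", "content": "hello"}])


print(asyncio.run(main()))  # → hello
```

Because every provider subclasses the same abstract interface, calling code can swap OpenAI, Gemini, Claude, or OpenRouter backends without changing the call site.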


What's the difference between this Python project and similar ones?

  • Unlike synchronous SDKs (e.g. openai, anthropic), llm_async is async-first by design, not a synchronous client with an async wrapper bolted on.
  • Unlike high-level frameworks (e.g. LangChain, LlamaIndex), it’s minimal and provider-agnostic — focused on clean async primitives rather than orchestration layers.
  • Supports tool calling + structured outputs + streaming under one unified API surface.
  • Designed for developers who want low-level control and high throughput without extra dependencies.
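The streaming claim above means responses arrive as an async iterator rather than a single blocking call. A minimal sketch of that pattern, with a fake token source standing in for a real model stream (the function names here are illustrative assumptions, not the library's API):

```python
import asyncio
from typing import AsyncIterator


async def stream_tokens(text: str) -> AsyncIterator[str]:
    """Fake model stream: yields one token at a time, as a live response would."""
    for token in text.split():
        await asyncio.sleep(0)  # yield control to the event loop, like a network read
        yield token


async def collect() -> list[str]:
    # Consumers iterate with `async for`, processing tokens as they arrive.
    return [tok async for tok in stream_tokens("streamed model response")]


print(asyncio.run(collect()))  # → ['streamed', 'model', 'response']
```

Consuming tokens with `async for` lets an application render partial output immediately and keep the event loop free for other requests, which is the throughput advantage the project claims over blocking SDK calls.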

If you agree with this pull request, you can submit an approving review.


@YuzeHao2023 YuzeHao2023 left a comment


lgtm
