diff --git a/docs/plans/2026-02-28-gemini-google-genai-design.md b/docs/plans/2026-02-28-gemini-google-genai-design.md new file mode 100644 index 0000000000..5053b2c89e --- /dev/null +++ b/docs/plans/2026-02-28-gemini-google-genai-design.md @@ -0,0 +1,207 @@ +# Gemini/Google via GenAI SDK Design + +## Background + +Current Gemini/Google traffic goes through `openai_compat` HTTP calls. This creates two issues: + +1. Gemini-specific behavior is coupled to OpenAI-compatible wire format branches. +2. Gemini thought-signature handling depends on dialect-specific JSON mapping in HTTP code. + +We already accepted a hybrid provider strategy in this branch: + +- OpenAI protocol uses official OpenAI SDK. +- Other protocols stay on their existing adapter. + +This document applies the same strategy to Gemini-family protocols: + +- Route `gemini/*` and `google/*` to `google.golang.org/genai`. +- Keep `antigravity/*` unchanged. + +## Goals + +- Move Gemini/Google transport to official `genai` SDK. +- Keep `providers.LLMProvider.Chat` output contract unchanged. +- Preserve thought-signature set/use compatibility across runtime and session history. +- Minimize regression risk by isolating routing changes. + +## Non-Goals + +- Migrating `antigravity/*` to `genai`. +- Refactoring all provider-selection legacy paths in one pass. +- Changing agent/session schema. + +## Why `antigravity/*` Stays Separate + +`antigravity` uses Cloud Code Assist private endpoints (`cloudcode-pa.googleapis.com/v1internal:*`) with custom envelope fields (`project`, `requestType`, `requestId`, etc.). This is not the normal Gemini API surface used by `genai`. + +Conclusion: `antigravity/*` remains on its dedicated provider. + +## Candidate Approaches + +### Option 1 (Recommended): New `gemini_sdk` provider, protocol routing split + +- Add `pkg/providers/gemini_sdk`. +- Route `gemini` and `google` protocols to this provider. +- Keep `openai_compat` for other OpenAI-compatible protocols. 
+- Remove only Gemini-specific request-branch logic from `openai_compat`. + +Pros: +- Clear boundaries and low coupling. +- Easy to test and roll back. +- Matches existing OpenAI SDK migration pattern. + +Cons: +- One extra provider package. + +### Option 2: Keep `openai_compat`, internally branch into `genai` + +Pros: +- Fewer top-level provider packages. + +Cons: +- `openai_compat` grows in complexity and mixed responsibilities. +- Harder long-term maintenance. + +### Option 3: One-shot migrate `gemini/google/antigravity` + +Pros: +- Superficial unification. + +Cons: +- Highest risk. +- `antigravity` protocol mismatch makes this brittle. + +## Decision + +Adopt Option 1. + +## Target Architecture + +### New Provider + +Create `pkg/providers/gemini_sdk/provider.go` implementing `providers.LLMProvider` with `genai.Client`. + +Provider constructor inputs: + +- `apiKey` +- `apiBase` (optional override) +- `proxy` +- `requestTimeout` + +### Factory Routing (`model_list` path) + +In `CreateProviderFromConfig`: + +- `case "gemini", "google"` => `gemini_sdk.NewProvider(...)` +- `case "antigravity"` => unchanged. +- Other protocol routing unchanged. + +## Request Mapping + +`providers.Message` -> `[]*genai.Content` + `GenerateContentConfig`: + +- `system` -> `config.SystemInstruction` +- `user` text -> user text part +- `assistant` text -> model text part +- `assistant` tool calls -> model `FunctionCall` parts +- `tool`/tool result -> user `FunctionResponse` parts + +Tool definitions: + +- map to `Tool.FunctionDeclarations`. 
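The role split above can be sketched as follows. This is a minimal stand-in illustration: the real implementation produces `[]*genai.Content` plus `config.SystemInstruction`, and assistant tool calls / tool results become `FunctionCall` / `FunctionResponse` parts rather than plain text; the `Message` and `Content` structs here are simplified assumptions, not the real SDK or provider types.

```go
package main

import "fmt"

// Simplified stand-ins for providers.Message and genai.Content; the real
// genai.Content carries []*genai.Part, where assistant tool calls become
// FunctionCall parts and tool results become user-role FunctionResponse parts.
type Message struct {
	Role    string // "system", "user", "assistant", "tool"
	Content string
}

type Content struct {
	Role string // Gemini distinguishes only "user" and "model"
	Text string
}

// splitForGemini pulls the system prompt out (destined for SystemInstruction)
// and translates the remaining roles onto Gemini's two-role scheme.
func splitForGemini(msgs []Message) (system string, contents []Content) {
	for _, m := range msgs {
		switch m.Role {
		case "system":
			system = m.Content
		case "assistant":
			contents = append(contents, Content{Role: "model", Text: m.Content})
		default: // "user", plus tool results, which Gemini expects under the user role
			contents = append(contents, Content{Role: "user", Text: m.Content})
		}
	}
	return system, contents
}

func main() {
	system, contents := splitForGemini([]Message{
		{Role: "system", Content: "be terse"},
		{Role: "user", Content: "list files"},
		{Role: "assistant", Content: "ok"},
	})
	fmt.Println(system, len(contents), contents[0].Role, contents[1].Role)
}
```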
+ +Options: + +- `max_tokens` -> `MaxOutputTokens` +- `temperature` -> `Temperature` +- `prompt_cache_key` ignored for Gemini (consistent with current behavior) + +## Response Mapping + +From `GenerateContentResponse` first candidate: + +- `LLMResponse.Content` <- `resp.Text()` +- `LLMResponse.ToolCalls` <- `part.FunctionCall` entries +- `LLMResponse.Usage` <- `UsageMetadata` counts +- `LLMResponse.FinishReason` mapping: + - tool calls present -> `tool_calls` + - `MAX_TOKENS` -> `length` + - else -> `stop` + +## ThoughtSignature Compatibility (Set + Use) + +### Source field in SDK + +- `genai.Part.ThoughtSignature` (`[]byte`) alongside `Part.FunctionCall`. + +### Write path (set) + +When parsing response function calls: + +- Store signature into `ToolCall.ExtraContent.Google.ThoughtSignature`. +- Mirror same value into `ToolCall.Function.ThoughtSignature` for backward compatibility. + +### Read path (use) + +When rebuilding assistant tool-call history for the next SDK request: + +1. Read `ToolCall.ExtraContent.Google.ThoughtSignature` (preferred). +2. Fallback to `ToolCall.Function.ThoughtSignature`. +3. If missing/invalid, continue without signature. + +This guarantees compatibility across mixed old/new session data. + +## Session Serialization/Deserialization Compatibility + +Session persistence serializes `providers.Message` as JSON. + +Key compatibility facts: + +- `ToolCall.ThoughtSignature` is non-serialized (`json:"-"`). +- Serialized fields are `Function.ThoughtSignature` and `ExtraContent.Google.ThoughtSignature`. + +Compatibility strategy: + +- New code writes both serialized fields. +- New code reads both (priority: `extra_content` then `function`). +- Old session files (function-only) remain usable. +- New session files remain usable by old readers through mirrored function field. + +## `openai_compat` Cleanup After Migration + +### Safe to remove in Phase A + +- Gemini host-based `prompt_cache_key` request suppression. 
+- Gemini/Google request-side model-prefix special casing tied to Gemini routing. + +### Keep in Phase A + +- Generic response parsing support for `extra_content.google.thought_signature`. + +Rationale: this may still appear from non-gemini OpenAI-compatible gateways. + +## Testing Plan (Design-Level) + +1. Provider routing tests: + - `gemini/*` -> `*gemini_sdk.Provider` + - `google/*` -> `*gemini_sdk.Provider` + - `antigravity/*` unchanged +2. Mapping tests: + - message roles, tool declarations, max tokens, temperature +3. Thought-signature tests: + - response signature -> extra_content + function mirror + - history rebuild prefers extra_content, falls back to function +4. Session compatibility tests: + - old session payload replays correctly + - new payload round-trips via JSON +5. Regression: + - providers package tests + - full `go test ./...` + +## Acceptance Criteria + +- Gemini/Google protocols use `genai` SDK provider. +- `Provider.Chat` external behavior remains stable. +- ThoughtSignature set/use is compatible across old/new sessions. +- `antigravity` behavior unchanged. +- Full test suite passes. diff --git a/docs/plans/2026-02-28-openai-sdk-for-openai-protocol-design.md b/docs/plans/2026-02-28-openai-sdk-for-openai-protocol-design.md new file mode 100644 index 0000000000..e65938055a --- /dev/null +++ b/docs/plans/2026-02-28-openai-sdk-for-openai-protocol-design.md @@ -0,0 +1,146 @@ +# OpenAI SDK for OpenAI Protocol Design + +## Background + +The project currently uses: + +- `codex_provider` with `github.com/openai/openai-go/v3` for Codex-specific backend. +- `openai_compat` with manual HTTP JSON for OpenAI-compatible multi-provider support. + +`openai_compat` is intentionally broad and handles provider dialect differences. For protocol `openai`, we can use the official SDK with lower integration risk if we isolate it to OpenAI-only path. + +## Decision + +Adopt hybrid routing: + +1. 
`openai` protocol (API key path) uses a new SDK-backed provider. +2. Other OpenAI-compatible protocols continue using existing `openai_compat` HTTP provider. +3. `openai` oauth/token paths remain on existing Codex provider path. + +## Goals + +- Improve OpenAI protocol correctness/maintainability via official SDK. +- Avoid destabilizing non-OpenAI compatible providers. +- Preserve existing external provider interface and agent loop behavior. + +## Non-Goals + +- Full migration of all OpenAI-compatible providers to SDK. +- Removing `openai_compat`. +- Mapping non-existent SDK fields into `ReasoningContent`, `ReasoningDetails`, `ThoughtSignature` for OpenAI protocol. + +## Architecture + +### New Provider + +Create `pkg/providers/openai_sdk/provider.go` implementing `providers.LLMProvider`. + +Core construction inputs: + +- `apiKey` +- `apiBase` +- `proxy` +- `requestTimeout` +- `maxTokensField` + +Implementation uses: + +- `openai.NewClient(...)` +- `option.WithBaseURL(...)` +- `option.WithAPIKey(...)` +- `option.WithHTTPClient(...)` + +### Factory Routing + +Update `CreateProviderFromConfig`: + +- `protocol == openai` + API key path => `OpenAISDKProvider` +- `protocol == openai` + oauth/token => existing codex auth provider (unchanged) +- all other openai-compatible protocols => existing `HTTPProvider` (`openai_compat`) + +## Data Mapping + +### Request Mapping + +`providers.Message` -> `openai.ChatCompletionMessageParamUnion`: + +- `system`, `user`, `assistant`, `tool` roles +- assistant tool calls mapped where needed + +`providers.ToolDefinition` -> SDK function tools list. 
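A rough sketch of the tool-list conversion, using stand-in shapes: the real source type is `providers.ToolDefinition` and the real target is the openai-go function-tool param, so the struct fields here are assumptions for illustration only.

```go
package main

import "fmt"

// Stand-in shapes; the real source is providers.ToolDefinition and the real
// target is the openai-go function-tool param union.
type ToolDefinition struct {
	Name        string
	Description string
	Parameters  map[string]any // JSON Schema object
}

type FunctionTool struct {
	Name        string
	Description string
	Parameters  map[string]any
}

// buildChatTools mirrors the helper named in the implementation plan: the
// conversion is a straight per-definition field copy.
func buildChatTools(defs []ToolDefinition) []FunctionTool {
	tools := make([]FunctionTool, 0, len(defs))
	for _, d := range defs {
		tools = append(tools, FunctionTool(d)) // identical shape, direct conversion
	}
	return tools
}

func main() {
	tools := buildChatTools([]ToolDefinition{{
		Name:        "read_file",
		Description: "Read a file from the workspace",
		Parameters:  map[string]any{"type": "object"},
	}})
	fmt.Println(len(tools), tools[0].Name)
}
```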
+ +Options mapping: + +- `max_tokens` -> `MaxTokens` or `MaxCompletionTokens` (respect `maxTokensField`) +- `temperature` -> `Temperature` +- `prompt_cache_key` -> `PromptCacheKey` (OpenAI path only) + +### Response Mapping + +From first choice: + +- `Content` +- `ToolCalls` +- `FinishReason` +- `Usage` + +OpenAI SDK path intentionally does not map: + +- `ReasoningContent` +- `ReasoningDetails` +- `ThoughtSignature` + +These fields remain available for dialect providers on `openai_compat` path. + +## Error Handling + +- Surface SDK errors with status/type/code where available. +- Preserve current provider error semantics as much as possible (human-readable failure context). +- Keep timeout/proxy failures actionable. + +## Testing Strategy + +### Unit Tests (`openai_sdk/provider_test.go`) + +- Basic content response parsing. +- Tool call response parsing. +- Max token field routing (`max_tokens` vs `max_completion_tokens`). +- Prompt cache key inclusion. +- Timeout behavior. +- Proxy behavior. + +### Factory Tests + +- OpenAI API-key config returns `*OpenAISDKProvider`. +- OpenAI oauth/token continues existing path. +- Non-openai protocols still return `*HTTPProvider`. + +### Regression + +- Existing `openai_compat` tests remain green. +- Full test suite passes (`go test ./...`). + +## Risks and Mitigations + +1. Behavior drift between SDK and HTTP paths. + - Mitigation: focused parity tests for fields/options used by agent loop. + +2. Incomplete message/tool mapping edge cases. + - Mitigation: explicit role/tool test matrix and conservative fallback behavior. + +3. Future duplication between SDK and HTTP logic. + - Mitigation: keep SDK path narrow (OpenAI-only) and avoid over-abstracting in this iteration. + +## Rollout + +1. Introduce new provider with tests. +2. Route `openai` protocol API-key path in factory. +3. Run provider package and full suite tests. +4. Keep capability profile/override logic in `openai_compat` for non-openai protocols. 
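Rollout step 2's routing split can be sketched as a pure decision function. The `Config` shape, the string labels, and the `deepseek` example protocol are illustrative; the real factory is `CreateProviderFromConfig` in `pkg/providers` and constructs provider instances rather than returning labels.

```go
package main

import "fmt"

// Illustrative stand-in for the factory's routing inputs.
type Config struct {
	Protocol   string
	AuthMethod string // "api_key", "oauth", or "token"
}

// selectProvider encodes the hybrid split: only openai + API key moves to the
// SDK provider; oauth/token stays on the codex path; every other
// OpenAI-compatible protocol remains on the openai_compat HTTP provider.
func selectProvider(c Config) string {
	if c.Protocol == "openai" {
		if c.AuthMethod == "api_key" {
			return "openai_sdk"
		}
		return "codex"
	}
	return "openai_compat"
}

func main() {
	fmt.Println(selectProvider(Config{Protocol: "openai", AuthMethod: "api_key"}))
	fmt.Println(selectProvider(Config{Protocol: "openai", AuthMethod: "oauth"}))
	fmt.Println(selectProvider(Config{Protocol: "deepseek", AuthMethod: "api_key"}))
}
```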
+ +## Acceptance Criteria + +- OpenAI protocol (API key path) no longer uses `openai_compat`. +- Non-openai protocols continue working on `openai_compat` unchanged. +- No regression in tool call flow and usage accounting. +- Test suite passes. diff --git a/docs/plans/2026-02-28-openai-sdk-for-openai-protocol-implementation.md b/docs/plans/2026-02-28-openai-sdk-for-openai-protocol-implementation.md new file mode 100644 index 0000000000..fc44ace292 --- /dev/null +++ b/docs/plans/2026-02-28-openai-sdk-for-openai-protocol-implementation.md @@ -0,0 +1,221 @@ +# OpenAI SDK for OpenAI Protocol Implementation Plan + +> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task. + +**Goal:** Route `openai` protocol (API-key path) to a new SDK-backed provider using `openai-go/v3`, while keeping all other OpenAI-compatible protocols on existing `openai_compat` HTTP provider. + +**Architecture:** Add a dedicated `openai_sdk` provider implementing `LLMProvider`, map shared message/tool/options into Chat Completions params, and update factory routing to select SDK only for OpenAI API-key path. Preserve codex oauth/token behavior and leave non-openai protocols untouched. + +**Tech Stack:** Go 1.25+, `github.com/openai/openai-go/v3`, `httptest`, table-driven tests. + +--- + +Use @superpowers/test-driven-development for each task and @superpowers/verification-before-completion before final handoff. 
+ +### Task 1: Scaffold `openai_sdk` Provider With Minimal Failing Test + +**Files:** +- Create: `pkg/providers/openai_sdk/provider.go` +- Create: `pkg/providers/openai_sdk/provider_test.go` + +**Step 1: Write failing test for basic text response parsing** + +```go +func TestOpenAISDKProvider_Chat_BasicContent(t *testing.T) { + // httptest server returns minimal chat.completions JSON + // assert response.Content and FinishReason +} +``` + +**Step 2: Run test to verify it fails** + +Run: `go test ./pkg/providers/openai_sdk -run BasicContent -v` +Expected: FAIL (provider not implemented / compile errors). + +**Step 3: Implement minimal provider constructor + Chat skeleton** + +- Create client with `option.WithBaseURL`, `option.WithAPIKey`, `option.WithHTTPClient`. +- Call `client.Chat.Completions.New(...)` with minimal params. +- Parse first choice content + finish reason. + +**Step 4: Run test to verify it passes** + +Run: `go test ./pkg/providers/openai_sdk -run BasicContent -v` +Expected: PASS. + +**Step 5: Commit** + +```bash +git add pkg/providers/openai_sdk/provider.go pkg/providers/openai_sdk/provider_test.go +git commit -m "feat(providers): add openai sdk provider skeleton" +``` + +### Task 2: Add Message/Tool Mapping Coverage + +**Files:** +- Modify: `pkg/providers/openai_sdk/provider.go` +- Modify: `pkg/providers/openai_sdk/provider_test.go` + +**Step 1: Write failing tests for roles and tool calls** + +- user/system/assistant/tool message mapping. +- response tool_calls -> internal `providers.ToolCall` mapping. + +**Step 2: Run tests to verify they fail** + +Run: `go test ./pkg/providers/openai_sdk -run 'Message|ToolCall' -v` +Expected: FAIL on mapping assertions. + +**Step 3: Implement mapping helpers** + +- `buildChatMessages(...)` +- `buildChatTools(...)` +- `parseChoiceToolCalls(...)` + +**Step 4: Run tests to verify pass** + +Run: `go test ./pkg/providers/openai_sdk -run 'Message|ToolCall' -v` +Expected: PASS. 
+ +**Step 5: Commit** + +```bash +git add pkg/providers/openai_sdk/provider.go pkg/providers/openai_sdk/provider_test.go +git commit -m "feat(openai-sdk): map messages and tool calls for chat completions" +``` + +### Task 3: Add Options Mapping (`max_tokens`, `temperature`, `prompt_cache_key`) + +**Files:** +- Modify: `pkg/providers/openai_sdk/provider.go` +- Modify: `pkg/providers/openai_sdk/provider_test.go` + +**Step 1: Write failing tests for options mapping** + +- `max_tokens` with default field -> `max_tokens`. +- `max_tokens` with config `max_completion_tokens`. +- `temperature` mapped. +- `prompt_cache_key` mapped for OpenAI path. + +**Step 2: Run tests to verify failure** + +Run: `go test ./pkg/providers/openai_sdk -run 'MaxTokens|Temperature|PromptCacheKey' -v` +Expected: FAIL. + +**Step 3: Implement options mapping** + +- Add `maxTokensField` handling in provider config. +- Apply SDK params fields accordingly. + +**Step 4: Run tests to verify pass** + +Run: `go test ./pkg/providers/openai_sdk -run 'MaxTokens|Temperature|PromptCacheKey' -v` +Expected: PASS. + +**Step 5: Commit** + +```bash +git add pkg/providers/openai_sdk/provider.go pkg/providers/openai_sdk/provider_test.go +git commit -m "feat(openai-sdk): map max tokens, temperature, and prompt cache key" +``` + +### Task 4: Add Timeout/Proxy and Error-Path Coverage + +**Files:** +- Modify: `pkg/providers/openai_sdk/provider.go` +- Modify: `pkg/providers/openai_sdk/provider_test.go` + +**Step 1: Write failing tests** + +- Request timeout using slow test server. +- Proxy transport setup verification. +- Non-200 response error formatting. + +**Step 2: Run tests to verify failure** + +Run: `go test ./pkg/providers/openai_sdk -run 'Timeout|Proxy|HTTPError' -v` +Expected: FAIL. + +**Step 3: Implement HTTP client construction and robust errors** + +- Build `http.Client` with timeout + optional proxy transport. +- Surface SDK/API errors with actionable messages. 
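A stdlib-only sketch of the client construction described in Step 3 (the helper name `newHTTPClient` is an assumption; the real provider hands the resulting client to the SDK via `option.WithHTTPClient`):

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"time"
)

// newHTTPClient builds a client with a hard request timeout and, when a proxy
// URL is configured, a proxy-aware transport. Invalid proxy URLs fail fast
// with an actionable error instead of surfacing later as opaque dial failures.
func newHTTPClient(proxy string, timeout time.Duration) (*http.Client, error) {
	client := &http.Client{Timeout: timeout}
	if proxy != "" {
		proxyURL, err := url.Parse(proxy)
		if err != nil {
			return nil, fmt.Errorf("invalid proxy %q: %w", proxy, err)
		}
		client.Transport = &http.Transport{Proxy: http.ProxyURL(proxyURL)}
	}
	return client, nil
}

func main() {
	client, err := newHTTPClient("http://127.0.0.1:8080", 30*time.Second)
	if err != nil {
		panic(err)
	}
	fmt.Println(client.Timeout, client.Transport != nil)
}
```

The client-level timeout composes with any per-request context deadline, so whichever fires first cancels the request.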
+ +**Step 4: Run tests to verify pass** + +Run: `go test ./pkg/providers/openai_sdk -run 'Timeout|Proxy|HTTPError' -v` +Expected: PASS. + +**Step 5: Commit** + +```bash +git add pkg/providers/openai_sdk/provider.go pkg/providers/openai_sdk/provider_test.go +git commit -m "feat(openai-sdk): support proxy, timeout, and error handling" +``` + +### Task 5: Wire Factory Routing for `openai` API-Key Path + +**Files:** +- Modify: `pkg/providers/factory_provider.go` +- Modify: `pkg/providers/factory_provider_test.go` +- Optionally modify: `pkg/providers/http_provider.go` (if helper reuse is needed) + +**Step 1: Write failing factory tests** + +- `openai` + API key => returns `*OpenAISDKProvider`. +- `openai` + oauth/token => existing codex provider path unchanged. +- non-openai protocols => existing `*HTTPProvider` path unchanged. + +**Step 2: Run tests to verify failure** + +Run: `go test ./pkg/providers -run 'CreateProviderFromConfig_.*OpenAI.*' -v` +Expected: FAIL until routing updated. + +**Step 3: Implement routing logic** + +- In `case "openai":` branch, keep auth-method split. +- API-key HTTP path creates SDK provider instead of HTTPProvider. + +**Step 4: Run tests to verify pass** + +Run: `go test ./pkg/providers -run 'CreateProviderFromConfig_.*OpenAI.*' -v` +Expected: PASS. + +**Step 5: Commit** + +```bash +git add pkg/providers/factory_provider.go pkg/providers/factory_provider_test.go +git commit -m "refactor(factory): route openai api-key protocol to sdk provider" +``` + +### Task 6: Full Verification and Regression Scan + +**Files:** +- Modify (if needed): tests/docs only + +**Step 1: Run focused package tests** + +Run: `go test ./pkg/providers/openai_sdk ./pkg/providers/...` +Expected: PASS. + +**Step 2: Run full test suite** + +Run: `go test ./...` +Expected: PASS. 
+ +**Step 3: Verify routing boundaries** + +Run: `rg -n "case \"openai\"|openai_sdk|openai_compat" pkg/providers/factory_provider.go pkg/providers` +Expected: openai API-key path points to sdk provider; other protocols remain on openai_compat. + +**Step 4: Inspect changes and ensure clean state** + +Run: `git status --short` +Expected: clean. + +**Step 5: Commit any final test/docs adjustments** + +```bash +git add -A +git commit -m "test/providers: finalize openai sdk routing regression coverage" +``` diff --git a/go.mod b/go.mod index 7892cade68..80b1dcd114 100644 --- a/go.mod +++ b/go.mod @@ -8,6 +8,7 @@ require ( github.com/bwmarrin/discordgo v0.29.0 github.com/caarlos0/env/v11 v11.3.1 github.com/chzyer/readline v1.5.1 + github.com/gdamore/tcell/v2 v2.13.8 github.com/google/uuid v1.6.0 github.com/gorilla/websocket v1.5.3 github.com/larksuite/oapi-sdk-go/v3 v3.5.3 @@ -15,6 +16,7 @@ require ( github.com/mymmrac/telego v1.6.0 github.com/open-dingtalk/dingtalk-stream-sdk-go v0.9.1 github.com/openai/openai-go/v3 v3.22.0 + github.com/rivo/tview v0.42.0 github.com/slack-go/slack v0.17.3 github.com/spf13/cobra v1.10.2 github.com/stretchr/testify v1.11.1 @@ -22,11 +24,15 @@ require ( go.mau.fi/whatsmeow v0.0.0-20260219150138-7ae702b1eed4 golang.org/x/oauth2 v0.35.0 golang.org/x/time v0.14.0 + google.golang.org/genai v1.48.0 google.golang.org/protobuf v1.36.11 modernc.org/sqlite v1.46.1 ) require ( + cloud.google.com/go v0.116.0 // indirect + cloud.google.com/go/auth v0.9.3 // indirect + cloud.google.com/go/compute/metadata v0.5.0 // indirect filippo.io/edwards25519 v1.1.0 // indirect github.com/beeper/argo-go v1.1.2 // indirect github.com/coder/websocket v1.8.14 // indirect @@ -34,7 +40,10 @@ require ( 
github.com/dustin/go-humanize v1.0.1 // indirect github.com/elliotchance/orderedmap/v3 v3.1.0 // indirect github.com/gdamore/encoding v1.0.1 // indirect - github.com/gdamore/tcell/v2 v2.13.8 // indirect + github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect + github.com/google/go-cmp v0.7.0 // indirect + github.com/google/s2a-go v0.1.8 // indirect + github.com/googleapis/enterprise-certificate-proxy v0.3.4 // indirect github.com/inconshreveable/mousetrap v1.1.0 // indirect github.com/lucasb-eyer/go-colorful v1.3.0 // indirect github.com/mattn/go-colorable v0.1.14 // indirect @@ -43,16 +52,18 @@ require ( github.com/petermattis/goid v0.0.0-20260113132338-7c7de50cc741 // indirect github.com/pmezard/go-difflib v1.0.0 // indirect github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect - github.com/rivo/tview v0.42.0 // indirect github.com/rivo/uniseg v0.4.7 // indirect github.com/rs/zerolog v1.34.0 // indirect github.com/spf13/pflag v1.0.10 // indirect github.com/vektah/gqlparser/v2 v2.5.27 // indirect go.mau.fi/libsignal v0.2.1 // indirect go.mau.fi/util v0.9.6 // indirect + go.opencensus.io v0.24.0 // indirect golang.org/x/exp v0.0.0-20260212183809-81e46e3db34a // indirect golang.org/x/term v0.40.0 // indirect golang.org/x/text v0.34.0 // indirect + google.golang.org/genproto/googleapis/rpc v0.0.0-20240903143218-8af14fe29dc1 // indirect + google.golang.org/grpc v1.66.2 // indirect gopkg.in/yaml.v3 v3.0.1 // indirect modernc.org/libc v1.67.6 // indirect modernc.org/mathutil v1.7.1 // indirect diff --git a/go.sum b/go.sum index d1ee1d6298..d3ea32ceb1 100644 --- a/go.sum +++ b/go.sum @@ -1,6 +1,14 @@ 
+cloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw= +cloud.google.com/go v0.116.0 h1:B3fRrSDkLRt5qSHWe40ERJvhvnQwdZiHu0bJOpldweE= +cloud.google.com/go v0.116.0/go.mod h1:cEPSRWPzZEswwdr9BxE6ChEn01dWlTaF05LiC2Xs70U= +cloud.google.com/go/auth v0.9.3 h1:VOEUIAADkkLtyfr3BLa3R8Ed/j6w1jTBmARx+wb5w5U= +cloud.google.com/go/auth v0.9.3/go.mod h1:7z6VY+7h3KUdRov5F1i8NDP5ZzWKYmEPO842BgCsmTk= cloud.google.com/go/compute/metadata v0.3.0/go.mod h1:zFmK7XCadkQkj6TtorcaGlCW1hT1fIilQDwofLpJ20k= +cloud.google.com/go/compute/metadata v0.5.0 h1:Zr0eK8JbFv6+Wi4ilXAR8FJ3wyNdpxHKJNPos6LTZOY= +cloud.google.com/go/compute/metadata v0.5.0/go.mod h1:aHnloV2TPI38yx4s9+wAZhHykWvVCfu7hQbF+9CWoiY= filippo.io/edwards25519 v1.1.0 h1:FNf4tywRC1HmFuKW5xopWpigGjJKiJSV0Cqo0cJWDaA= filippo.io/edwards25519 v1.1.0/go.mod h1:BxyFTGdWcka3PhytdK4V28tE5sGfRvvvRV7EaN4VDT4= +github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU= github.com/DATA-DOG/go-sqlmock v1.5.2 h1:OcvFkGmslmlZibjAjaHm3L//6LiuBgolP7OputlJIzU= github.com/DATA-DOG/go-sqlmock v1.5.2/go.mod h1:88MAG/4G7SMwSE3CeA0ZKzrT5CiOU3OJ+JlNzwDqpNU= github.com/adhocore/gronx v1.19.6 h1:5KNVcoR9ACgL9HhEqCm5QXsab/gI4QDIybTAWcXDKDc= @@ -25,6 +33,7 @@ github.com/bytedance/sonic/loader v0.5.0 h1:gXH3KVnatgY7loH5/TkeVyXPfESoqSBSBEiD github.com/bytedance/sonic/loader v0.5.0/go.mod h1:AR4NYCk5DdzZizZ5djGqQ92eEhCCcdf5x77udYiSJRo= github.com/caarlos0/env/v11 v11.3.1 h1:cArPWC15hWmEt+gWk7YBi7lEXTXCvpaSdCiZE2X5mCA= github.com/caarlos0/env/v11 v11.3.1/go.mod h1:qupehSf/Y0TUTsxKywqRt/vJjN5nz6vauiYEUUr8P4U= +github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU= github.com/cespare/xxhash/v2 v2.1.2/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs= 
github.com/cespare/xxhash/v2 v2.2.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs= github.com/chzyer/logex v1.2.1 h1:XHDu3E6q+gdHgsdTPH6ImJMIp436vR6MPtH8gP05QzM= @@ -33,8 +42,10 @@ github.com/chzyer/readline v1.5.1 h1:upd/6fQk4src78LMRzh5vItIt361/o4uq553V8B5sGI github.com/chzyer/readline v1.5.1/go.mod h1:Eh+b79XXUwfKfcPLepksvw2tcLE/Ct21YObkaSkeBlk= github.com/chzyer/test v1.0.0 h1:p3BQDXSxOhOG0P9z6/hGnII4LGiEPOYBhs8asl/fC04= github.com/chzyer/test v1.0.0/go.mod h1:2JlltgoNkt4TW/z9V/IzDdFaMTM2JPIi26O1pF38GC8= +github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw= github.com/cloudwego/base64x v0.1.6 h1:t11wG9AECkCDk5fMSoxmufanudBtJ+/HemLstXDLI2M= github.com/cloudwego/base64x v0.1.6/go.mod h1:OFcloc187FXDaYHvrNIjxSe8ncn0OOM8gEHfghB2IPU= +github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGXZJjfX53e64911xZQV5JYwmTeXPW+k8Sc= github.com/coder/websocket v1.8.14 h1:9L0p0iKiNOibykf283eHkKUHHrpG7f65OE3BhhO7v9g= github.com/coder/websocket v1.8.14/go.mod h1:NX3SzP+inril6yawo5CQXx8+fk145lPDC6pumgx0mVg= github.com/coreos/go-systemd/v22 v22.5.0/go.mod h1:Y58oyj3AT4RCenI/lSvhwexgC+NSVTIJ3seZv2GcEnc= @@ -48,6 +59,10 @@ github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkp github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto= github.com/elliotchance/orderedmap/v3 v3.1.0 h1:j4DJ5ObEmMBt/lcwIecKcoRxIQUEnw0L804lXYDt/pg= github.com/elliotchance/orderedmap/v3 v3.1.0/go.mod h1:G+Hc2RwaZvJMcS4JpGCOyViCnGeKf0bTYCGTO4uhjSo= +github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4= 
+github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4= +github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98= +github.com/envoyproxy/protoc-gen-validate v0.1.0/go.mod h1:iSmxcyjqTsJpI2R4NaDN7+kN2VEUnK/pcBlmesArF7c= github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo= github.com/fsnotify/fsnotify v1.4.9/go.mod h1:znqG4EE+3YCdAaPaxE2ZRY/06pZUdp0tY4IgpuI1SZQ= github.com/gdamore/encoding v1.0.1 h1:YzKZckdBL6jVt2Gc+5p82qhrGiqMdG/eNs6Wy0u3Uhw= @@ -66,18 +81,29 @@ github.com/go-test/deep v1.1.1/go.mod h1:5C2ZWiW0ErCdrYzpqxLbTX7MG14M9iiw8DgHncV github.com/godbus/dbus/v5 v5.0.4/go.mod h1:xhWf0FNVPg57R7Z0UbKHbJfkEywrmjJnf7w5xrFpKfA= github.com/gogo/protobuf v1.3.2 h1:Ov1cvc58UF3b5XjBnZv7+opcTcQFZebYjWzi34vdm4Q= github.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q= +github.com/golang/glog v0.0.0-20160126235308-23def4e6c14b/go.mod h1:SBH7ygxi8pfUlaOkMMuAQtPIUF8ecWP5IEl/CR7VP2Q= +github.com/golang/groupcache v0.0.0-20200121045136-8c9f03a8e57e/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc= +github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da h1:oI5xCqsCo564l8iNU+DwB5epxmsaqB+rhGL0m5jtYqE= +github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc= +github.com/golang/mock v1.1.1/go.mod h1:oTYuIxOrZwtPieC+H1uAHpcLFnEyAGVDL/k47Jfbm0A= github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U= +github.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U= 
github.com/golang/protobuf v1.4.0-rc.1/go.mod h1:ceaxUfeHdC40wWswd/P6IGgMaK3YpKi5j83Wpe3EHw8= github.com/golang/protobuf v1.4.0-rc.1.0.20200221234624-67d41d38c208/go.mod h1:xKAWHe0F5eneWXFV3EuXVDTCmh+JuBKY0li0aMyXATA= github.com/golang/protobuf v1.4.0-rc.2/go.mod h1:LlEzMj4AhA7rCAGe4KMBDvJI+AwstrUpVNzEA03Pprs= github.com/golang/protobuf v1.4.0-rc.4.0.20200313231945-b860323f09d0/go.mod h1:WU3c8KckQ9AFe+yFwt9sWVRKCVIyN9cPHBJSNnbL67w= github.com/golang/protobuf v1.4.0/go.mod h1:jodUvKwWbYaEsadDk5Fwe5c77LiNKVO9IDvqG2KuDX0= +github.com/golang/protobuf v1.4.1/go.mod h1:U8fpvMrcmy5pZrNK1lt4xCsGvpyWQ/VVv6QDs8UjoX8= github.com/golang/protobuf v1.4.2/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI= +github.com/golang/protobuf v1.4.3/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI= github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk= github.com/golang/protobuf v1.5.2/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY= +github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M= github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU= github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU= github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= +github.com/google/go-cmp v0.5.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= +github.com/google/go-cmp v0.5.3/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= github.com/google/go-cmp v0.5.6/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= 
github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY= @@ -87,9 +113,14 @@ github.com/google/jsonschema-go v0.4.2 h1:tmrUohrwoLZZS/P3x7ex0WAVknEkBZM46iALbc github.com/google/jsonschema-go v0.4.2/go.mod h1:r5quNTdLOYEz95Ru18zA0ydNbBuYoo9tgaYcxEYhJVE= github.com/google/pprof v0.0.0-20250317173921-a4b03ec1a45e h1:ijClszYn+mADRFY17kjQEVQ1XRhq2/JR1M3sGqeJoxs= github.com/google/pprof v0.0.0-20250317173921-a4b03ec1a45e/go.mod h1:boTsfXsheKC2y+lKOCMpSfarhxDeIzfZG1jqGcPl3cA= +github.com/google/s2a-go v0.1.8 h1:zZDs9gcbt9ZPLV0ndSyQk6Kacx2g/X+SKYovpnz3SMM= +github.com/google/s2a-go v0.1.8/go.mod h1:6iNWHTpQ+nfNRN5E00MSdfDwVesa8hhS32PhPO8deJA= +github.com/google/uuid v1.1.2/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo= github.com/google/uuid v1.3.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo= github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0= github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo= +github.com/googleapis/enterprise-certificate-proxy v0.3.4 h1:XYIDZApgAnrN1c855gTgghdIA6Stxb52D5RnLI1SLyw= +github.com/googleapis/enterprise-certificate-proxy v0.3.4/go.mod h1:YKe7cfqYXjKGpGvmSg28/fFvhNzinZQm8DGnaburhGA= github.com/gorilla/websocket v1.4.2/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE= github.com/gorilla/websocket v1.5.0/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE= github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg= @@ -152,6 +183,7 @@ github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsK github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0= 
 github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
 github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
+github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
 github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec h1:W09IVJc94icq4NjY3clb7Lk8O1qJ8BdBEF8z0ibU0rE=
 github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=
 github.com/rivo/tview v0.42.0 h1:b/ftp+RxtDsHSaynXTbJb+/n/BxDEi+W3UfF5jILK6c=
@@ -221,6 +253,8 @@ go.mau.fi/util v0.9.6 h1:2nsvxm49KhI3wrFltr0+wSUBlnQ4CMtykuELjpIU+ts=
 go.mau.fi/util v0.9.6/go.mod h1:sIJpRH7Iy5Ad1SBuxQoatxtIeErgzxCtjd/2hCMkYMI=
 go.mau.fi/whatsmeow v0.0.0-20260219150138-7ae702b1eed4 h1:hsmlwsM+VqfF70cpdZEeIUKer2XWCQmQPK0u0tHy3ZQ=
 go.mau.fi/whatsmeow v0.0.0-20260219150138-7ae702b1eed4/go.mod h1:mXCRFyPEPn4jqWz6Afirn8vY7DpHCPnlKq6I2cWwFHM=
+go.opencensus.io v0.24.0 h1:y73uSU6J157QMP2kn2r30vwW1A2W2WFwSCGnAVxeaD0=
+go.opencensus.io v0.24.0/go.mod h1:vNK8G9p7aAivkbmorf4v+7Hgx+Zs0yY+0fOtgBfjQKo=
 go.uber.org/mock v0.6.0 h1:hyF9dfmbgIX5EfOdasqLsWD6xqpNZlXblLB/Dbnwv3Y=
 go.uber.org/mock v0.6.0/go.mod h1:KiVJ4BqZJaMj4svdfmHM0AUx4NJYO8ZNpPnZn1Z+BBU=
 go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
@@ -234,20 +268,29 @@ golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5y
 golang.org/x/crypto v0.16.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=
 golang.org/x/crypto v0.48.0 h1:/VRzVqiRSggnhY7gNRxPauEQ5Drw9haKdM0jqfcCFts=
 golang.org/x/crypto v0.48.0/go.mod h1:r0kV5h3qnFPlQnBSrULhlsRfryS2pmewsg+XfMgkVos=
+golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
 golang.org/x/exp v0.0.0-20260212183809-81e46e3db34a h1:ovFr6Z0MNmU7nH8VaX5xqw+05ST2uO1exVfZPVqRC5o=
 golang.org/x/exp v0.0.0-20260212183809-81e46e3db34a/go.mod h1:K79w1Vqn7PoiZn+TkNpx3BUWUQksGO3JcVX6qIjytmA=
+golang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=
+golang.org/x/lint v0.0.0-20190227174305-5b3e6a55c961/go.mod h1:wehouNa3lNwaWXcvxsM5YxQ5yQlVC4a0KAMCusXpPoU=
+golang.org/x/lint v0.0.0-20190313153728-d0100b6bd8b3/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=
 golang.org/x/mod v0.2.0/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=
 golang.org/x/mod v0.3.0/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=
 golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
 golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
 golang.org/x/mod v0.33.0 h1:tHFzIWbBifEmbwtGz65eaWyGiGZatSrT9prnU8DbVL8=
 golang.org/x/mod v0.33.0/go.mod h1:swjeQEj+6r7fODbD2cqrnje9PnziFuw4bmLbBZFrQ5w=
+golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
 golang.org/x/net v0.0.0-20180906233101-161cd47e91fd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20190213061140-3a22650c66bd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
+golang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
 golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
 golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
 golang.org/x/net v0.0.0-20200226121028-0de0cce0169b/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
 golang.org/x/net v0.0.0-20200520004742-59133d7f0dd7/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=
 golang.org/x/net v0.0.0-20201021035429-f5854403a974/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=
+golang.org/x/net v0.0.0-20201110031124-69a78807bb2b/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=
 golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
 golang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=
 golang.org/x/net v0.0.0-20210428140749-89ef3d95e781/go.mod h1:OJAsFXCWl8Ukc7SiCT/9KSuxbyM7479/AVlXFRxuMCk=
@@ -257,10 +300,12 @@ golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
 golang.org/x/net v0.19.0/go.mod h1:CfAk/cbD4CthTvqiEl8NpboMuiuOYsAr/7NOjZJtv1U=
 golang.org/x/net v0.50.0 h1:ucWh9eiCGyDR3vtzso0WMQinm2Dnt8cFMuQa9K33J60=
 golang.org/x/net v0.50.0/go.mod h1:UgoSli3F/pBgdJBHCTc+tp3gmrU4XswgGRgtnwWTfyM=
+golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
 golang.org/x/oauth2 v0.23.0/go.mod h1:XYTD2NtWslqkgxebSiOHnXEap4TF09sJSc7H1sXbhtI=
 golang.org/x/oauth2 v0.35.0 h1:Mv2mzuHuZuY2+bkyWXIHMfhNdJAdwW3FuWeCPYN5GVQ=
 golang.org/x/oauth2 v0.35.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
 golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
 golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
 golang.org/x/sync v0.0.0-20190911185100-cd5d95a43a6e/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
 golang.org/x/sync v0.0.0-20201020160332-67f06af15bc9/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@@ -268,6 +313,7 @@ golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJ
 golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
 golang.org/x/sync v0.19.0 h1:vV+1eWNmZ5geRlYjzm2adRgW2/mcpevXNg50YZtPCE4=
 golang.org/x/sync v0.19.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
+golang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
 golang.org/x/sys v0.0.0-20180909124046-d0be0721c37e/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
 golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
 golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@@ -311,6 +357,10 @@ golang.org/x/text v0.34.0/go.mod h1:homfLqTYRFyVYemLBFl5GgL/DWEiH5wcsQ5gSh1yziA=
 golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
 golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
 golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
+golang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
+golang.org/x/tools v0.0.0-20190226205152-f727befe758c/go.mod h1:9Yl7xja0Znq3iFh3HoIrodX9oNMXvdceNzlUR8zjMvY=
+golang.org/x/tools v0.0.0-20190311212946-11955173bddd/go.mod h1:LCzVGOaR6xXOjkQ3onu1FJEFr0SW1gC7cKk1uF8kGRs=
+golang.org/x/tools v0.0.0-20190524140312-2c0ae7006135/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=
 golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
 golang.org/x/tools v0.0.0-20200619180055-7c47624df98f/go.mod h1:EkVYQZoAsY45+roYkvgYkIh4xh/qjgUK9TdY2XT94GE=
 golang.org/x/tools v0.0.0-20201224043029-2b0845dc783e/go.mod h1:emZCQorbCU4vsT4fOWvOPXz4eW1wZW4PmDk9uLelYpA=
@@ -323,12 +373,31 @@ golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8T
 golang.org/x/xerrors v0.0.0-20191011141410-1b5146add898/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
 golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
 golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
+google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
+google.golang.org/genai v1.48.0 h1:1vb15G291wAjJJueisMDpUhssljhEdJU2t5qTidrVPs=
+google.golang.org/genai v1.48.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
+google.golang.org/genproto v0.0.0-20180817151627-c66870c02cf8/go.mod h1:JiN7NxoALGmiZfu7CAH4rXhgtRTLTxftemlI0sWmxmc=
+google.golang.org/genproto v0.0.0-20190819201941-24fa4b261c55/go.mod h1:DMBHOl98Agz4BDEuKkezgsaosCRResVns1a3J2ZsMNc=
+google.golang.org/genproto v0.0.0-20200526211855-cb27e3aa2013/go.mod h1:NbSheEEYHJ7i3ixzK3sjbqSGDJWnxyFXZblF3eUsNvo=
+google.golang.org/genproto/googleapis/rpc v0.0.0-20240903143218-8af14fe29dc1 h1:pPJltXNxVzT4pK9yD8vR9X75DaWYYmLGMsEvBfFQZzQ=
+google.golang.org/genproto/googleapis/rpc v0.0.0-20240903143218-8af14fe29dc1/go.mod h1:UqMtugtsSgubUsoxbuAoiCXvqvErP7Gf0so0mK9tHxU=
+google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
+google.golang.org/grpc v1.23.0/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
+google.golang.org/grpc v1.25.1/go.mod h1:c3i+UQWmh7LiEpx4sFZnkU36qjEYZ0imhYfXVyQciAY=
+google.golang.org/grpc v1.27.0/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=
+google.golang.org/grpc v1.33.2/go.mod h1:JMHMWHQWaTccqQQlmk3MJZS+GWXOdAesneDmEnv2fbc=
+google.golang.org/grpc v1.66.2 h1:3QdXkuq3Bkh7w+ywLdLvM56cmGvQHUMZpiCzt6Rqaoo=
+google.golang.org/grpc v1.66.2/go.mod h1:s3/l6xSSCURdVfAnL+TqCNMyTDAGN6+lZeVxnZR128Y=
 google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
 google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
 google.golang.org/protobuf v0.0.0-20200228230310-ab0ca4ff8a60/go.mod h1:cfTl7dwQJ+fmap5saPgwCLgHXTUD7jkjRqWcaiX5VyM=
 google.golang.org/protobuf v1.20.1-0.20200309200217-e05f789c0967/go.mod h1:A+miEFZTKqfCUM6K7xSMQL9OKL/b6hQv+e19PK+JZNE=
 google.golang.org/protobuf v1.21.0/go.mod h1:47Nbq4nVaFHyn7ilMalzfO3qCViNmqZ2kzikPIcrTAo=
+google.golang.org/protobuf v1.22.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
 google.golang.org/protobuf v1.23.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
+google.golang.org/protobuf v1.23.1-0.20200526195155-81db48ad09cc/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
+google.golang.org/protobuf v1.25.0/go.mod h1:9JNX74DMeImyA3h4bdi1ymwjUzf21/xIlbajtzgsN7c=
 google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
 google.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
 google.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=
@@ -347,6 +416,8 @@ gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
 gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
 gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
 gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
+honnef.co/go/tools v0.0.0-20190102054323-c2f93a96b099/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
+honnef.co/go/tools v0.0.0-20190523083050-ea95bdfd59fc/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
 modernc.org/cc/v4 v4.27.1 h1:9W30zRlYrefrDV2JE2O8VDtJ1yPGownxciz5rrbQZis=
 modernc.org/cc/v4 v4.27.1/go.mod h1:uVtb5OGqUKpoLWhqwNQo/8LwvoiEBLvZXIQ/SmO6mL0=
 modernc.org/ccgo/v4 v4.30.1 h1:4r4U1J6Fhj98NKfSjnPUN7Ze2c6MnAdL0hWw6+LrJpc=
diff --git a/pkg/providers/factory_provider.go b/pkg/providers/factory_provider.go
index 53f7a08a0d..38a6d4b53e 100644
--- a/pkg/providers/factory_provider.go
+++ b/pkg/providers/factory_provider.go
@@ -8,8 +8,11 @@ package providers
 import (
 	"fmt"
 	"strings"
+	"time"
 
 	"github.com/sipeed/picoclaw/pkg/config"
+	"github.com/sipeed/picoclaw/pkg/providers/gemini_sdk"
+	"github.com/sipeed/picoclaw/pkg/providers/openai_sdk"
 )
 
 // createClaudeAuthProvider creates a Claude provider using OAuth credentials from auth store.
@@ -76,7 +79,7 @@ func CreateProviderFromConfig(cfg *config.ModelConfig) (LLMProvider, string, err
 		}
 		return provider, modelID, nil
 	}
-	// OpenAI with API key
+	// OpenAI with API key (official SDK path)
 	if cfg.APIKey == "" && cfg.APIBase == "" {
 		return nil, "", fmt.Errorf("api_key or api_base is required for HTTP-based protocol %q", protocol)
 	}
@@ -84,15 +87,14 @@ func CreateProviderFromConfig(cfg *config.ModelConfig) (LLMProvider, string, err
 	if apiBase == "" {
 		apiBase = getDefaultAPIBase(protocol)
 	}
-	return NewHTTPProviderWithMaxTokensFieldAndRequestTimeout(
+	return openai_sdk.NewProvider(
 		cfg.APIKey,
 		apiBase,
 		cfg.Proxy,
-		cfg.MaxTokensField,
-		cfg.RequestTimeout,
+		openai_sdk.WithRequestTimeout(time.Duration(cfg.RequestTimeout)*time.Second),
 	), modelID, nil
-	case "openrouter", "groq", "zhipu", "gemini", "nvidia",
+	case "openrouter", "groq", "zhipu", "nvidia",
 		"ollama", "moonshot", "shengsuanyun", "deepseek",
 		"cerebras", "volcengine", "vllm", "qwen", "mistral":
 		// All other OpenAI-compatible HTTP providers
@@ -111,6 +113,21 @@ func CreateProviderFromConfig(cfg *config.ModelConfig) (LLMProvider, string, err
 			cfg.RequestTimeout,
 		), modelID, nil
+	case "gemini", "google":
+		if cfg.APIKey == "" && cfg.APIBase == "" {
+			return nil, "", fmt.Errorf("api_key or api_base is required for HTTP-based protocol %q", protocol)
+		}
+		apiBase := cfg.APIBase
+		if apiBase == "" {
+			apiBase = getDefaultAPIBase(protocol)
+		}
+		return gemini_sdk.NewProvider(
+			cfg.APIKey,
+			apiBase,
+			cfg.Proxy,
+			gemini_sdk.WithRequestTimeout(time.Duration(cfg.RequestTimeout)*time.Second),
+		), modelID, nil
+
 	case "anthropic":
 		if cfg.AuthMethod == "oauth" || cfg.AuthMethod == "token" {
 			// Use OAuth credentials from auth store
@@ -184,7 +201,7 @@ func getDefaultAPIBase(protocol string) string {
 		return "https://api.groq.com/openai/v1"
 	case "zhipu":
 		return "https://open.bigmodel.cn/api/paas/v4"
-	case "gemini":
+	case "gemini", "google":
 		return "https://generativelanguage.googleapis.com/v1beta"
 	case "nvidia":
 		return "https://integrate.api.nvidia.com/v1"
diff --git a/pkg/providers/factory_provider_test.go b/pkg/providers/factory_provider_test.go
index e0c0eddefc..9ab60cb412 100644
--- a/pkg/providers/factory_provider_test.go
+++ b/pkg/providers/factory_provider_test.go
@@ -13,6 +13,8 @@ import (
 	"time"
 
 	"github.com/sipeed/picoclaw/pkg/config"
+	"github.com/sipeed/picoclaw/pkg/providers/gemini_sdk"
+	"github.com/sipeed/picoclaw/pkg/providers/openai_sdk"
 )
 
 func TestExtractProtocol(t *testing.T) {
@@ -94,6 +96,9 @@ func TestCreateProviderFromConfig_OpenAI(t *testing.T) {
 	if provider == nil {
 		t.Fatal("CreateProviderFromConfig() returned nil provider")
 	}
+	if _, ok := provider.(*openai_sdk.Provider); !ok {
+		t.Fatalf("expected *openai_sdk.Provider, got %T", provider)
+	}
 	if modelID != "gpt-4o" {
 		t.Errorf("modelID = %q, want %q", modelID, "gpt-4o")
 	}
@@ -105,6 +110,7 @@ func TestCreateProviderFromConfig_DefaultAPIBase(t *testing.T) {
 		protocol string
 	}{
 		{"openai", "openai"},
+		{"gemini", "gemini"},
 		{"groq", "groq"},
 		{"openrouter", "openrouter"},
 		{"cerebras", "cerebras"},
@@ -127,9 +133,22 @@
 				t.Fatalf("CreateProviderFromConfig() error = %v", err)
 			}
 
-			// Verify we got an HTTPProvider for all these protocols
+			if tt.protocol == "openai" {
+				if _, ok := provider.(*openai_sdk.Provider); !ok {
+					t.Fatalf("protocol=openai: expected *openai_sdk.Provider, got %T", provider)
+				}
+				return
+			}
+			if tt.protocol == "gemini" {
+				if _, ok := provider.(*gemini_sdk.Provider); !ok {
+					t.Fatalf("protocol=gemini: expected *gemini_sdk.Provider, got %T", provider)
+				}
+				return
+			}
+
+			// Non-openai/non-gemini protocols remain on HTTPProvider.
 			if _, ok := provider.(*HTTPProvider); !ok {
-				t.Fatalf("expected *HTTPProvider, got %T", provider)
+				t.Fatalf("protocol=%s: expected *HTTPProvider, got %T", tt.protocol, provider)
 			}
 		})
 	}
@@ -172,6 +191,40 @@ func TestCreateProviderFromConfig_Antigravity(t *testing.T) {
 	}
 }
 
+func TestCreateProviderFromConfig_GeminiAndGoogle(t *testing.T) {
+	tests := []struct {
+		name  string
+		model string
+	}{
+		{name: "gemini protocol", model: "gemini/gemini-2.5-flash"},
+		{name: "google protocol", model: "google/gemini-2.5-flash"},
+	}
+
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			cfg := &config.ModelConfig{
+				ModelName: "test-gemini",
+				Model:     tt.model,
+				APIKey:    "test-key",
+			}
+
+			provider, modelID, err := CreateProviderFromConfig(cfg)
+			if err != nil {
+				t.Fatalf("CreateProviderFromConfig() error = %v", err)
+			}
+			if provider == nil {
+				t.Fatal("CreateProviderFromConfig() returned nil provider")
+			}
+			if _, ok := provider.(*gemini_sdk.Provider); !ok {
+				t.Fatalf("expected *gemini_sdk.Provider, got %T", provider)
+			}
+			if modelID != "gemini-2.5-flash" {
+				t.Errorf("modelID = %q, want %q", modelID, "gemini-2.5-flash")
+			}
+		})
+	}
+}
+
 func TestCreateProviderFromConfig_ClaudeCLI(t *testing.T) {
 	cfg := &config.ModelConfig{
 		ModelName: "test-claude-cli",
diff --git a/pkg/providers/gemini_sdk/provider.go b/pkg/providers/gemini_sdk/provider.go
new file mode 100644
index 0000000000..9c5d208400
--- /dev/null
+++ b/pkg/providers/gemini_sdk/provider.go
@@ -0,0 +1,612 @@
+package gemini_sdk
+
+import (
+	"context"
+	"encoding/base64"
+	"encoding/json"
+	"errors"
+	"fmt"
+	"log"
+	"net/http"
+	"net/url"
+	"regexp"
+	"strings"
+	"time"
+
+	"google.golang.org/genai"
+
+	"github.com/sipeed/picoclaw/pkg/providers/protocoltypes"
+)
+
+type (
+	ToolCall               = protocoltypes.ToolCall
+	FunctionCall           = protocoltypes.FunctionCall
+	LLMResponse            = protocoltypes.LLMResponse
+	UsageInfo              = protocoltypes.UsageInfo
+	Message                = protocoltypes.Message
+	ToolDefinition         = protocoltypes.ToolDefinition
+	ToolFunctionDefinition = protocoltypes.ToolFunctionDefinition
+	ExtraContent           = protocoltypes.ExtraContent
+	GoogleExtra            = protocoltypes.GoogleExtra
+)
+
+const (
+	defaultModel          = "gemini-2.5-flash"
+	defaultRequestTimeout = 120 * time.Second
+	defaultGeminiAPIBase  = "https://generativelanguage.googleapis.com"
+)
+
+var apiVersionPattern = regexp.MustCompile(`^v[0-9]+(?:(?:alpha|beta)[0-9]*)?$`)
+
+type Provider struct {
+	apiBase    string
+	apiVersion string
+	httpClient *http.Client
+	client     *genai.Client
+	initErr    error
+}
+
+type Option func(*Provider)
+
+func WithRequestTimeout(timeout time.Duration) Option {
+	return func(p *Provider) {
+		if timeout > 0 {
+			p.httpClient.Timeout = timeout
+		}
+	}
+}
+
+func NewProvider(apiKey, apiBase, proxy string, opts ...Option) *Provider {
+	httpClient := &http.Client{Timeout: defaultRequestTimeout}
+	if proxy != "" {
+		parsed, err := url.Parse(proxy)
+		if err == nil {
+			httpClient.Transport = &http.Transport{Proxy: http.ProxyURL(parsed)}
+		} else {
+			log.Printf("gemini_sdk: invalid proxy URL %q: %v", proxy, err)
+		}
+	}
+
+	baseURL, apiVersion := normalizeAPIBase(apiBase)
+	p := &Provider{
+		apiBase:    baseURL,
+		apiVersion: apiVersion,
+		httpClient: httpClient,
+	}
+	for _, opt := range opts {
+		if opt != nil {
+			opt(p)
+		}
+	}
+
+	clientConfig := &genai.ClientConfig{
+		APIKey:     apiKey,
+		Backend:    genai.BackendGeminiAPI,
+		HTTPClient: p.httpClient,
+		HTTPOptions: genai.HTTPOptions{
+			BaseURL: p.apiBase,
+		},
+	}
+	if p.apiVersion != "" {
+		clientConfig.HTTPOptions.APIVersion = p.apiVersion
+	}
+
+	client, err := genai.NewClient(context.Background(), clientConfig)
+	if err != nil {
+		p.initErr = err
+		log.Printf("gemini_sdk: failed to initialize genai client: %v", err)
+		return p
+	}
+	p.client = client
+	return p
+}
+
+func (p *Provider) GetDefaultModel() string {
+	return defaultModel
+}
+
+func (p *Provider) Chat(
+	ctx context.Context,
+	messages []Message,
+	tools []ToolDefinition,
+	model string,
+	options map[string]any,
+) (*LLMResponse, error) {
+	if p.initErr != nil {
+		return nil, fmt.Errorf("failed to initialize Gemini SDK client: %w", p.initErr)
+	}
+	if p.client == nil {
+		return nil, fmt.Errorf("Gemini SDK client not initialized")
+	}
+
+	contents, systemInstruction := buildGeminiContents(messages)
+	config := &genai.GenerateContentConfig{}
+	hasConfig := false
+	if systemInstruction != nil {
+		config.SystemInstruction = systemInstruction
+		hasConfig = true
+	}
+	if mappedTools := buildGeminiTools(tools); len(mappedTools) > 0 {
+		config.Tools = mappedTools
+		hasConfig = true
+	}
+	if applyOptions(config, options) {
+		hasConfig = true
+	}
+	if !hasConfig {
+		config = nil
+	}
+
+	resp, err := p.client.Models.GenerateContent(ctx, normalizeModel(model), contents, config)
+	if err != nil {
+		var apiErr genai.APIError
+		if errors.As(err, &apiErr) {
+			return nil, fmt.Errorf(
+				"Gemini API request failed (status=%d): %s",
+				apiErr.Code,
+				strings.TrimSpace(apiErr.Message),
+			)
+		}
+		return nil, fmt.Errorf("Gemini API request failed: %w", err)
+	}
+	if resp == nil || len(resp.Candidates) == 0 {
+		return &LLMResponse{
+			Content:      "",
+			FinishReason: "stop",
+			Usage:        mapUsage(resp),
+		}, nil
+	}
+
+	choice := resp.Candidates[0]
+	content, toolCalls := parseCandidate(choice)
+	return &LLMResponse{
+		Content:      content,
+		ToolCalls:    toolCalls,
+		FinishReason: mapFinishReason(choice.FinishReason, len(toolCalls) > 0),
+		Usage:        mapUsage(resp),
+	}, nil
+}
+
+func normalizeAPIBase(apiBase string) (string, string) {
+	base := strings.TrimRight(strings.TrimSpace(apiBase), "/")
+	if base == "" {
+		return defaultGeminiAPIBase, ""
+	}
+
+	parsed, err := url.Parse(base)
+	if err != nil {
+		return base, ""
+	}
+
+	path := strings.Trim(parsed.Path, "/")
+	if path == "" {
+		return strings.TrimRight(parsed.String(), "/"), ""
+	}
+
+	parts := strings.Split(path, "/")
+	version := parts[len(parts)-1]
+	if !apiVersionPattern.MatchString(version) {
+		return strings.TrimRight(parsed.String(), "/"), ""
+	}
+
+	parts = parts[:len(parts)-1]
+	if len(parts) == 0 {
+		parsed.Path = ""
+	} else {
+		parsed.Path = "/" + strings.Join(parts, "/")
+	}
+
+	return strings.TrimRight(parsed.String(), "/"), version
+}
+
+func normalizeModel(model string) string {
+	trimmed := strings.TrimSpace(model)
+	lower := strings.ToLower(trimmed)
+	switch {
+	case strings.HasPrefix(lower, "gemini/"):
+		return trimmed[len("gemini/"):]
+	case strings.HasPrefix(lower, "google/"):
+		return trimmed[len("google/"):]
+	default:
+		return trimmed
+	}
+}
+
+func buildGeminiContents(messages []Message) ([]*genai.Content, *genai.Content) {
+	contents := make([]*genai.Content, 0, len(messages))
+	systemTexts := make([]string, 0)
+	toolCallNames := make(map[string]string)
+
+	for _, msg := range messages {
+		switch msg.Role {
+		case "system":
+			systemText := extractSystemText(msg)
+			if systemText != "" {
+				systemTexts = append(systemTexts, systemText)
+			}
+		case "assistant":
+			modelContent := &genai.Content{Role: string(genai.RoleModel)}
+			if msg.Content != "" {
+				modelContent.Parts = append(modelContent.Parts, genai.NewPartFromText(msg.Content))
+			}
+			for _, tc := range msg.ToolCalls {
+				name, args, thoughtSignature := normalizeStoredToolCall(tc)
+				if name == "" {
+					continue
+				}
+				if tc.ID != "" {
+					toolCallNames[tc.ID] = name
+				}
+				part := genai.NewPartFromFunctionCall(name, args)
+				if len(thoughtSignature) > 0 {
+					part.ThoughtSignature = thoughtSignature
+				}
+				modelContent.Parts = append(modelContent.Parts, part)
+			}
+			if len(modelContent.Parts) > 0 {
+				contents = append(contents, modelContent)
+			}
+		case "tool":
+			contents = appendToolResponseContent(contents, msg.ToolCallID, msg.Content, toolCallNames)
+		case "user":
+			if msg.ToolCallID != "" {
+				contents = appendToolResponseContent(contents, msg.ToolCallID, msg.Content, toolCallNames)
+			} else if msg.Content != "" {
+				contents = append(contents, genai.NewContentFromText(msg.Content, genai.RoleUser))
+			}
+		default:
+			if msg.Content != "" {
+				contents = append(contents, genai.NewContentFromText(msg.Content, genai.RoleUser))
+			}
+		}
+	}
+
+	var systemInstruction *genai.Content
+	if len(systemTexts) > 0 {
+		systemInstruction = &genai.Content{
+			Parts: []*genai.Part{
+				genai.NewPartFromText(strings.Join(systemTexts, "\n\n")),
+			},
+		}
+	}
+
+	return contents, systemInstruction
+}
+
+func extractSystemText(msg Message) string {
+	if strings.TrimSpace(msg.Content) != "" {
+		return msg.Content
+	}
+	if len(msg.SystemParts) == 0 {
+		return ""
+	}
+
+	parts := make([]string, 0, len(msg.SystemParts))
+	for _, part := range msg.SystemParts {
+		if strings.TrimSpace(part.Text) == "" {
+			continue
+		}
+		parts = append(parts, part.Text)
+	}
+	return strings.Join(parts, "\n\n")
+}
+
+func appendToolResponseContent(
+	contents []*genai.Content,
+	toolCallID string,
+	content string,
+	toolCallNames map[string]string,
+) []*genai.Content {
+	toolName := resolveToolResponseName(toolCallID, toolCallNames)
+	if toolName == "" {
+		return contents
+	}
+	resp := map[string]any{"result": content}
+	contents = append(contents, genai.NewContentFromFunctionResponse(toolName, resp, genai.RoleUser))
+	return contents
+}
+
+func normalizeStoredToolCall(tc ToolCall) (string, map[string]any, []byte) {
+	name := tc.Name
+	if name == "" && tc.Function != nil {
+		name = tc.Function.Name
+	}
+
+	args := tc.Arguments
+	if args == nil {
+		args = map[string]any{}
+	}
+	if len(args) == 0 && tc.Function != nil && tc.Function.Arguments != "" {
+		var parsed map[string]any
+		if err := json.Unmarshal([]byte(tc.Function.Arguments), &parsed); err == nil && parsed != nil {
+			args = parsed
+		}
+	}
+
+	return name, args, decodeThoughtSignature(extractStoredThoughtSignature(tc))
+}
+
+func extractStoredThoughtSignature(tc ToolCall) string {
+	if tc.ExtraContent != nil && tc.ExtraContent.Google != nil && tc.ExtraContent.Google.ThoughtSignature != "" {
+		return tc.ExtraContent.Google.ThoughtSignature
+	}
+	if tc.Function != nil && tc.Function.ThoughtSignature != "" {
+		return tc.Function.ThoughtSignature
+	}
+	if tc.ThoughtSignature != "" {
+		return tc.ThoughtSignature
+	}
+	return ""
+}
+
+func decodeThoughtSignature(s string) []byte {
+	if strings.TrimSpace(s) == "" {
+		return nil
+	}
+	if b, err := base64.StdEncoding.DecodeString(s); err == nil {
+		return b
+	}
+	if b, err := base64.RawStdEncoding.DecodeString(s); err == nil {
+		return b
+	}
+	return []byte(s)
+}
+
+func encodeThoughtSignature(sig []byte) string {
+	if len(sig) == 0 {
+		return ""
+	}
+	return base64.StdEncoding.EncodeToString(sig)
+}
+
+func resolveToolResponseName(toolCallID string, toolCallNames map[string]string) string {
+	if toolCallID == "" {
+		return ""
+	}
+	if name, ok := toolCallNames[toolCallID]; ok && name != "" {
+		return name
+	}
+	return inferToolNameFromCallID(toolCallID)
+}
+
+func inferToolNameFromCallID(toolCallID string) string {
+	if !strings.HasPrefix(toolCallID, "call_") {
+		return toolCallID
+	}
+
+	rest := strings.TrimPrefix(toolCallID, "call_")
+	if idx := strings.LastIndex(rest, "_"); idx > 0 {
+		candidate := rest[:idx]
+		if candidate != "" {
+			return candidate
+		}
+	}
+	return toolCallID
+}
+
+func buildGeminiTools(tools []ToolDefinition) []*genai.Tool {
+	declarations := make([]*genai.FunctionDeclaration, 0, len(tools))
+	for _, tool := range tools {
+		if tool.Type != "function" || tool.Function.Name == "" {
+			continue
+		}
+		decl := &genai.FunctionDeclaration{
+			Name:        tool.Function.Name,
+			Description: tool.Function.Description,
+		}
+		if len(tool.Function.Parameters) > 0 {
+			decl.ParametersJsonSchema = sanitizeSchemaForGemini(tool.Function.Parameters)
+		}
+		declarations = append(declarations, decl)
+	}
+	if len(declarations) == 0 {
+		return nil
+	}
+	return []*genai.Tool{
+		{
+			FunctionDeclarations: declarations,
+		},
+	}
+}
+
+func parseCandidate(candidate *genai.Candidate) (string, []ToolCall) {
+	if candidate == nil || candidate.Content == nil || len(candidate.Content.Parts) == 0 {
+		return "", nil
+	}
+
+	textParts := make([]string, 0, len(candidate.Content.Parts))
+	toolCalls := make([]ToolCall, 0)
+	for _, part := range candidate.Content.Parts {
+		if part == nil {
+			continue
+		}
+		if part.Text != "" {
+			textParts = append(textParts, part.Text)
+		}
+		if part.FunctionCall == nil {
+			continue
+		}
+
+		name := part.FunctionCall.Name
+		args := part.FunctionCall.Args
+		if args == nil {
+			args = map[string]any{}
+		}
+		argsJSON, err := json.Marshal(args)
+		if err != nil {
+			argsJSON = []byte("{}")
+		}
+
+		thoughtSignature := encodeThoughtSignature(part.ThoughtSignature)
+		toolCall := ToolCall{
+			ID:        part.FunctionCall.ID,
+			Type:      "function",
+			Name:      name,
+			Arguments: args,
+			Function: &FunctionCall{
+				Name:             name,
+				Arguments:        string(argsJSON),
+				ThoughtSignature: thoughtSignature,
+			},
+			ThoughtSignature: thoughtSignature,
+		}
+		if toolCall.ID == "" {
+			toolCall.ID = fmt.Sprintf("call_%s_%d", name, time.Now().UnixNano())
+		}
+		if thoughtSignature != "" {
+			toolCall.ExtraContent = &ExtraContent{
+				Google: &GoogleExtra{
+					ThoughtSignature: thoughtSignature,
+				},
+			}
+		}
+		toolCalls = append(toolCalls, toolCall)
+	}
+
+	return strings.Join(textParts, ""), toolCalls
+}
+
+func mapFinishReason(reason genai.FinishReason, hasToolCalls bool) string {
+	if hasToolCalls {
+		return "tool_calls"
+	}
+	if reason == genai.FinishReasonMaxTokens {
+		return "length"
+	}
+	return "stop"
+}
+
+func mapUsage(resp *genai.GenerateContentResponse) *UsageInfo {
+	if resp == nil || resp.UsageMetadata == nil {
+		return nil
+	}
+	usage := resp.UsageMetadata
+	if usage.PromptTokenCount == 0 && usage.CandidatesTokenCount == 0 && usage.TotalTokenCount == 0 {
+		return nil
+	}
+	return &UsageInfo{
+		PromptTokens:     int(usage.PromptTokenCount),
+		CompletionTokens: int(usage.CandidatesTokenCount),
+		TotalTokens:      int(usage.TotalTokenCount),
+	}
+}
+
+func applyOptions(config *genai.GenerateContentConfig, options map[string]any) bool {
+	if config == nil || options == nil {
+		return false
+	}
+
+	changed := false
+	if maxTokens, ok := asInt(options["max_tokens"]); ok && maxTokens > 0 {
+		config.MaxOutputTokens = int32(maxTokens)
+		changed = true
+	}
+	if temperature, ok := asFloat(options["temperature"]); ok {
+		temp := float32(temperature)
+		config.Temperature = &temp
+		changed = true
+	}
+	return changed
+}
+
+func asInt(v any) (int, bool) {
+	switch x := v.(type) {
+	case int:
+		return x, true
+	case int8:
+		return int(x), true
+	case int16:
+		return int(x), true
+	case int32:
+		return int(x), true
+	case int64:
+		return int(x), true
+	case float64:
+		return int(x), true
+	case float32:
+		return int(x), true
+	default:
+		return 0, false
+	}
+}
+
+func asFloat(v any) (float64, bool) {
+	switch x := v.(type) {
+	case float64:
+		return x, true
+	case float32:
+		return float64(x), true
+	case int:
+		return float64(x), true
+	case int8:
+		return float64(x), true
+	case int16:
+		return float64(x), true
+	case int32:
+		return float64(x), true
+	case int64:
+		return float64(x), true
+	default:
+		return 0, false
+	}
+}
+
+var geminiUnsupportedKeywords = map[string]bool{
+	"patternProperties":    true,
+	"additionalProperties": true,
+	"$schema":              true,
+	"$id":                  true,
+	"$ref":                 true,
+	"$defs":                true,
+	"definitions":          true,
+	"examples":             true,
+	"minLength":            true,
+	"maxLength":            true,
+	"minimum":              true,
+	"maximum":              true,
+	"multipleOf":           true,
+	"pattern":              true,
+	"format":               true,
+	"minItems":             true,
+	"maxItems":             true,
+	"uniqueItems":          true,
+	"minProperties":        true,
+	"maxProperties":        true,
+}
+
+func sanitizeSchemaForGemini(schema map[string]any) map[string]any {
+	if schema == nil {
+		return nil
+	}
+
+	result := make(map[string]any)
+	for k, v := range schema {
+		if geminiUnsupportedKeywords[k] {
+			continue
+		}
+		switch val := v.(type) {
+		case map[string]any:
+			result[k] = sanitizeSchemaForGemini(val)
+		case []any:
+			sanitized := make([]any, len(val))
+			for i, item := range val {
+				if m, ok := item.(map[string]any); ok {
+					sanitized[i] = sanitizeSchemaForGemini(m)
+				} else {
+					sanitized[i] = item
+				}
+			}
+			result[k] = sanitized
+		default:
+			result[k] = v
+		}
+	}
+
+	if _, hasProps := result["properties"]; hasProps {
+		if _, hasType := result["type"]; !hasType {
+			result["type"] = "object"
+		}
+	}
+
+	return result
+}
diff --git a/pkg/providers/gemini_sdk/provider_test.go b/pkg/providers/gemini_sdk/provider_test.go
new file mode 100644
index 0000000000..fa6a1e8190
--- /dev/null
+++ b/pkg/providers/gemini_sdk/provider_test.go
@@ -0,0 +1,404 @@
+package gemini_sdk
+
+import (
+	"encoding/json"
+	"net/http"
+	"net/http/httptest"
+	"strings"
+	"testing"
+	"time"
+)
+
+func TestGeminiSDKProvider_Chat_BasicContentAndOptions(t *testing.T) {
+	var (
+		requestPath string
+		requestBody map[string]any
+	)
+
+	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		requestPath = r.URL.Path
+		if err := json.NewDecoder(r.Body).Decode(&requestBody); err != nil {
+			t.Fatalf("decode request body: %v", err)
+		}
+
+		w.Header().Set("Content-Type", "application/json")
+		_, _ = w.Write([]byte(`{
+			"candidates":[
+				{
+					"content":{"parts":[{"text":"hello from gemini"}],"role":"model"},
+					"finishReason":"STOP"
+				}
+			],
+			"usageMetadata":{"promptTokenCount":12,"candidatesTokenCount":3,"totalTokenCount":15}
+		}`))
+	}))
+	defer server.Close()
+
+	p := NewProvider("test-key", server.URL, "")
+	resp, err := p.Chat(
+		t.Context(),
+		[]Message{{Role: "user", Content: "hi"}},
+		nil,
+		"google/gemini-2.5-flash",
+		map[string]any{
+			"max_tokens":       123,
+			"temperature":      0.2,
+			"prompt_cache_key": "ignored-on-gemini",
+		},
+	)
+	if err != nil {
+		t.Fatalf("Chat() error = %v", err)
+	}
+
+	if requestPath != "/v1beta/models/gemini-2.5-flash:generateContent" {
+		t.Fatalf("path = %q, want /v1beta/models/gemini-2.5-flash:generateContent", requestPath)
+	}
+	if got := readNestedNumber(requestBody, "generationConfig", "maxOutputTokens"); got != 123 {
+		t.Fatalf("generationConfig.maxOutputTokens = %v, want 123", got)
+	}
+	if got := readNestedNumber(requestBody, "generationConfig", "temperature"); got != 0.2 {
+		t.Fatalf("generationConfig.temperature = %v, want 0.2", got)
+	}
+	if _, ok := requestBody["prompt_cache_key"]; ok {
+		t.Fatalf("did not expect prompt_cache_key in Gemini request body")
+	}
+
+	if resp.Content != "hello from gemini" {
+		t.Fatalf("Content = %q, want %q", resp.Content, "hello from gemini")
+	}
+	if resp.FinishReason != "stop" {
+		t.Fatalf("FinishReason = %q, want %q", resp.FinishReason, "stop")
+	}
+	if resp.Usage == nil || resp.Usage.TotalTokens != 15 {
+		t.Fatalf("Usage.TotalTokens = %+v, want 15", resp.Usage)
+	}
+}
+
+func TestGeminiSDKProvider_Chat_ParsesToolCallsAndThoughtSignature(t *testing.T) {
+	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		w.Header().Set("Content-Type", "application/json")
+		_, _ = w.Write([]byte(`{
+			"candidates":[
+				{
+					"content":{
+						"parts":[
+							{
+								"functionCall":{"name":"sum","args":{"a":1,"b":2}},
+								"thoughtSignature":"YWJj"
+							}
+						],
+						"role":"model"
+					},
+					"finishReason":"STOP"
+				}
+			]
+		}`))
+	}))
+	defer server.Close()
+
+	p := NewProvider("test-key", server.URL, "")
+	resp, err := p.Chat(
+		t.Context(),
+		[]Message{{Role: "user", Content: "hi"}},
+		nil,
+		"gemini-2.5-flash",
+		nil,
+	)
+	if err != nil {
+		t.Fatalf("Chat() error = %v", err)
+	}
+	if resp.FinishReason != "tool_calls" {
+		t.Fatalf("FinishReason = %q, want %q", resp.FinishReason, "tool_calls")
+	}
+	if len(resp.ToolCalls) != 1 {
+		t.Fatalf("len(ToolCalls) = %d, want 1", len(resp.ToolCalls))
+	}
+
+	tc := resp.ToolCalls[0]
+	if tc.Name != "sum" {
+		t.Fatalf("ToolCalls[0].Name = %q, want sum", tc.Name)
+	}
+	if tc.Arguments["a"] != float64(1) {
+		t.Fatalf("ToolCalls[0].Arguments = %#v", tc.Arguments)
+	}
+	if tc.Function == nil {
+		t.Fatalf("ToolCalls[0].Function
= nil, want non-nil") + } + if tc.Function.ThoughtSignature != "YWJj" { + t.Fatalf("Function.ThoughtSignature = %q, want %q", tc.Function.ThoughtSignature, "YWJj") + } + if tc.ExtraContent == nil || tc.ExtraContent.Google == nil { + t.Fatalf("ExtraContent.Google = %#v, want non-nil", tc.ExtraContent) + } + if tc.ExtraContent.Google.ThoughtSignature != "YWJj" { + t.Fatalf("ExtraContent.Google.ThoughtSignature = %q, want %q", tc.ExtraContent.Google.ThoughtSignature, "YWJj") + } +} + +func TestGeminiSDKProvider_Chat_HistoryThoughtSignaturePreference(t *testing.T) { + var requestBody map[string]any + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + if err := json.NewDecoder(r.Body).Decode(&requestBody); err != nil { + t.Fatalf("decode request body: %v", err) + } + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "candidates":[ + { + "content":{"parts":[{"text":"ok"}],"role":"model"}, + "finishReason":"STOP" + } + ] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + _, err := p.Chat( + t.Context(), + []Message{ + { + Role: "assistant", + ToolCalls: []ToolCall{ + { + ID: "call_1", + Name: "sum", + Arguments: map[string]any{"a": 1}, + Function: &FunctionCall{ + Name: "sum", + Arguments: `{"a":1}`, + ThoughtSignature: "ZnVuY3Rpb24=", + }, + ExtraContent: &ExtraContent{ + Google: &GoogleExtra{ + ThoughtSignature: "ZXh0cmE=", + }, + }, + }, + }, + }, + { + Role: "tool", + ToolCallID: "call_1", + Content: `{"result": 1}`, + }, + { + Role: "user", + Content: "continue", + }, + }, + nil, + "gemini-2.5-flash", + nil, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + + contents, ok := requestBody["contents"].([]any) + if !ok { + t.Fatalf("contents type = %T, want []any", requestBody["contents"]) + } + if len(contents) < 1 { + t.Fatalf("contents length = %d, want >= 1", len(contents)) + } + + assistant := contents[0].(map[string]any) + parts, ok := 
assistant["parts"].([]any) + if !ok || len(parts) == 0 { + t.Fatalf("assistant parts = %#v, want non-empty", assistant["parts"]) + } + firstPart := parts[0].(map[string]any) + if got := stringValue(firstPart, "thoughtSignature"); got != "ZXh0cmE=" { + t.Fatalf("thoughtSignature = %q, want extra_content value %q", got, "ZXh0cmE=") + } +} + +func TestGeminiSDKProvider_Chat_HistoryThoughtSignatureFallbackToFunction(t *testing.T) { + var requestBody map[string]any + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + if err := json.NewDecoder(r.Body).Decode(&requestBody); err != nil { + t.Fatalf("decode request body: %v", err) + } + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "candidates":[ + { + "content":{"parts":[{"text":"ok"}],"role":"model"}, + "finishReason":"STOP" + } + ] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + _, err := p.Chat( + t.Context(), + []Message{ + { + Role: "assistant", + ToolCalls: []ToolCall{ + { + ID: "call_1", + Name: "sum", + Arguments: map[string]any{"a": 1}, + Function: &FunctionCall{ + Name: "sum", + Arguments: `{"a":1}`, + ThoughtSignature: "ZnVuY3Rpb24=", + }, + }, + }, + }, + { + Role: "tool", + ToolCallID: "call_1", + Content: `{"result": 1}`, + }, + { + Role: "user", + Content: "continue", + }, + }, + nil, + "gemini-2.5-flash", + nil, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + + contents, ok := requestBody["contents"].([]any) + if !ok { + t.Fatalf("contents type = %T, want []any", requestBody["contents"]) + } + if len(contents) < 1 { + t.Fatalf("contents length = %d, want >= 1", len(contents)) + } + + assistant := contents[0].(map[string]any) + parts, ok := assistant["parts"].([]any) + if !ok || len(parts) == 0 { + t.Fatalf("assistant parts = %#v, want non-empty", assistant["parts"]) + } + firstPart := parts[0].(map[string]any) + if got := stringValue(firstPart, "thoughtSignature"); got != 
"ZnVuY3Rpb24=" { + t.Fatalf("thoughtSignature = %q, want function value %q", got, "ZnVuY3Rpb24=") + } +} + +func TestGeminiSDKProvider_Chat_APIBaseWithVersionPath(t *testing.T) { + var requestPath string + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + requestPath = r.URL.Path + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "candidates":[ + { + "content":{"parts":[{"text":"ok"}],"role":"model"}, + "finishReason":"STOP" + } + ] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL+"/v1beta", "") + _, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gemini-2.5-flash", + nil, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + if requestPath != "/v1beta/models/gemini-2.5-flash:generateContent" { + t.Fatalf("path = %q, want /v1beta/models/gemini-2.5-flash:generateContent", requestPath) + } +} + +func TestGeminiSDKProvider_ProxyConfig(t *testing.T) { + p := NewProvider("test-key", "http://example.com", "http://127.0.0.1:10080") + transport, ok := p.httpClient.Transport.(*http.Transport) + if !ok || transport.Proxy == nil { + t.Fatalf("expected proxy transport, got %#v", p.httpClient.Transport) + } +} + +func TestGeminiSDKProvider_Chat_Timeout(t *testing.T) { + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + time.Sleep(700 * time.Millisecond) + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "candidates":[ + { + "content":{"parts":[{"text":"ok"}],"role":"model"}, + "finishReason":"STOP" + } + ] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "", WithRequestTimeout(200*time.Millisecond)) + _, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gemini-2.5-flash", + nil, + ) + if err == nil { + t.Fatalf("Chat() error = nil, want timeout") + } + if !strings.Contains(err.Error(), 
"timeout") && + !strings.Contains(err.Error(), "deadline exceeded") && + !strings.Contains(err.Error(), "Client.Timeout exceeded") { + t.Fatalf("timeout error = %q", err.Error()) + } +} + +func TestGeminiSDKProvider_Chat_HTTPError(t *testing.T) { + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + w.Header().Set("Content-Type", "application/json") + w.WriteHeader(http.StatusBadRequest) + _, _ = w.Write([]byte(`{"error":{"code":400,"message":"bad request","status":"INVALID_ARGUMENT"}}`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + _, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gemini-2.5-flash", + nil, + ) + if err == nil { + t.Fatal("Chat() expected error") + } + if !strings.Contains(err.Error(), "status=400") { + t.Fatalf("error = %q, want status=400", err.Error()) + } +} + +func readNestedNumber(v map[string]any, keys ...string) float64 { + current := any(v) + for _, key := range keys { + m, ok := current.(map[string]any) + if !ok { + return 0 + } + current = m[key] + } + num, _ := current.(float64) + return num +} + +func stringValue(v map[string]any, key string) string { + s, _ := v[key].(string) + return s +} diff --git a/pkg/providers/openai_compat/provider.go b/pkg/providers/openai_compat/provider.go index 5dab9b03ee..4555c42673 100644 --- a/pkg/providers/openai_compat/provider.go +++ b/pkg/providers/openai_compat/provider.go @@ -150,16 +150,10 @@ func (p *Provider) Chat( } } - // Prompt caching: pass a stable cache key so OpenAI can bucket requests - // with the same key and reuse prefix KV cache across calls. - // The key is typically the agent ID — stable per agent, shared across requests. - // See: https://platform.openai.com/docs/guides/prompt-caching - // Prompt caching is only supported by OpenAI-native endpoints. - // Gemini and other providers reject unknown fields, so skip for non-OpenAI APIs. 
+ // Prompt caching: pass a stable cache key so compatible endpoints that + // support this field can reuse prefix KV cache across calls. if cacheKey, ok := options["prompt_cache_key"].(string); ok && cacheKey != "" { - if !strings.Contains(p.apiBase, "generativelanguage.googleapis.com") { - requestBody["prompt_cache_key"] = cacheKey - } + requestBody["prompt_cache_key"] = cacheKey } jsonData, err := json.Marshal(requestBody) @@ -289,10 +283,11 @@ func parseResponse(body []byte) (*LLMResponse, error) { // It mirrors protocoltypes.Message but omits SystemParts, which is an // internal field that would be unknown to third-party endpoints. type openaiMessage struct { - Role string `json:"role"` - Content string `json:"content"` - ToolCalls []ToolCall `json:"tool_calls,omitempty"` - ToolCallID string `json:"tool_call_id,omitempty"` + Role string `json:"role"` + Content string `json:"content"` + ReasoningContent string `json:"reasoning_content,omitempty"` + ToolCalls []ToolCall `json:"tool_calls,omitempty"` + ToolCallID string `json:"tool_call_id,omitempty"` } // stripSystemParts converts []Message to []openaiMessage, dropping the @@ -302,10 +297,11 @@ func stripSystemParts(messages []Message) []openaiMessage { out := make([]openaiMessage, len(messages)) for i, m := range messages { out[i] = openaiMessage{ - Role: m.Role, - Content: m.Content, - ToolCalls: m.ToolCalls, - ToolCallID: m.ToolCallID, + Role: m.Role, + Content: m.Content, + ReasoningContent: m.ReasoningContent, + ToolCalls: m.ToolCalls, + ToolCallID: m.ToolCallID, } } return out @@ -323,7 +319,7 @@ func normalizeModel(model, apiBase string) string { prefix := strings.ToLower(model[:idx]) switch prefix { - case "moonshot", "nvidia", "groq", "ollama", "deepseek", "google", "openrouter", "zhipu", "mistral": + case "moonshot", "nvidia", "groq", "ollama", "deepseek", "openrouter", "zhipu", "mistral": return model[idx+1:] default: return model diff --git a/pkg/providers/openai_sdk/provider.go 
b/pkg/providers/openai_sdk/provider.go new file mode 100644 index 0000000000..404b23fc68 --- /dev/null +++ b/pkg/providers/openai_sdk/provider.go @@ -0,0 +1,313 @@ +package openai_sdk + +import ( + "context" + "encoding/json" + "errors" + "fmt" + "log" + "net/http" + "net/url" + "strings" + "time" + + "github.com/openai/openai-go/v3" + "github.com/openai/openai-go/v3/option" + "github.com/openai/openai-go/v3/shared" + + "github.com/sipeed/picoclaw/pkg/providers/protocoltypes" +) + +type ( + ToolCall = protocoltypes.ToolCall + FunctionCall = protocoltypes.FunctionCall + LLMResponse = protocoltypes.LLMResponse + UsageInfo = protocoltypes.UsageInfo + Message = protocoltypes.Message + ToolDefinition = protocoltypes.ToolDefinition + ToolFunctionDefinition = protocoltypes.ToolFunctionDefinition +) + +const ( + defaultModel = "gpt-4o" + defaultRequestTimeout = 120 * time.Second +) + +type Provider struct { + apiBase string + httpClient *http.Client + client *openai.Client +} + +type Option func(*Provider) + +func WithRequestTimeout(timeout time.Duration) Option { + return func(p *Provider) { + if timeout > 0 { + p.httpClient.Timeout = timeout + } + } +} + +func NewProvider(apiKey, apiBase, proxy string, opts ...Option) *Provider { + httpClient := &http.Client{Timeout: defaultRequestTimeout} + if proxy != "" { + parsed, err := url.Parse(proxy) + if err == nil { + httpClient.Transport = &http.Transport{Proxy: http.ProxyURL(parsed)} + } else { + log.Printf("openai_sdk: invalid proxy URL %q: %v", proxy, err) + } + } + + p := &Provider{ + apiBase: strings.TrimRight(apiBase, "/"), + httpClient: httpClient, + } + for _, opt := range opts { + if opt != nil { + opt(p) + } + } + + reqOpts := []option.RequestOption{ + option.WithBaseURL(p.apiBase), + option.WithHTTPClient(p.httpClient), + } + if apiKey != "" { + reqOpts = append(reqOpts, option.WithAPIKey(apiKey)) + } + client := openai.NewClient(reqOpts...) 
+ p.client = &client + return p +} + +func (p *Provider) GetDefaultModel() string { + return defaultModel +} + +func (p *Provider) Chat( + ctx context.Context, + messages []Message, + tools []ToolDefinition, + model string, + options map[string]any, +) (*LLMResponse, error) { + if strings.TrimSpace(p.apiBase) == "" { + return nil, fmt.Errorf("API base not configured") + } + + params := openai.ChatCompletionNewParams{ + Model: normalizeModel(model), + Messages: buildChatMessages(messages), + } + + if len(tools) > 0 { + params.Tools = buildChatTools(tools) + params.ToolChoice.OfAuto = openai.String(string(openai.ChatCompletionToolChoiceOptionAutoAuto)) + } + applyOptions(¶ms, options) + + resp, err := p.client.Chat.Completions.New(ctx, params) + if err != nil { + var apiErr *openai.Error + if errors.As(err, &apiErr) { + return nil, fmt.Errorf( + "OpenAI API request failed (status=%d): %s", + apiErr.StatusCode, + strings.TrimSpace(apiErr.Message), + ) + } + return nil, fmt.Errorf("OpenAI API request failed: %w", err) + } + if resp == nil || len(resp.Choices) == 0 { + return nil, fmt.Errorf("OpenAI API returned no choices") + } + + choice := resp.Choices[0] + return &LLMResponse{ + Content: choice.Message.Content, + ToolCalls: parseChoiceToolCalls(choice.Message.ToolCalls), + FinishReason: choice.FinishReason, + Usage: mapUsage(resp.Usage), + }, nil +} + +func normalizeModel(model string) string { + trimmed := strings.TrimSpace(model) + if strings.HasPrefix(strings.ToLower(trimmed), "openai/") { + return trimmed[len("openai/"):] + } + return trimmed +} + +func buildChatMessages(messages []Message) []openai.ChatCompletionMessageParamUnion { + out := make([]openai.ChatCompletionMessageParamUnion, 0, len(messages)) + for _, msg := range messages { + switch msg.Role { + case "system": + out = append(out, openai.SystemMessage(msg.Content)) + case "assistant": + out = append(out, buildAssistantMessage(msg)) + case "tool": + out = append(out, openai.ToolMessage(msg.Content, 
msg.ToolCallID)) + case "user": + fallthrough + default: + out = append(out, openai.UserMessage(msg.Content)) + } + } + return out +} + +func buildAssistantMessage(msg Message) openai.ChatCompletionMessageParamUnion { + assistant := openai.ChatCompletionAssistantMessageParam{} + if msg.Content != "" { + assistant.Content.OfString = openai.String(msg.Content) + } + if len(msg.ToolCalls) > 0 { + assistant.ToolCalls = make([]openai.ChatCompletionMessageToolCallUnionParam, 0, len(msg.ToolCalls)) + for _, tc := range msg.ToolCalls { + name := tc.Name + if name == "" && tc.Function != nil { + name = tc.Function.Name + } + if name == "" { + continue + } + args := "{}" + if len(tc.Arguments) > 0 { + if b, err := json.Marshal(tc.Arguments); err == nil { + args = string(b) + } + } + assistant.ToolCalls = append(assistant.ToolCalls, openai.ChatCompletionMessageToolCallUnionParam{ + OfFunction: &openai.ChatCompletionMessageFunctionToolCallParam{ + ID: tc.ID, + Function: openai.ChatCompletionMessageFunctionToolCallFunctionParam{ + Name: name, + Arguments: args, + }, + }, + }) + } + } + return openai.ChatCompletionMessageParamUnion{OfAssistant: &assistant} +} + +func buildChatTools(tools []ToolDefinition) []openai.ChatCompletionToolUnionParam { + out := make([]openai.ChatCompletionToolUnionParam, 0, len(tools)) + for _, tool := range tools { + if tool.Function.Name == "" { + continue + } + fn := shared.FunctionDefinitionParam{ + Name: tool.Function.Name, + Description: openai.String(tool.Function.Description), + Parameters: shared.FunctionParameters(tool.Function.Parameters), + } + out = append(out, openai.ChatCompletionFunctionTool(fn)) + } + return out +} + +func parseChoiceToolCalls(calls []openai.ChatCompletionMessageToolCallUnion) []ToolCall { + if len(calls) == 0 { + return nil + } + + result := make([]ToolCall, 0, len(calls)) + for _, call := range calls { + switch v := call.AsAny().(type) { + case openai.ChatCompletionMessageFunctionToolCall: + args := map[string]any{} + 
if strings.TrimSpace(v.Function.Arguments) != "" { + if err := json.Unmarshal([]byte(v.Function.Arguments), &args); err != nil { + log.Printf("openai_sdk: failed to decode tool call arguments for %q: %v", v.Function.Name, err) + } + } + result = append(result, ToolCall{ + ID: v.ID, + Type: "function", + Function: &FunctionCall{ + Name: v.Function.Name, + Arguments: v.Function.Arguments, + }, + Name: v.Function.Name, + Arguments: args, + }) + } + } + return result +} + +func applyOptions( + params *openai.ChatCompletionNewParams, + options map[string]any, +) { + if params == nil || options == nil { + return + } + // Guard against non-positive max_tokens, mirroring gemini_sdk.applyOptions: + // forwarding 0 as max_completion_tokens would be rejected by the endpoint. + if maxTokens, ok := asInt(options["max_tokens"]); ok && maxTokens > 0 { + params.MaxCompletionTokens = openai.Opt(int64(maxTokens)) + } + if temp, ok := asFloat(options["temperature"]); ok { + params.Temperature = openai.Opt(temp) + } + if cacheKey, ok := options["prompt_cache_key"].(string); ok && strings.TrimSpace(cacheKey) != "" { + params.PromptCacheKey = openai.String(cacheKey) + } +} + +func mapUsage(usage openai.CompletionUsage) *UsageInfo { + if usage.TotalTokens == 0 && usage.PromptTokens == 0 && usage.CompletionTokens == 0 { + return nil + } + return &UsageInfo{ + PromptTokens: int(usage.PromptTokens), + CompletionTokens: int(usage.CompletionTokens), + TotalTokens: int(usage.TotalTokens), + } +} + +func asInt(v any) (int, bool) { + switch x := v.(type) { + case int: + return x, true + case int8: + return int(x), true + case int16: + return int(x), true + case int32: + return int(x), true + case int64: + return int(x), true + case float64: + return int(x), true + case float32: + return int(x), true + default: + return 0, false + } +} + +func asFloat(v any) (float64, bool) { + switch x := v.(type) { + case float64: + return x, true + case float32: + return float64(x), true + case int: + return float64(x), true + case int8: + return float64(x), true + case int16: + return float64(x), true + case int32: + return float64(x), true + case int64: + return float64(x), true
+ default: + return 0, false + } +} diff --git a/pkg/providers/openai_sdk/provider_test.go b/pkg/providers/openai_sdk/provider_test.go new file mode 100644 index 0000000000..2f848dfc74 --- /dev/null +++ b/pkg/providers/openai_sdk/provider_test.go @@ -0,0 +1,306 @@ +package openai_sdk + +import ( + "encoding/json" + "net/http" + "net/http/httptest" + "strings" + "testing" + "time" +) + +func TestOpenAISDKProvider_Chat_BasicContent(t *testing.T) { + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + if r.URL.Path != "/chat/completions" { + t.Fatalf("path = %q, want /chat/completions", r.URL.Path) + } + + var body map[string]any + if err := json.NewDecoder(r.Body).Decode(&body); err != nil { + t.Fatalf("decode request body: %v", err) + } + if body["model"] != "gpt-4o" { + t.Fatalf("request model = %v, want gpt-4o", body["model"]) + } + + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "id":"chatcmpl-123", + "object":"chat.completion", + "created":1, + "model":"gpt-4o", + "choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"hello"}}], + "usage":{"prompt_tokens":10,"completion_tokens":2,"total_tokens":12} + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + resp, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gpt-4o", + nil, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + if resp.Content != "hello" { + t.Fatalf("Content = %q, want %q", resp.Content, "hello") + } + if resp.FinishReason != "stop" { + t.Fatalf("FinishReason = %q, want %q", resp.FinishReason, "stop") + } + if resp.Usage == nil || resp.Usage.TotalTokens != 12 { + t.Fatalf("Usage.TotalTokens = %+v, want 12", resp.Usage) + } +} + +func TestOpenAISDKProvider_Chat_MessageAndToolMapping(t *testing.T) { + var body map[string]any + + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r 
*http.Request) { + if err := json.NewDecoder(r.Body).Decode(&body); err != nil { + t.Fatalf("decode body: %v", err) + } + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "id":"chatcmpl-123", + "object":"chat.completion", + "created":1, + "model":"gpt-4o", + "choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"ok"}}] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + _, err := p.Chat( + t.Context(), + []Message{ + {Role: "system", Content: "sys"}, + {Role: "assistant", Content: "thinking", ToolCalls: []ToolCall{ + { + ID: "call_1", + Name: "sum", + Arguments: map[string]any{ + "a": 1, + "b": 2, + }, + }, + }}, + {Role: "tool", ToolCallID: "call_1", Content: `{"result":3}`}, + {Role: "user", Content: "hi"}, + }, + []ToolDefinition{ + { + Type: "function", + Function: ToolFunctionDefinition{ + Name: "sum", + Description: "sum two integers", + Parameters: map[string]any{ + "type": "object", + }, + }, + }, + }, + "openai/gpt-4o", + nil, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + + if body["model"] != "gpt-4o" { + t.Fatalf("request model = %v, want gpt-4o", body["model"]) + } + + msgs, ok := body["messages"].([]any) + if !ok { + t.Fatalf("messages type = %T, want []any", body["messages"]) + } + if len(msgs) != 4 { + t.Fatalf("messages length = %d, want 4", len(msgs)) + } + + assistantMsg := msgs[1].(map[string]any) + if assistantMsg["role"] != "assistant" { + t.Fatalf("assistant role = %v, want assistant", assistantMsg["role"]) + } + toolCalls, ok := assistantMsg["tool_calls"].([]any) + if !ok || len(toolCalls) != 1 { + t.Fatalf("assistant tool_calls = %#v, want len 1", assistantMsg["tool_calls"]) + } + toolMsg := msgs[2].(map[string]any) + if toolMsg["role"] != "tool" || toolMsg["tool_call_id"] != "call_1" { + t.Fatalf("tool message mismatch: %#v", toolMsg) + } + + tools, ok := body["tools"].([]any) + if !ok || len(tools) != 1 { + t.Fatalf("tools = 
%#v, want len 1", body["tools"]) + } +} + +func TestOpenAISDKProvider_Chat_ParsesResponseToolCalls(t *testing.T) { + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "id":"chatcmpl-123", + "object":"chat.completion", + "created":1, + "model":"gpt-4o", + "choices":[ + { + "index":0, + "finish_reason":"tool_calls", + "message":{ + "role":"assistant", + "content":"", + "tool_calls":[ + { + "id":"call_1", + "type":"function", + "function":{"name":"sum","arguments":"{\"a\":1,\"b\":2}"} + } + ] + } + } + ] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + resp, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gpt-4o", + nil, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + if len(resp.ToolCalls) != 1 { + t.Fatalf("ToolCalls length = %d, want 1", len(resp.ToolCalls)) + } + if resp.ToolCalls[0].Name != "sum" { + t.Fatalf("ToolCalls[0].Name = %q, want sum", resp.ToolCalls[0].Name) + } + if resp.ToolCalls[0].Arguments["a"] != float64(1) { + t.Fatalf("ToolCalls[0].Arguments = %#v", resp.ToolCalls[0].Arguments) + } +} + +func TestOpenAISDKProvider_Chat_OptionsMapping(t *testing.T) { + var body map[string]any + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + if err := json.NewDecoder(r.Body).Decode(&body); err != nil { + t.Fatalf("decode body: %v", err) + } + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "id":"chatcmpl-123", + "object":"chat.completion", + "created":1, + "model":"gpt-4o", + "choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"ok"}}] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + _, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gpt-4o", + map[string]any{ + 
"max_tokens": 123, + "temperature": 0.2, + "prompt_cache_key": "agent-1", + }, + ) + if err != nil { + t.Fatalf("Chat() error = %v", err) + } + if body["max_completion_tokens"] != float64(123) { + t.Fatalf("max_completion_tokens = %v, want 123", body["max_completion_tokens"]) + } + if _, ok := body["max_tokens"]; ok { + t.Fatalf("did not expect max_tokens in request body") + } + if body["temperature"] != 0.2 { + t.Fatalf("temperature = %v, want 0.2", body["temperature"]) + } + if body["prompt_cache_key"] != "agent-1" { + t.Fatalf("prompt_cache_key = %v, want agent-1", body["prompt_cache_key"]) + } +} + +func TestOpenAISDKProvider_Chat_Timeout(t *testing.T) { + server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + time.Sleep(700 * time.Millisecond) + w.Header().Set("Content-Type", "application/json") + _, _ = w.Write([]byte(`{ + "id":"chatcmpl-123", + "object":"chat.completion", + "created":1, + "model":"gpt-4o", + "choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"ok"}}] + }`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "", WithRequestTimeout(200*time.Millisecond)) + _, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gpt-4o", + nil, + ) + if err == nil { + t.Fatalf("Chat() error = nil, want timeout") + } + if !strings.Contains(err.Error(), "timeout") && + !strings.Contains(err.Error(), "deadline exceeded") && + !strings.Contains(err.Error(), "Client.Timeout exceeded") { + t.Fatalf("timeout error = %q", err.Error()) + } +} + +func TestOpenAISDKProvider_ProxyConfig(t *testing.T) { + p := NewProvider("test-key", "http://example.com/v1", "http://127.0.0.1:10080") + transport, ok := p.httpClient.Transport.(*http.Transport) + if !ok || transport.Proxy == nil { + t.Fatalf("expected proxy transport, got %#v", p.httpClient.Transport) + } +} + +func TestOpenAISDKProvider_Chat_HTTPError(t *testing.T) { + server := 
httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { + w.Header().Set("Content-Type", "application/json") + w.WriteHeader(http.StatusBadRequest) + _, _ = w.Write([]byte(`{"error":{"message":"bad request"}}`)) + })) + defer server.Close() + + p := NewProvider("test-key", server.URL, "") + _, err := p.Chat( + t.Context(), + []Message{{Role: "user", Content: "hi"}}, + nil, + "gpt-4o", + nil, + ) + if err == nil { + t.Fatal("Chat() expected error") + } + errMsg := err.Error() + if !strings.Contains(errMsg, "status=400") { + t.Fatalf("error = %q, want status=400", errMsg) + } +}