Add PerplexityLLMAdapter for message ordering strictness #4009

Merged: kompfner merged 5 commits into main from pk/perplexity-message-ordering-strictness on Mar 12, 2026
Conversation

kompfner (Contributor) commented Mar 12, 2026

Summary

  • Added PerplexityLLMAdapter that transforms conversation messages to satisfy Perplexity's stricter API constraints before sending requests. Perplexity requires strict role alternation (user/tool ↔ assistant), no non-initial system messages, and the last message to be user/tool — conversation histories that work fine with OpenAI can cause errors with Perplexity (PerplexityLLMService subclasses OpenAILLMService).
  • The adapter converts non-initial system messages to user, merges consecutive same-role messages (preserving consecutive initial system messages, which Perplexity allows), and removes trailing assistant messages.
  • Initial system messages are intentionally never converted to "user" — Perplexity appears to have statefulness within a conversation, so a message that was "user" in one call but becomes "system" in the next causes errors.
  • PerplexityLLMService now uses this adapter via adapter_class = PerplexityLLMAdapter.
  • Added missing dual-system-instruction warnings to Cerebras, Fireworks, Mistral, Perplexity, and SambaNova services, matching the existing BaseOpenAILLMService behavior.
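The three transformation steps above can be sketched roughly as follows. This is a minimal, hypothetical sketch: the function name `normalize_messages` and the `"\n\n"` merge separator are assumptions, not the actual implementation, which lives in `src/pipecat/adapters/services/perplexity_adapter.py`.

```python
# Hypothetical sketch of the message normalization described above.
# The function name and merge separator are assumptions; the real
# logic lives in PerplexityLLMAdapter.


def normalize_messages(messages: list[dict]) -> list[dict]:
    # Step 1: convert non-initial system messages to "user". The run of
    # system messages at the very start is preserved, since Perplexity
    # allows multiple initial system messages.
    converted = []
    in_initial_run = True
    for msg in messages:
        if msg.get("role") != "system":
            in_initial_run = False
        if msg.get("role") == "system" and not in_initial_run:
            msg = {**msg, "role": "user"}
        converted.append(msg)

    # Step 2: merge consecutive same-role messages, skipping
    # system-system pairs so the initial system run stays intact.
    merged: list[dict] = []
    for msg in converted:
        if merged and merged[-1]["role"] == msg["role"] and msg["role"] != "system":
            merged[-1] = {
                **merged[-1],
                "content": merged[-1]["content"] + "\n\n" + msg["content"],
            }
        else:
            merged.append(dict(msg))

    # Step 3: drop trailing assistant messages so the history ends with
    # a user/tool message, as Perplexity requires.
    while merged and merged[-1]["role"] == "assistant":
        merged.pop()

    return merged
```

For example, a history of two initial system messages, a user turn, a stray mid-conversation system note, and two trailing assistant messages would come out as two system messages followed by one merged user message.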

Testing

uv run pytest tests/test_get_llm_invocation_params.py -v

11 test cases cover: standard passthrough, initial system preservation, multiple initial system preservation, consecutive same-role merging, non-initial system conversion, trailing assistant removal, system-only context preserved, system exposed after trailing assistant removal, consecutive assistant merge+removal, tool message preservation, and empty messages.

🤖 Generated with Claude Code

kompfner added a commit that referenced this pull request Mar 12, 2026
codecov bot commented Mar 12, 2026

Codecov Report

❌ Patch coverage is 80.39216% with 10 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/pipecat/services/cerebras/llm.py | 0.00% | 2 Missing ⚠️ |
| src/pipecat/services/fireworks/llm.py | 0.00% | 2 Missing ⚠️ |
| src/pipecat/services/mistral/llm.py | 0.00% | 2 Missing ⚠️ |
| src/pipecat/services/perplexity/llm.py | 60.00% | 2 Missing ⚠️ |
| src/pipecat/services/sambanova/llm.py | 0.00% | 2 Missing ⚠️ |

| Files with missing lines | Coverage Δ |
|---|---|
| ...rc/pipecat/adapters/services/perplexity_adapter.py | 100.00% <100.00%> (ø) |
| src/pipecat/services/openai/base_llm.py | 58.06% <ø> (ø) |
| src/pipecat/services/cerebras/llm.py | 63.63% <0.00%> (-4.11%) ⬇️ |
| src/pipecat/services/fireworks/llm.py | 63.63% <0.00%> (-4.11%) ⬇️ |
| src/pipecat/services/mistral/llm.py | 35.21% <0.00%> (-1.03%) ⬇️ |
| src/pipecat/services/perplexity/llm.py | 43.47% <60.00%> (+1.29%) ⬆️ |
| src/pipecat/services/sambanova/llm.py | 51.61% <0.00%> (-1.14%) ⬇️ |

... and 7 files with indirect coverage changes


kompfner added a commit that referenced this pull request Mar 12, 2026
@kompfner kompfner force-pushed the pk/perplexity-message-ordering-strictness branch from 102fa69 to cced0b7 Compare March 12, 2026 18:36
…straints

Perplexity's API is stricter than OpenAI about conversation history:
- Requires strict alternation between user/tool and assistant messages
- Disallows system messages except as the initial message
- Requires the last message to be user or tool

The new adapter transforms messages before sending to satisfy all three
constraints: merging consecutive initial system messages, converting
non-initial system to user, merging consecutive same-role messages, and
removing trailing assistant messages.

Also adds dual-system-instruction warnings to Cerebras, Fireworks,
Mistral, Perplexity, and SambaNova services (matching the existing
BaseOpenAILLMService pattern), and updates the warning text in
BaseOpenAILLMService to be more descriptive.
@kompfner kompfner force-pushed the pk/perplexity-message-ordering-strictness branch from cced0b7 to e4bf628 Compare March 12, 2026 18:56
Perplexity allows multiple initial system messages, so don't merge them.
Instead, skip system-system pairs during the consecutive same-role merge
step. Broaden the trailing message fix to convert any trailing system
message to user (not just a lone system message), so contexts with only
system messages don't fail.
Add test exercising the step 3 ordering where stripping a trailing
assistant exposes a system message that then gets converted to user.
Move the reasoning about when a trailing system message can occur
into the docstring.
@kompfner kompfner marked this pull request as draft March 12, 2026 19:37
Perplexity appears to have statefulness within a conversation, so
converting a system message to "user" in one call and then back to
"system" in the next (after more messages are appended) causes API
errors. Remove the trailing system→user conversion entirely — if the
context only has system messages, the API call will fail but the
mistake will be caught right away.
@kompfner kompfner force-pushed the pk/perplexity-message-ordering-strictness branch from 8a8bb38 to 99f2812 Compare March 12, 2026 20:08
@kompfner kompfner marked this pull request as ready for review March 12, 2026 20:08
aconchillo (Contributor) commented:

LGTM!

@kompfner kompfner merged commit 30d95e3 into main Mar 12, 2026
6 checks passed
@kompfner kompfner deleted the pk/perplexity-message-ordering-strictness branch March 12, 2026 20:51