Add PerplexityLLMAdapter for message ordering strictness #4009
Codecov / codecov/project
succeeded
Mar 12, 2026 in 1s
40.46% (+0.05%) compared to 1c676c2
Codecov Report
❌ Patch coverage is 80.39216% with 10 lines in your changes missing coverage. Please review.
| Files with missing lines | Coverage Δ | |
|---|---|---|
| ...rc/pipecat/adapters/services/perplexity_adapter.py | 100.00% <100.00%> (ø) | |
| src/pipecat/services/openai/base_llm.py | 58.06% <ø> (ø) | |
| src/pipecat/services/cerebras/llm.py | 63.63% <0.00%> (-4.11%) | ⬇️ |
| src/pipecat/services/fireworks/llm.py | 63.63% <0.00%> (-4.11%) | ⬇️ |
| src/pipecat/services/mistral/llm.py | 35.21% <0.00%> (-1.03%) | ⬇️ |
| src/pipecat/services/perplexity/llm.py | 43.47% <60.00%> (+1.29%) | ⬆️ |
| src/pipecat/services/sambanova/llm.py | 51.61% <0.00%> (-1.14%) | ⬇️ |
... and 7 files with indirect coverage changes
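For context on the "message ordering strictness" this PR addresses: Perplexity's chat API requires that, after any leading system messages, user and assistant roles strictly alternate. Below is a minimal sketch of how an adapter might normalize a message list to satisfy that constraint by merging consecutive same-role messages. The function name and merging strategy are assumptions for illustration, not Pipecat's actual `PerplexityLLMAdapter` implementation:

```python
from typing import Dict, List


def enforce_alternating_roles(messages: List[Dict[str, str]]) -> List[Dict[str, str]]:
    """Merge consecutive same-role messages so roles strictly alternate.

    Hypothetical helper; Pipecat's real adapter may handle this differently.
    """
    normalized: List[Dict[str, str]] = []
    for msg in messages:
        if normalized and normalized[-1]["role"] == msg["role"]:
            # Concatenate content rather than emitting two same-role turns in a row.
            normalized[-1] = {
                "role": msg["role"],
                "content": normalized[-1]["content"] + "\n" + msg["content"],
            }
        else:
            normalized.append(dict(msg))
    return normalized
```

A list like `[user, user, assistant]` would collapse to `[user, assistant]`, with the two user messages joined into one turn.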