
feat: Add support for local network accessible models #19

Open
bmaltais wants to merge 1 commit into NousResearch:main from bmaltais:feature/local-model-api-support

Conversation

@bmaltais

This PR adds support for local, network-accessible models (vLLM, Ollama, etc.) to the Hermes Agent self-evolution tools. Changes include new api_base and api_key parameters on EvolutionConfig, updated DSPy LM initializations that use them, and --api-base / --api-key CLI options.

- Add api_base and api_key parameters to EvolutionConfig
- Update all DSPy LM initializations to use custom endpoints
- Add --api-base and --api-key CLI options to external_importers and evolve_skill
- Enable vLLM, Ollama, and other local model endpoints
- Update documentation for local model configuration

This allows the self-evolution tools to work with:
- Local vLLM instances
- Ollama endpoints
- Any LiteLLM-compatible API

It also enables cost savings compared to expensive cloud APIs.
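The config and CLI wiring described above can be sketched roughly as follows. This is a minimal, stdlib-only illustration, not the PR's actual code: the field names api_base / api_key and the flag names --api-base / --api-key come from the PR, while the default model string and the parse_args helper are assumptions for the example.

```python
# Sketch of the api_base / api_key plumbing this PR describes.
# Field and flag names match the PR; everything else is illustrative.
import argparse
from dataclasses import dataclass
from typing import Optional


@dataclass
class EvolutionConfig:
    model: str
    api_base: Optional[str] = None  # e.g. http://localhost:8000/v1 for vLLM
    api_key: Optional[str] = None   # local servers often accept any placeholder


def parse_args(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument("--model", default="openai/gpt-4o-mini")
    parser.add_argument("--api-base", default=None,
                        help="OpenAI-compatible endpoint (vLLM, Ollama, ...)")
    parser.add_argument("--api-key", default=None)
    args = parser.parse_args(argv)
    return EvolutionConfig(model=args.model,
                           api_base=args.api_base,
                           api_key=args.api_key)


# Pointing the tools at a local Ollama endpoint:
cfg = parse_args(["--api-base", "http://localhost:11434/v1",
                  "--api-key", "ollama"])
print(cfg.api_base)  # -> http://localhost:11434/v1
```

With both fields defaulting to None, existing cloud-provider behavior is unchanged unless the user explicitly passes the new flags.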
Copilot AI review requested due to automatic review settings April 13, 2026 12:44

Copilot AI left a comment

Pull request overview

Adds configurable api_base / api_key plumbing so the self-evolution tooling can target locally/network-hosted OpenAI-compatible endpoints (e.g., vLLM/Ollama) instead of only default provider endpoints.

Changes:

  • Extend EvolutionConfig with api_base and api_key, and thread them through the evolution flow.
  • Pass api_base / api_key into DSPy LM initialization for dataset generation, relevance scoring, and LLM-as-judge scoring.
  • Add CLI flags --api-base / --api-key to both the evolution runner and the external session importer.

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 2 comments.

Summary per file:

- evolution/core/config.py: Adds api_base / api_key to the central run configuration.
- evolution/skills/evolve_skill.py: Exposes CLI flags and config wiring; uses the configured base/key when creating the primary DSPy LM.
- evolution/core/dataset_builder.py: Uses the configured base/key for the synthetic dataset generation LM.
- evolution/core/fitness.py: Uses the configured base/key for the LLM-as-judge scoring LM.
- evolution/core/external_importers.py: Adds CLI flags and threads base/key into the relevance-scoring LM and dataset build entrypoint.


Comment on lines +495 to +499
lm = dspy.LM(
    self.model,
    api_base=self.api_base,
    api_key=self.api_key,
)

Copilot AI Apr 13, 2026

RelevanceFilter.filter_and_score() now reads self.api_base / self.api_key unconditionally when building the DSPy LM. Existing unit tests (and any callers that construct the object without running __init__, e.g. via __new__) will raise AttributeError because those attributes may not exist. Make this robust by defaulting to None via getattr(self, "api_base", None) / getattr(self, "api_key", None) (or define class-level defaults), and ideally only pass these kwargs to dspy.LM when non-None.

Suggested change
lm = dspy.LM(
    self.model,
    api_base=self.api_base,
    api_key=self.api_key,
)
api_base = getattr(self, "api_base", None)
api_key = getattr(self, "api_key", None)
lm_kwargs = {"model": self.model}
if api_base is not None:
    lm_kwargs["api_base"] = api_base
if api_key is not None:
    lm_kwargs["api_key"] = api_key
lm = dspy.LM(**lm_kwargs)
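The failure mode this review comment describes can be reproduced without DSPy. Below is a minimal sketch: RelevanceFilter here is a stand-in for the real class, and lm_kwargs() stands in for the kwargs-building step that would precede the dspy.LM(...) call.

```python
# Why unconditional self.api_base access breaks objects built via __new__
# (as some unit tests do), and how getattr defaulting avoids it.
class RelevanceFilter:
    def __init__(self, model, api_base=None, api_key=None):
        self.model = model
        self.api_base = api_base
        self.api_key = api_key

    def lm_kwargs(self):
        # Robust: fall back to None even if __init__ never ran, and only
        # include the endpoint kwargs when they are actually set.
        kwargs = {"model": getattr(self, "model", None)}
        api_base = getattr(self, "api_base", None)
        api_key = getattr(self, "api_key", None)
        if api_base is not None:
            kwargs["api_base"] = api_base
        if api_key is not None:
            kwargs["api_key"] = api_key
        return kwargs


# Object constructed without running __init__, as a test might do:
f = RelevanceFilter.__new__(RelevanceFilter)
print(f.lm_kwargs())  # -> {'model': None}, no AttributeError

g = RelevanceFilter("openai/gpt-4o-mini", api_base="http://localhost:8000/v1")
print(g.lm_kwargs())
```

Accessing self.api_base directly on the __new__-constructed object would raise AttributeError; the getattr form degrades gracefully instead.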

Comment on lines 141 to +151
  console.print(f"\n[bold]Configuring optimizer[/bold]")
  console.print(f"  Optimizer: GEPA ({iterations} iterations)")
  console.print(f"  Optimizer model: {optimizer_model}")
  console.print(f"  Eval model: {eval_model}")

  # Configure DSPy
- lm = dspy.LM(eval_model)
+ lm = dspy.LM(
+     eval_model,
+     api_base=config.api_base,
+     api_key=config.api_key,
+ )

Copilot AI Apr 13, 2026

optimizer_model is accepted via CLI, stored in EvolutionConfig, printed, and written to metrics, but it is not used to configure any DSPy LM or optimizer behavior here (the configured LM uses eval_model). As a result, --optimizer-model appears to have no effect. Either wire optimizer_model into the LM/optimizer configuration (if supported) or remove/rename the option to avoid misleading users.
