Conversation

@mattt (Owner) commented Dec 10, 2025

Related to #53

Copilot AI left a comment

Pull request overview

This PR adds a configuration option for llama.cpp logging by introducing a LogLevel enum and a global static property to control log verbosity. The implementation addresses issue #53 by allowing developers to filter llama.cpp log output through a custom callback function.

Key Changes:

  • Added LogLevel enum with five levels (none, debug, info, warn, error) that map to ggml log levels
  • Implemented a global log callback function (llamaLogCallback) that filters messages based on the configured level
  • Added static logLevel property on LlamaLanguageModel that configures the logging behavior for all instances; a sketch of how these pieces fit together follows below
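
For context, here is a minimal sketch of how these pieces could fit together. It assumes the package's llama.cpp bindings expose `llama_log_set` and `ggml_log_level` from the upstream C API; the `LogLevel` cases, the `llamaLogCallback` name, and the static `logLevel` property come from the summary above, while the raw values, the default level, and the callback body are illustrative assumptions rather than the PR's actual code.

```swift
import Foundation
import llama  // assumed module name for the llama.cpp bindings used by this package

/// Log verbosity levels, mirroring ggml's ordering (debug < info < warn < error).
/// The raw values here are an assumption; the PR may map them differently.
public enum LogLevel: Int32, Comparable {
    case none = 0
    case debug = 1
    case info = 2
    case warn = 3
    case error = 4

    public static func < (lhs: LogLevel, rhs: LogLevel) -> Bool {
        lhs.rawValue < rhs.rawValue
    }
}

/// C-compatible callback handed to `llama_log_set`.
/// It drops any message whose level is below the configured threshold.
private let llamaLogCallback: @convention(c) (
    ggml_log_level, UnsafePointer<CChar>?, UnsafeMutableRawPointer?
) -> Void = { level, text, _ in
    let threshold = LlamaLanguageModel.logLevel
    guard threshold != .none,                        // .none silences all output
          Int32(level.rawValue) >= threshold.rawValue,
          let text
    else { return }
    fputs(String(cString: text), stderr)
}

/// Stand-in for the library's class so the sketch is self-contained;
/// in the PR the property lives on the real `LlamaLanguageModel`.
public final class LlamaLanguageModel {
    /// Global log verbosity shared by all instances.
    /// Assigning a value (re)installs the filtering callback.
    public static var logLevel: LogLevel = .warn {
        didSet { llama_log_set(llamaLogCallback, nil) }
    }
}
```

With something like this in place, an application could keep only errors with `LlamaLanguageModel.logLevel = .error`, or suppress llama.cpp output entirely with `.none`.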

@mattt force-pushed the mattt/llama-log-level branch from 3d9bfbf to 596dfc4 on December 10, 2025 at 12:53
@mattt merged commit 4be4168 into main on December 10, 2025
3 checks passed
@mattt deleted the mattt/llama-log-level branch on December 10, 2025 at 12:59