
Conversation

@Pouyanpi
Collaborator

Description

The LLM isolation code was called too early in __init__, before all components were fully initialized. This caused flow matching to fail when trying to resolve rail flow IDs. This issue was introduced in #1342.

Move _create_isolated_llms_for_actions() call to after KB setup to ensure all initialization is complete before creating isolated LLMs.
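For context, the fix amounts to reordering calls in __init__ roughly as sketched below. Only _init_llms() and _create_isolated_llms_for_actions() are named in this PR; the constructor body and the _init_kb() name are illustrative assumptions, not the actual LLMRails code.

class LLMRails:
    def __init__(self, config):
        self.config = config
        self._init_llms()                         # create the main LLMs
        self._init_kb()                           # KB / remaining setup (hypothetical name)
        self._create_isolated_llms_for_actions()  # now safe: everything is registered

    def _init_llms(self):
        # Previously, the isolated per-action LLMs were also created here,
        # before all flows/actions were registered.
        ...

    def _init_kb(self):
        # Hypothetical stand-in for the knowledge base setup step.
        ...

    def _create_isolated_llms_for_actions(self):
        # Runs last, so rail flow IDs resolve against fully initialized
        # components instead of failing with "No action found for flow_id".
        ...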

@Pouyanpi Pouyanpi added this to the v0.16.0 milestone Aug 22, 2025
@Pouyanpi Pouyanpi requested a review from Copilot August 22, 2025 10:34
@Pouyanpi Pouyanpi self-assigned this Aug 22, 2025
@Pouyanpi Pouyanpi added the bug Something isn't working label Aug 22, 2025

Copilot AI left a comment


Pull Request Overview

Fixes a timing issue where LLM isolation setup was called too early in initialization, causing flow matching to fail when trying to resolve rail flow IDs. The fix moves the LLM isolation call to after knowledge base initialization to ensure all components are properly set up.

  • Moved _create_isolated_llms_for_actions() call from _init_llms() to after KB setup in __init__()
  • Added error handling for flow matching failures, with a fallback to all actions requiring LLMs (see the sketch after this list)
  • Updated tests to remove prefix matching functionality and test the timing fix scenario
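
The error-handling fallback described in the second bullet can be pictured roughly as follows; every name here is illustrative, not taken from the codebase.

import logging

log = logging.getLogger(__name__)

def actions_needing_isolated_llms(rail_flow_ids, flow_id_to_action, all_actions):
    """Sketch of the fallback: resolve each configured rail flow ID to its
    action; if any flow cannot be matched, treat every action as requiring
    an isolated LLM instead of raising during initialization."""
    try:
        return [flow_id_to_action[flow_id] for flow_id in rail_flow_ids]
    except KeyError as exc:
        log.warning("Failed to match rail flow to an action: %s", exc)
        return list(all_actions)  # fallback: all actions require LLMs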

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.

  • nemoguardrails/rails/llm/llmrails.py: Moves LLM isolation setup and adds error handling for flow matching
  • nemoguardrails/rails/llm/utils.py: Removes prefix matching logic from flow ID resolution
  • tests/test_rails_llm_utils.py: Updates tests to remove prefix matching test cases
  • tests/test_llm_isolation.py: Adds a test for the timing issue with an empty flows scenario
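
As a rough illustration of the utils.py change, flow ID resolution now relies on an exact match only; the function name and signature below are hypothetical, not the real helper in nemoguardrails/rails/llm/utils.py.

def action_for_flow_id(flow_id, flow_id_to_action):
    """Sketch: resolve a rail flow ID to an action by exact lookup only.
    Because isolation now runs after initialization is complete, the mapping
    already contains all rail flows, so prefix matching is no longer needed
    as a workaround."""
    if flow_id not in flow_id_to_action:
        raise ValueError(f"No action found for flow_id: {flow_id}")
    return flow_id_to_action[flow_id]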


@Pouyanpi Pouyanpi force-pushed the fix/flow-id-not-found-error branch from 4a0bb86 to 4971484 on August 22, 2025 10:38
The LLM isolation code was called too early in __init__, before all
components were fully initialized. This caused flow matching to fail
when trying to resolve rail flow IDs.

Move _create_isolated_llms_for_actions() call to after KB setup to
ensure all initialization is complete before creating isolated LLMs.
@Pouyanpi Pouyanpi force-pushed the fix/flow-id-not-found-error branch from 4971484 to 1fd71cc on August 22, 2025 10:40
@codecov-commenter

codecov-commenter commented Aug 22, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 71.59%. Comparing base (d1b35c8) to head (494dae3).
⚠️ Report is 1 commit behind head on develop.

Additional details and impacted files
@@           Coverage Diff            @@
##           develop    #1348   +/-   ##
========================================
  Coverage    71.59%   71.59%           
========================================
  Files          168      168           
  Lines        16861    16862    +1     
========================================
+ Hits         12071    12072    +1     
  Misses        4790     4790           
Flag | Coverage Δ
python | 71.59% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Files with missing lines | Coverage Δ
nemoguardrails/rails/llm/llmrails.py | 90.73% <100.00%> (+0.09%) ⬆️
nemoguardrails/rails/llm/utils.py | 100.00% <100.00%> (ø)

@Pouyanpi Pouyanpi requested a review from tgasser-nv August 22, 2025 14:32
Collaborator

@tgasser-nv tgasser-nv left a comment


I added a couple of questions; could you take a look before merging? Also, could you run a local test with a config that reproduced the error before the fix, and confirm it no longer occurs with this PR applied?

@Pouyanpi
Collaborator Author

Pouyanpi commented Aug 22, 2025

Also, could you run a local test with a config that reproduced the error before the fix, and confirm it no longer occurs with this PR applied?

On the develop branch (before the fix):

from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./examples/configs/nemoguards")

rails = LLMRails(config)
/Library/Caches/pypoetry/virtualenvs/nemoguardrails-BZdXtzBq-py3.12/lib/python3.12/site-packages/langchain_nvidia_ai_endpoints/_common.py:212: UserWarning: Found nvidia/llama-3.1-nemoguard-8b-content-safety in available_models, but type is unknown and inference may fail.
  warnings.warn(
/Library/Caches/pypoetry/virtualenvs/nemoguardrails-BZdXtzBq-py3.12/lib/python3.12/site-packages/langchain_nvidia_ai_endpoints/_common.py:212: UserWarning: Found nvidia/llama-3.1-nemoguard-8b-topic-control in available_models, but type is unknown and inference may fail.
  warnings.warn(
Failed to create isolated LLMs for actions: No action found for flow_id: content safety check input $model=content_safety

After the fix:

from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./examples/configs/nemoguards")

rails = LLMRails(config)
/Library/Caches/pypoetry/virtualenvs/nemoguardrails-BZdXtzBq-py3.12/lib/python3.12/site-packages/langchain_nvidia_ai_endpoints/_common.py:212: UserWarning: Found nvidia/llama-3.1-nemoguard-8b-content-safety in available_models, but type is unknown and inference may fail.
  warnings.warn(
/Library/Caches/pypoetry/virtualenvs/nemoguardrails-BZdXtzBq-py3.12/lib/python3.12/site-packages/langchain_nvidia_ai_endpoints/_common.py:212: UserWarning: Found nvidia/llama-3.1-nemoguard-8b-topic-control in available_models, but type is unknown and inference may fail.
  warnings.warn(

@Pouyanpi Pouyanpi merged commit 3fdd65d into develop Aug 22, 2025
17 checks passed
@Pouyanpi Pouyanpi deleted the fix/flow-id-not-found-error branch August 22, 2025 16:00
Pouyanpi added a commit that referenced this pull request Aug 25, 2025
…1348)

* fix(llmrails): move LLM isolation setup to after KB initialization

The LLM isolation code was called too early in __init__, before all
components were fully initialized. This caused flow matching to fail
when trying to resolve rail flow IDs.

Move _create_isolated_llms_for_actions() call to after KB setup to
ensure all initialization is complete before creating isolated LLMs.
Pouyanpi added a commit that referenced this pull request Oct 1, 2025
…1348)

* fix(llmrails): move LLM isolation setup to after KB initialization

The LLM isolation code was called too early in __init__, before all
components were fully initialized. This caused flow matching to fail
when trying to resolve rail flow IDs.

Move _create_isolated_llms_for_actions() call to after KB setup to
ensure all initialization is complete before creating isolated LLMs.
