Fetch assistant prompt configuration from Langfuse #177
base: main
Changes from 2 commits
app/models/provider/openai.rb

@@ -62,6 +62,7 @@
       prompt,
       model:,
       instructions: nil,
+      instructions_prompt: nil,
       functions: [],
       function_results: [],
       streamer: nil,
@@ -112,6 +113,7 @@
         model: model,
         input: input_payload,
         output: response.messages.map(&:output_text).join("\n"),
+        prompt: instructions_prompt,
         session_id: session_id,
         user_identifier: user_identifier
       )
@@ -123,6 +125,7 @@
         model: model,
         input: input_payload,
         output: parsed.messages.map(&:output_text).join("\n"),
+        prompt: instructions_prompt,
         usage: raw_response["usage"],
         session_id: session_id,
         user_identifier: user_identifier
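Taken together, the three hunks above add an `instructions_prompt:` keyword to the chat method and forward it to the Langfuse logging helper as `prompt:`. A minimal sketch of the caller side, assuming the signature in the first hunk belongs to the provider's chat entry point and that the Langfuse Ruby client exposes a prompt-fetching call; the method names and the prompt name below are illustrative, not taken from this PR:

```ruby
# Illustrative only: get_prompt/compile are assumed client methods and
# "assistant-instructions" is a placeholder prompt name.
langfuse = Langfuse.new
assistant_prompt = langfuse.get_prompt("assistant-instructions")

provider.chat_response(
  user_message,                              # positional `prompt` argument from the first hunk
  model: "gpt-4.1",
  instructions: assistant_prompt&.compile,   # rendered prompt text (assumed API)
  instructions_prompt: assistant_prompt,     # new keyword, forwarded to Langfuse as prompt:
  functions: [],
  function_results: []
)
```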
@@ -141,26 +144,41 @@
       @langfuse_client = Langfuse.new
     end

-    def log_langfuse_generation(name:, model:, input:, output:, usage: nil, session_id: nil, user_identifier: nil)
+      def log_langfuse_generation(name:, model:, input:, output:, usage: nil, session_id: nil, user_identifier: nil, prompt: nil)
       return unless langfuse_client

       trace = langfuse_client.trace(
         name: "openai.#{name}",
         input: input,
         session_id: session_id,
         user_id: user_identifier
       )
       trace.generation(
         name: name,
         model: model,
         input: input,
         output: output,
         usage: usage,
         session_id: session_id,
         user_id: user_identifier
       )
       trace.update(output: output)
     rescue => e
       Rails.logger.warn("Langfuse logging failed: #{e.message}")
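The remaining added lines of this hunk are not visible in this view. A sketch of how the new `prompt:` keyword might be threaded into the existing calls, reusing only what the old method already does; passing `prompt:` to `trace.generation` is an assumption about the Langfuse Ruby client's prompt-linking support, not something shown in the diff:

```ruby
def log_langfuse_generation(name:, model:, input:, output:, usage: nil, session_id: nil, user_identifier: nil, prompt: nil)
  return unless langfuse_client

  trace = langfuse_client.trace(
    name: "openai.#{name}",
    input: input,
    session_id: session_id,
    user_id: user_identifier
  )
  trace.generation(
    name: name,
    model: model,
    input: input,
    output: output,
    usage: usage,
    prompt: prompt, # assumed keyword: links the generation to the fetched Langfuse prompt
    session_id: session_id,
    user_id: user_identifier
  )
  trace.update(output: output)
rescue => e
  Rails.logger.warn("Langfuse logging failed: #{e.message}")
end
```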
🧰 Tools
🪛 GitHub Check: ci / lint
[failure] 147-147:
Layout/IndentationConsistency: Inconsistent indentation detected.
🤖 Prompt for AI Agents
In app/models/provider/openai.rb around line 147, the method definition `def log_langfuse_generation...` is indented inconsistently. Reformat the line to follow Ruby style (2-space indentation) so it aligns with the other method definitions in the class/module scope: move the `def` left to the same indentation level as the neighboring methods, and keep the parameter list and closing `end` aligned accordingly.
🛠️ Refactor suggestion | 🟠 Major
Extract duplicated `langfuse_client` to a shared concern.

This method is nearly identical to `Provider::OpenAI#langfuse_client` (lines 140-144 in app/models/provider/openai.rb). To follow DRY principles and maintain consistency, consider extracting it to a shared concern (e.g., `LangfuseClientConcern`) that both classes can include.

Create a new concern at app/models/concerns/langfuse_client_concern.rb and include it in both `Assistant::Configurable` and `Provider::OpenAI`; a sketch follows below.

As per coding guidelines.
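A minimal sketch of what the suggested concern could look like, assuming the shared behavior is simply memoizing a `Langfuse.new` client as in the diff above; the rescue guard and `ActiveSupport::Concern` wiring are assumptions, not code from either file:

```ruby
# app/models/concerns/langfuse_client_concern.rb
module LangfuseClientConcern
  extend ActiveSupport::Concern

  private

  # Memoizes the Langfuse client; returns nil (and logs a warning) if the
  # client cannot be initialized, so callers can guard on its presence.
  def langfuse_client
    return @langfuse_client if defined?(@langfuse_client)

    @langfuse_client = Langfuse.new
  rescue => e
    Rails.logger.warn("Langfuse client initialization failed: #{e.message}")
    @langfuse_client = nil
  end
end
```

`Assistant::Configurable` and `Provider::OpenAI` would then `include LangfuseClientConcern` and drop their local copies of the method.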
🤖 Prompt for AI Agents