LangGraph trace in LangSmith doesn't trace past the first level #1680
Summary

Hey, I only just started using LangSmith to log traces, and I'm having an issue: instead of seeing all node calls and their respective LLM calls as shown in the docs, I only see the "top level" of the traces, with no details underneath.

However, I do see the calls individually.

I tested with Python 3.11 and it worked.
Environment
Therefore, this is very likely related to #712 and #720, but I couldn't ascertain from those issues how to work around the problem on 3.10 in a "non-hacky" way, and @hinthornw seemed to be working on this.

Code

Here is some MVP code similar to what I'm doing:

```python
import asyncio
import random
import sys
import uuid
from enum import Enum
from functools import partial
from operator import add
from typing import Annotated

from langchain.prompts import PromptTemplate
from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, StateGraph
from pydantic import BaseModel, Field


class Kinds(str, Enum):
    A = "A"
    B = "B"
    C = "C"


class Detection(BaseModel):
    kind: Kinds | None
    reasoning: str


class GraphInputState(BaseModel):
    text: str
    text_id: int = Field(
        default_factory=partial(random.randint, a=0, b=sys.maxsize)
    )


class GraphState(GraphInputState):
    detections: Annotated[list[Detection], add] = Field(default_factory=list)


builder = StateGraph(GraphState, input=GraphInputState)


def detector_factory(kind: Kinds):
    prompt = PromptTemplate.from_template(
        "State if {text} contains references to {kind}.",
        partial_variables={"kind": kind},
    )
    llm = ChatOpenAI(model="gpt-4o").with_structured_output(Detection)
    chain = prompt | llm

    async def detect(state: GraphState):
        detection = await chain.ainvoke({"text": state.text})
        return {"detections": [detection]}

    detect.__name__ = f"detect_{kind.name.casefold()}"
    return detect


built_detectors = [detector_factory(kind) for kind in Kinds]

for built_detector in built_detectors:
    builder.add_node(built_detector)

for built_detector in built_detectors:
    builder.add_edge(START, built_detector.__name__)
    builder.add_edge(built_detector.__name__, END)

graph = builder.compile()


def batch_analyze(list_of_texts_to_analyze):
    async def process_all():
        tasks = []
        for text in list_of_texts_to_analyze:
            tasks.append(graph.ainvoke(GraphInputState(text=text)))
        results = await asyncio.gather(*tasks)
        return [GraphState.model_validate(result) for result in results]

    return asyncio.run(process_all())


batch_analyze(["test A", "test B"])
```

What am I doing wrong?
Replies: 1 comment
Hi @sebastian-correa

In Python versions earlier than 3.11, asyncio tasks lack proper contextvars support, which can lead to disconnected traces. You'll either have to upgrade your Python version or propagate contextvars manually. I'd recommend reading our troubleshooting guide here for more information: https://docs.smith.langchain.com/observability/how_to_guides/nest_traces
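For anyone on 3.10 who can't upgrade, one way to propagate the context manually in LangGraph (a sketch based on the code above, not taken verbatim from the linked guide) is to accept the node's `RunnableConfig` and forward it to the nested chain call, so the tracing callbacks travel with the call instead of relying on contextvars. This reuses the models and imports from the snippet in the question:

```python
from langchain_core.runnables import RunnableConfig


def detector_factory(kind: Kinds):
    prompt = PromptTemplate.from_template(
        "State if {text} contains references to {kind}.",
        partial_variables={"kind": kind},
    )
    llm = ChatOpenAI(model="gpt-4o").with_structured_output(Detection)
    chain = prompt | llm

    # Declaring a second `config` parameter makes LangGraph pass the node's
    # RunnableConfig into the function. Forwarding it to `ainvoke` keeps the
    # nested LLM call attached to the parent trace even when contextvars are
    # not propagated across asyncio tasks (Python < 3.11).
    async def detect(state: GraphState, config: RunnableConfig):
        detection = await chain.ainvoke({"text": state.text}, config=config)
        return {"detections": [detection]}

    detect.__name__ = f"detect_{kind.name.casefold()}"
    return detect
```

Upgrading to Python 3.11 or later makes this forwarding unnecessary, as noted above.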



