Does this example with FastAPI + LangSmith tracing actually work? (FastAPI [standard] 0.119.1+, LangChain 1.0.1+, LangSmith 0.4.37+) #2082

@JulioPeixoto

Description

I am trying to set up LangSmith tracing in a FastAPI app, and I'd like to check whether the following minimal example should work or whether I am missing something essential. As written, it doesn't work for me — why not?

Dependencies:

  • fastapi[standard] >= 0.119.1
  • langchain >= 1.0.1
  • langchain-openai >= 1.0.0
  • langsmith >= 0.4.37

Minimal example:
import os

from fastapi import FastAPI
from pydantic import BaseModel
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from langsmith import traceable

from config import settings

os.environ["LANGSMITH_API_KEY"] = "lsv2_pt_…"
os.environ["LANGSMITH_PROJECT"] = "pr-..."

model = ChatOpenAI(
    model=settings.MODEL_DEPLOYMENT,
    api_key=settings.OPENAI_API_KEY,
)

class ChatRequest(BaseModel):
    message: str

@traceable
def chat_service(question: str) -> str:
    # model.invoke returns an AIMessage; return its text content
    return model.invoke([HumanMessage(content=question)]).content

app = FastAPI()

@app.post("/chat")
async def chat(request: ChatRequest):
    return chat_service(request.message)
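One thing I'd double-check (this is a guess on my part, not something confirmed by the snippet above): the LangSmith SDK generally only exports runs from `@traceable` when tracing is explicitly switched on via the `LANGSMITH_TRACING` environment variable — the API key and project alone are not enough. A minimal sketch of the extra flag, set before anything LangSmith-related is imported:

```python
import os

# Assumption: @traceable silently runs the wrapped function but sends
# nothing to LangSmith unless this flag is truthy.
os.environ["LANGSMITH_TRACING"] = "true"  # enable trace export

# API key and project as in the example above (values elided here).

# Sanity check that the flag is set the way the SDK reads it.
assert os.environ["LANGSMITH_TRACING"] == "true"
```

In a real deployment these would normally live in a `.env` file or the process environment rather than being hard-coded in the module.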
