LangSmith

LangSmith is an end-to-end platform for debugging, evaluating, and monitoring language model applications. It integrates seamlessly with LangChain and other LLM frameworks to visualize traces, inspect prompts, measure performance, and manage production-grade evaluation workflows. LangSmith helps you build more reliable AI systems by turning opaque model calls into structured, traceable data through observability and experiment tracking.

This tutorial shows how to integrate Arcee AI models into LangSmith using an OpenAI-compatible endpoint. While the focus is on tracing, the same setup applies to other LangSmith features.


Prerequisites

  • Python: >=3.10 and <3.14

  • LangSmith Account and API Key:

    • If you do not have one, you can create one at smith.langchain.com

  • An Arcee AI model running locally or accessible via an OpenAI-compatible API endpoint (a quick connectivity check is sketched below)
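
For example, most OpenAI-compatible servers expose a model listing at <base-url>/models that you can use to confirm the endpoint is reachable. The URL and key below are placeholders; substitute your own deployment's values:

# Quick connectivity check against an OpenAI-compatible endpoint
curl -H "Authorization: Bearer your-arcee-api-key" http://localhost:8000/v1/models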

Quickstart

Environment and project setup:

# Create project folder
mkdir arceeai_langsmith && cd arceeai_langsmith

# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env

# Create and activate virtual environment
uv venv --python 3.12 --seed
source .venv/bin/activate

# Install LangSmith + LangChain OpenAI client
uv pip install langsmith langchain-openai

LangSmith is configured through environment variables, and we'll keep the Arcee AI connection details in the same place. Create a .env file in the project folder.
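A minimal sketch of the file: the LANGSMITH_* variables enable tracing and select a project, while the ARCEE_* names are placeholders of our choosing that the script below reads. Adjust the values to match your deployment:

# LangSmith configuration
LANGSMITH_TRACING=true
LANGSMITH_API_KEY=your-langsmith-api-key
LANGSMITH_PROJECT=arceeai-langsmith

# Arcee AI endpoint (placeholder names referenced by the script below)
ARCEE_BASE_URL=http://localhost:8000/v1
ARCEE_API_KEY=your-arcee-api-key
ARCEE_MODEL=your-arcee-model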

Next, create a Python file called arceeai_langsmith.py.
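The sketch below assumes the .env values have been exported into the environment (one way to do that is shown in the next step). It points an OpenAI-compatible chat client at the Arcee AI endpoint and traces a single call; the function name and prompt are illustrative placeholders:

import os

from langchain_openai import ChatOpenAI
from langsmith import traceable

# Point the OpenAI-compatible client at the Arcee AI endpoint.
# ARCEE_BASE_URL, ARCEE_API_KEY, and ARCEE_MODEL come from the .env above.
llm = ChatOpenAI(
    base_url=os.environ["ARCEE_BASE_URL"],
    api_key=os.environ["ARCEE_API_KEY"],
    model=os.environ.get("ARCEE_MODEL", "your-arcee-model"),
)

# With LANGSMITH_TRACING=true, LangChain calls are traced automatically;
# @traceable wraps the function in a parent run so the LLM call is nested under it.
@traceable(name="arcee-quickstart")
def ask(question: str) -> str:
    return llm.invoke(question).content

if __name__ == "__main__":
    print(ask("In one sentence, what does LangSmith do?"))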

Finally, run your Arcee AI-powered script with LangSmith tracing enabled and open the resulting trace in the LangSmith UI.
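
One way to load the .env values and run the script from the activated environment (assuming a bash- or zsh-compatible shell):

# Export the .env variables into the current shell, then run the script
set -a && source .env && set +a
python arceeai_langsmith.py

The run should then appear under the configured LANGSMITH_PROJECT in LangSmith, with the chat model call nested beneath the traced arcee-quickstart run.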
