LangGraph

LangGraph is an open-source framework for building stateful, multi-agent applications powered by LLMs. It extends the LangChain ecosystem by enabling developers to define agents and workflows as dynamic computation graphs, where each node represents a function or agent and edges define conditional logic and transitions. Designed for both flexibility and scalability, LangGraph excels at use cases requiring memory, branching logic, and cyclic behavior—such as agent collaboration, simulations, or tool-augmented reasoning.

This tutorial will guide you through integrating Arcee AI language models into LangGraph using an OpenAI-compatible endpoint, allowing you to leverage Arcee's specialized models within LangGraph's agent framework.

Prerequisites

  • Python: >=3.10 and <3.14
  • An OpenAI-compatible Arcee AI endpoint (the script below defaults to http://127.0.0.1:8080/v1) and an API key if your endpoint requires one

Integration Steps

  1. Create a folder for your project

mkdir arceeai_langgraph && cd arceeai_langgraph
  2. Set up a Python virtual environment and install LangGraph tooling

curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env

uv venv --python 3.12 --seed
source .venv/bin/activate

uv pip install --pre -U langgraph langchain-openai
  3. Create a new Python file called arceeai_langgraph.py and copy in the following contents

import os
from typing_extensions import Annotated, TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langchain_core.messages import AnyMessage, SystemMessage, HumanMessage
from langchain_openai import ChatOpenAI

class State(TypedDict):
    messages: Annotated[list[AnyMessage], add_messages]

# Configure Arcee AI Model
ARCEE_BASE = os.getenv("OPENAI_API_BASE", "http://127.0.0.1:8080/v1")
ARCEE_KEY  = os.getenv("OPENAI_API_KEY", "your-arcee-api-key")
ARCEE_MODEL = os.getenv("OPENAI_MODEL_NAME", "afm-4.5b")

# Initialize Arcee AI model with OpenAI-compatible configuration
arcee_llm = ChatOpenAI(
    model=ARCEE_MODEL,
    api_key=ARCEE_KEY,
    base_url=ARCEE_BASE,
)

# Define a simple graph node that uses the Arcee AI model
def summarize(state: State) -> State:
    # Add a system instruction once, if not present
    msgs = state["messages"]
    if not any(isinstance(m, SystemMessage) for m in msgs):
        msgs = [SystemMessage(content="You are a concise technical writer.")] + msgs
    # Ask Arcee to respond
    ai = arcee_llm.invoke(msgs)
    return {"messages": [ai]}

# Build the graph that LangGraph will execute
builder = StateGraph(State)
builder.add_node("summarize", summarize)
builder.add_edge(START, "summarize")
builder.add_edge("summarize", END)
graph = builder.compile()

if __name__ == "__main__":
    text = """Arcee AI is a foundation model provider with a focus on building the highest performing models per parameter. 
They offer a range of models from on-device and edge optimized models to large language models. Their suite of models 
provides customers with the flexibility to choose the right model for the right task. All models are released Apache 2.0 
enabling the community to use safe, built-in-the-US models in their own environment or via the Arcee AI API platform."""
    inputs: State = {"messages": [HumanMessage(content=f"Summarize the following in three bullets:\n\n{text}")]}

    # Execute the LangGraph agent
    result = graph.invoke(inputs)
    
    # Print the results
    print("\n=== RESULT ===\n")
    print(result["messages"][-1].content)
  4. Run your Arcee AI-powered LangGraph agent

python arceeai_langgraph.py
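Because the script reads its configuration from environment variables, you can point it at a hosted endpoint instead of the local default without editing the code. The URL and key below are placeholders, not real values:

```shell
# Placeholder values; substitute your actual endpoint, API key, and model name
export OPENAI_API_BASE="https://your-arcee-endpoint/v1"
export OPENAI_API_KEY="your-arcee-api-key"
export OPENAI_MODEL_NAME="afm-4.5b"
python arceeai_langgraph.py
```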
