CrewAI
CrewAI is an open-source framework for orchestrating AI agents. It enables agents to work together seamlessly on complex tasks and multi-agent workflows. With built-in support for LiteLLM, CrewAI can connect to many different language models (LLMs), including Arcee models served through OpenAI-compatible endpoints.
This tutorial will walk you through setting up Arcee AI as your LLM provider inside a CrewAI agent pipeline.
Prerequisites
Python >=3.10 and <3.14
An Arcee AI model running locally or accessible through an OpenAI-compatible API endpoint
Quickstart
Environment and project setup:
# Create project folder
mkdir arceeai_crewai && cd arceeai_crewai
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env
# Create and activate virtual environment
uv venv --python 3.12 --seed
source .venv/bin/activate
# Install CrewAI
uv pip install crewai
If you run into any errors installing CrewAI, follow their Installation Guide.
Create a new Python file called arceeai_crewai.py with the following:
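The sketch below is a minimal version of arceeai_crewai.py. It uses CrewAI's built-in LLM class (backed by LiteLLM) plus a single agent and task to verify the connection; the ARCEE_BASE, ARCEE_KEY, and ARCEE_MODEL values are placeholders for a locally served model, so substitute the details of your own deployment.
# arceeai_crewai.py
from crewai import Agent, Task, Crew, LLM

# Connection details for the Arcee AI model (placeholder values -- adjust for your setup).
# The "openai/" prefix tells LiteLLM to use the OpenAI-compatible protocol.
ARCEE_BASE = "http://localhost:8080/v1"
ARCEE_KEY = "not-needed-for-local"
ARCEE_MODEL = "openai/arcee-ai/AFM-4.5B"

# Wrap the Arcee endpoint in CrewAI's LLM class so agents can use it.
arcee_llm = LLM(model=ARCEE_MODEL, base_url=ARCEE_BASE, api_key=ARCEE_KEY)

# A single agent and task, just to confirm the model responds end to end.
researcher = Agent(
    role="Research Assistant",
    goal="Answer questions clearly and concisely",
    backstory="A helpful assistant powered by an Arcee AI model.",
    llm=arcee_llm,
)

task = Task(
    description="Explain in two sentences what CrewAI is used for.",
    expected_output="A short, plain-English explanation.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()
print(result)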
This works out of the box if you have an Arcee AI model running locally on your laptop. If you do not, change ARCEE_BASE, ARCEE_KEY, and ARCEE_MODEL to point at your deployment.
You can also set up a .env file to store the configuration.
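For example, with a .env file that defines ARCEE_BASE, ARCEE_KEY, and ARCEE_MODEL, you could load the values at startup. The sketch below assumes the python-dotenv package (uv pip install python-dotenv) rather than anything built into CrewAI.
# .env (same folder as arceeai_crewai.py):
#   ARCEE_BASE=http://localhost:8080/v1
#   ARCEE_KEY=not-needed-for-local
#   ARCEE_MODEL=openai/arcee-ai/AFM-4.5B
import os
from dotenv import load_dotenv

load_dotenv()  # read the .env file into the process environment

ARCEE_BASE = os.environ["ARCEE_BASE"]
ARCEE_KEY = os.environ["ARCEE_KEY"]
ARCEE_MODEL = os.environ["ARCEE_MODEL"]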
Run your Arcee AI powered CrewAI Agent
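With the virtual environment activated, run the script:
python arceeai_crewai.py
The crew starts, the agent sends its task to the Arcee model, and the final answer is printed to the terminal.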