
OpenRouter

OpenRouter is a unified platform that provides developers with seamless access to a wide range of large language models (LLMs) through a single OpenAI-compatible API interface. It enables model interoperability by abstracting away provider-specific differences, allowing users to easily route requests between open-source and proprietary models while managing authentication, quotas, and usage tracking in one place.

Arcee AI's foundation models are among OpenRouter's supported models, making it quick and easy to get started. This tutorial will guide you through invoking Arcee AI's language models on OpenRouter using an OpenAI-compatible endpoint.

Prerequisites

  • OpenRouter Account

    • If you don't have an account, set one up here.

  • OpenRouter API Key

    • If you don't have an API Key, create one here.

Integration Steps

  1. Create a folder for your project

mkdir arceeai_openrouter && cd arceeai_openrouter
  2. Set up a Python virtual environment and install the OpenAI client

curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env

uv venv --python 3.12 --seed
source .venv/bin/activate

uv pip install openai
  3. Create a new Python file called arceeai_openrouter.py and copy in the following contents

from openai import OpenAI

client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key="<OPENROUTER_API_KEY>", # Replace with your OpenRouter API key
)

completion = client.chat.completions.create(
  model="arcee-ai/afm-4.5b", # Replace with the Arcee AI model you want to invoke
  messages=[
    {
      "role": "user",
      "content": "What are small language models and how do they compare to LLMs?"
    }
  ]
)
print(completion.choices[0].message.content)
  4. Run your Arcee AI-powered OpenRouter completion

python arceeai_openrouter.py
