
Together.ai

Together.ai is an open platform for running, fine-tuning, and deploying large language models (LLMs) with high performance and low latency. It provides access to a curated collection of language models, including Arcee AI's foundation models. Beyond inference, Together.ai supports distributed fine-tuning, model evaluation, and custom deployments, making it a flexible choice for teams building production-grade AI applications.

Arcee AI's foundation models are among Together.ai's supported models, so getting started is quick and easy. This tutorial walks you through invoking Arcee AI's language models on Together.ai with the Together Python SDK.

Prerequisites

  • Together.ai Account

  • Together.ai API Key

    • If you don't have one, create one here.

Integration Steps

  1. Create a folder for your project

mkdir arceeai_together && cd arceeai_together
  2. Set up a Python virtual environment and install the Together client (an optional install check follows the commands)

# Install uv and load it into the current shell
curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.local/bin/env

# Create and activate a Python 3.12 virtual environment
uv venv --python 3.12 --seed
source .venv/bin/activate

# Install the Together Python SDK
uv pip install together
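
Optionally, confirm the client installed correctly before moving on. A minimal check using Python's standard importlib.metadata, run inside the activated virtual environment:

from importlib.metadata import version

# Prints the version of the together package installed by the previous step
print(version("together"))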
  3. Create a new Python file called arceeai_together.py and copy in the following contents

from together import Together

client = Together(
    api_key="<YOUR_TOGETHER_API_KEY>"  # Replace with your Together.ai API key, or load it from an environment variable (see the sketch after this code block)
)

response = client.chat.completions.create(
    model="arcee-ai/AFM-4.5B", # Replace with the Arcee AI model you want to invoke
    messages=[
        {
          "role": "user",
          "content": "What are small language models and how do they compare to LLMs?"
        }
    ]
)

print(response.choices[0].message.content)
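
Hardcoding the key is fine for a quick test, but for anything shared or deployed you will likely want to load it from the environment instead. A minimal sketch, assuming you have exported a TOGETHER_API_KEY variable in your shell:

import os

from together import Together

# Read the API key from the TOGETHER_API_KEY environment variable instead of hardcoding it
client = Together(api_key=os.environ["TOGETHER_API_KEY"])

The rest of the script stays the same; only the client construction changes.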
  4. Run your Arcee AI-powered Together.ai completion

python arceeai_together.py
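
If you would rather see tokens as they are generated instead of waiting for the full reply, the Together SDK also supports streaming responses. A minimal sketch, assuming the same model as above and that TOGETHER_API_KEY is set in your environment (the exact chunk fields can vary between SDK versions):

from together import Together

client = Together()  # picks up TOGETHER_API_KEY from the environment by default

# Request a streamed completion and print tokens as they arrive
stream = client.chat.completions.create(
    model="arcee-ai/AFM-4.5B",
    messages=[
        {
            "role": "user",
            "content": "Summarize what makes small language models useful."
        }
    ],
    stream=True,
)

for chunk in stream:
    # Each chunk carries a delta with the next piece of generated text
    delta = chunk.choices[0].delta
    if delta and delta.content:
        print(delta.content, end="", flush=True)
print()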
