
Direct Model Invocation


In addition to using auto mode to route between models, you can directly invoke Arcee Small Language Models (SLMs) and third-party LLMs (Claude, GPT, etc.) through the Conductor API. This makes it easy to use a wide variety of models by changing only a single parameter in the API request.

If you're on the free plan and don't have a valid payment method on file, you will only be able to directly invoke Arcee SLMs. To upgrade your plan, reach out to conductor@arcee.ai.

The models that can be directly invoked are:

| Model | API Name | Description |
| --- | --- | --- |
| Virtuoso Small | `virtuoso-small` | A 14B-parameter SLM from Arcee AI, distilled from DeepSeek-V3. Virtuoso Small is extremely performant and excels at simple tasks such as text generation and summarization. |
| Blitz | `blitz` | A 24B-parameter SLM from Arcee AI, distilled from DeepSeek-V3. Blitz offers blazing-fast response times and exceptionally low costs with strong general knowledge. Ideal for simple and creative tasks. |
| Virtuoso Medium | `virtuoso-medium` | A 32B-parameter SLM from Arcee AI, distilled from DeepSeek-V3, giving it an impressive knowledge distribution. |
| Virtuoso Large | `virtuoso-large` | Arcee AI's premier 72B-parameter SLM, which competes with leading LLMs on complex and analytical tasks. |
| Coder Large | `coder` | A 32B-parameter SLM from Arcee AI, fine-tuned to excel at coding tasks. |
| Caller Large | `caller-large` | A 32B-parameter SLM from Arcee AI, fine-tuned to excel at function calling. |
| GPT-4.1 | `gpt-4.1` | A closed-source LLM from OpenAI with impressive analytical and complex problem-solving capabilities. |
| Claude Sonnet 3.7 | `claude-3-7-sonnet-20250219` | A closed-source LLM from Anthropic with strong performance on coding and complex tasks. |
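For programmatic use, you may want to validate a model name before sending a request. A minimal sketch: the `DIRECT_MODELS` lookup and `is_direct_model` helper below are illustrative (not part of the Conductor API); the keys are the API names from the table above.

```python
# Illustrative helper (not part of the Conductor API): a lookup of the
# directly invocable model API names from the table above.
DIRECT_MODELS = {
    "virtuoso-small": "14B Arcee SLM, distilled from DeepSeek-V3",
    "blitz": "24B Arcee SLM, fast and low-cost",
    "virtuoso-medium": "32B Arcee SLM, distilled from DeepSeek-V3",
    "virtuoso-large": "72B Arcee SLM for complex, analytical tasks",
    "coder": "32B Arcee SLM fine-tuned for coding",
    "caller-large": "32B Arcee SLM fine-tuned for function calling",
    "gpt-4.1": "Closed-source LLM from OpenAI",
    "claude-3-7-sonnet-20250219": "Closed-source LLM from Anthropic",
}

def is_direct_model(name: str) -> bool:
    """Return True if `name` is one of the directly invocable API names."""
    return name in DIRECT_MODELS
```

Checking names client-side this way turns a typo into an immediate error rather than a failed API call.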

API Usage Example (cURL):

```shell
curl -X POST https://conductor.arcee.ai/v1/chat/completions \
  -H "Authorization: Bearer $ARCEE_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "stream": true,
        "model": "blitz",
        "messages": [
          {
            "role": "user",
            "content": "Your prompt here"
          }
        ]
      }'
```

The same request with the OpenAI Python SDK:

```python
# First, install the openai package
# pip install openai

# Be sure to set the following environment variables
# OPENAI_BASE_URL="https://conductor.arcee.ai/v1"
# OPENAI_API_KEY="$ARCEE_TOKEN"

from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
    model="blitz",
    messages=[{"role": "user", "content": "Your prompt here"}],
    temperature=0.4,
    stream=True,
)

# Print streamed tokens as they arrive
for chunk in stream:
    if len(chunk.choices) > 0 and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
```

To switch the model you're invoking, simply change the "model" value from "blitz" to any of the model API names in the table above.
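That one-parameter swap can be sketched as follows. The `build_chat_request` helper is hypothetical (not part of the Conductor API); the payload shape matches the cURL example above.

```python
# Hypothetical helper: build the JSON payload for a Conductor
# chat-completions request. Switching models is a single-argument change.
def build_chat_request(model: str, prompt: str, stream: bool = True) -> dict:
    return {
        "stream": stream,
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same request body, different model: only the "model" value changes.
blitz_req = build_chat_request("blitz", "Your prompt here")
coder_req = build_chat_request("coder", "Your prompt here")
```

Everything else in the request (messages, streaming flag, headers) stays the same across models, which is what makes side-by-side comparisons cheap.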
