# Direct Model Invocation

In addition to using `auto` to route between models, you can directly invoke Arcee Small Language Models (SLMs) and third-party LLMs (Claude, GPT, etc.) through the Conductor API. This makes it easy to switch between a wide variety of models by changing a single parameter in the API request.

{% hint style="info" %}
If you're on the free plan and don't have a valid payment method on file, you will only be able to directly invoke Arcee SLMs. To upgrade your plan, reach out to <conductor@arcee.ai>.
{% endhint %}

The models which can be directly invoked are:

<table><thead><tr><th width="119.7890625">Model</th><th width="159.9765625">API Name</th><th width="470.8046875">Description</th><th data-hidden></th></tr></thead><tbody><tr><td>Virtuoso Small</td><td>virtuoso-small</td><td>A 14B-parameter SLM from Arcee AI, distilled from DeepSeek-V3. Virtuoso Small is extremely performant and excels at simple tasks such as text generation and summarization.</td><td></td></tr><tr><td>Blitz</td><td>blitz</td><td>A 24B-parameter SLM from Arcee AI, distilled from DeepSeek-V3. Blitz offers blazing-fast response times and exceptionally low costs with strong general knowledge. Ideal for simple and creative tasks.</td><td></td></tr><tr><td>Virtuoso Medium</td><td>virtuoso-medium</td><td>A 32B-parameter SLM from Arcee AI, distilled from DeepSeek-V3, giving it an impressive breadth of knowledge.</td><td></td></tr><tr><td>Virtuoso Large</td><td>virtuoso-large</td><td>Arcee AI's premier 72B-parameter SLM, which competes with leading LLMs on complex and analytical tasks.</td><td></td></tr><tr><td>Coder Large</td><td>coder</td><td>A 32B-parameter SLM from Arcee AI, fine-tuned to excel at coding tasks.</td><td></td></tr><tr><td>Caller Large</td><td>caller-large</td><td>A 32B-parameter SLM from Arcee AI, fine-tuned to excel at function calling.</td><td></td></tr><tr><td>GPT-4.1</td><td>gpt-4.1</td><td>A closed-source LLM from OpenAI with impressive analytical and complex problem-solving capabilities.</td><td></td></tr><tr><td>Claude Sonnet 3.7</td><td>claude-3-7-sonnet-20250219</td><td>A closed-source LLM from Anthropic with strong performance on coding and complex tasks.</td><td></td></tr></tbody></table>

#### API Usage Example:

{% tabs %}
{% tab title="CURL" %}

```bash
curl -X POST https://conductor.arcee.ai/v1/chat/completions \
  -H "Authorization: Bearer $ARCEE_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "stream": true,
        "model": "blitz",
        "messages": [
          {
            "role": "user",
            "content": "Your prompt here"
          }
        ]
      }'
```

{% endtab %}

{% tab title="Python" %}

```python
# First, install the openai package
# pip install openai

# Be sure to set the following environment variables
# OPENAI_BASE_URL="https://conductor.arcee.ai/v1"
# OPENAI_API_KEY="$ARCEE_TOKEN"

from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
  model='blitz',
  messages=[{'role': 'user', 'content': 'Your prompt here'}],
  temperature=0.4,
  stream=True
)

for chunk in stream:
    if len(chunk.choices) > 0 and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
```

{% endtab %}
{% endtabs %}

{% hint style="info" %}
To switch the model you're invoking, simply change the `model` value from `blitz` to any of the model API names in the table above.
{% endhint %}
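
As a sketch of how little changes between models, the helper below builds the same chat-completions request body for different API names from the table. The helper name and its defaults are illustrative, not part of the Conductor API; only the `model` field varies between requests.

```python
# Illustrative helper (not part of the Conductor API): builds the JSON body
# sent to /v1/chat/completions, so only the "model" field varies per model.
def build_chat_request(model: str, prompt: str, stream: bool = True) -> dict:
    return {
        "stream": stream,
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The same prompt routed to two different models from the table:
for name in ("blitz", "coder"):
    payload = build_chat_request(name, "Your prompt here")
    print(payload["model"], len(payload["messages"]))
```

Passing a payload like this to the endpoint shown in the CURL tab (or setting `model` in the Python client call) is all that is needed to target a different model.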
