
Auto Tools Mode

Tool calling is one of the most important building blocks of a successful agent system. Arcee Conductor makes your workflows cheaper by selecting the appropriate tool calling model for you based on the complexity of your input query.

Auto Tool mode provides a custom router configuration for models with function calling capabilities.

auto-tool will take in your prompt and route it to the most appropriate function calling model based on complexity, task type, domain, and language.

For details on the router, see .
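
For instance, here is a minimal sketch of calling the router directly. It reuses the Conductor endpoint and the auto-tool model name from the full example further down this page; the convert_currency tool is a hypothetical schema for illustration, and whether Conductor reports the routed model in the response's model field is an assumption.

import os
from openai import OpenAI

# Conductor exposes an OpenAI-compatible endpoint; "auto-tool" is the routed model name.
client = OpenAI(
    base_url="https://conductor.arcee.ai/v1",
    api_key=os.getenv("ARCEE_KEY"),
)

# Illustrative tool schema (hypothetical currency-conversion function).
tools = [{
    "type": "function",
    "function": {
        "name": "convert_currency",
        "description": "Convert an amount from one currency to another.",
        "parameters": {
            "type": "object",
            "properties": {
                "amount": {"type": "number"},
                "from_currency": {"type": "string"},
                "to_currency": {"type": "string"},
            },
            "required": ["amount", "from_currency", "to_currency"],
        },
    },
}]

response = client.chat.completions.create(
    model="auto-tool",
    messages=[{"role": "user", "content": "How much is 250 USD in EUR?"}],
    tools=tools,
)

# Which underlying model the router picked (assumes Conductor populates the
# standard "model" field), plus the tool call it produced.
print(response.model)
print(response.choices[0].message.tool_calls)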

Models

Based on the classification from the model router, the request is routed to one of the language models behind Arcee Conductor auto-tool. The models that requests can currently be routed to include:

Arcee Caller Large: Arcee AI's 32B parameter function calling SLM optimized for managing complex tool-based interactions and API function calls.

GPT-4.1: A closed-source LLM from OpenAI with function calling ability and impressive analytical and complex problem-solving capabilities.

Claude Sonnet 3.7: A closed-source LLM from Anthropic with function calling ability and strong performance on coding and complex tasks.

Function Calling Example

This example first uses Arcee Conductor auto-tool to automatically select the best function calling model for the defined functions and the user prompt. It then uses Arcee Conductor auto to select the best general-purpose model, which answers the user's question using the function call output together with the original prompt.

import json
import os
import requests
from openai import OpenAI

endpoint = "https://conductor.arcee.ai/v1"
api_key = os.getenv("ARCEE_KEY")

client = OpenAI(
    base_url=endpoint,
    api_key=api_key,
)

# Helper that calls the Open-Meteo forecast API and returns the current temperature.
def get_weather(latitude, longitude):
    response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m,wind_speed_10m&hourly=temperature_2m,relative_humidity_2m,wind_speed_10m")
    data = response.json()
    return data['current']['temperature_2m']

# Tool schema in the OpenAI chat completions format (the function definition
# is nested under a "function" key).
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for provided coordinates in celsius.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number"},
                "longitude": {"type": "number"}
            },
            "required": ["latitude", "longitude"],
            "additionalProperties": False
        },
        "strict": True
    }
}]

user_prompt = "What's the weather like in Paris today?"

# Step 1: auto-tool selects a function calling model and returns a tool call.
tool_response = client.chat.completions.create(
    model="auto-tool",
    messages=[{"role": "user", "content": user_prompt}],
    tools=tools,
    tool_choice="auto",
    max_tokens=128,
)

# Step 2: execute the requested tool with the arguments chosen by the model.
tool_call = tool_response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)
tool_result = get_weather(args["latitude"], args["longitude"])

tool_result = f"The current temperature is {tool_result}°C."

# Build the follow-up prompt that passes the tool result to the answering model.
messages = [
    {
        "role": "system",
        "content": "You are a helpful and knowledgeable assistant giving sharp answers. Use a business-oriented tone."
    },
    {
        "role": "user",
        "content": f"""Answer the following question: {user_prompt} using the tool result: {tool_result}.
        If the tool result is empty or not useful, say it is not useful and answer the question without using the information.
        If the tool result is useful, you can complement it with your own knowledge as long as it's not contradictory.
        """
    }
]

# Step 3: auto selects a general-purpose model to compose the final answer.
answer_response = client.chat.completions.create(
    model="auto",
    messages=messages,
)
print(answer_response.choices[0].message.content)
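
The second call above folds the tool result into a fresh user prompt. An alternative sketch, assuming Conductor's auto route follows the standard OpenAI chat-completions tool-message convention, is to pass the assistant's tool call back together with a tool role message:

# Alternative follow-up: return the tool output via a "tool" role message
# instead of interpolating it into the prompt (assumes standard OpenAI-style
# tool message handling on the auto route).
followup_messages = [
    {"role": "user", "content": user_prompt},
    tool_response.choices[0].message,  # assistant message carrying the tool call
    {
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": tool_result,
    },
]

followup = client.chat.completions.create(
    model="auto",
    messages=followup_messages,
)
print(followup.choices[0].message.content)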
