Introduction to Arcee Conductor

Try Arcee Conductor today! Sign up at https://conductor.arcee.ai/ to receive $20 in credits and see how you can save up to 64% on your model costs.

Arcee Conductor is an intelligent routing and inference platform designed to optimize the use of language models by automatically routing prompts to the most appropriate and cost-effective option available. It intelligently selects from a diverse range of Arcee AI small language models (SLMs) and closed-source large language models (LLMs) from various providers, ensuring optimal performance and significant cost savings.

Try an interactive demo here!

The Challenge of Model Selection:

  • Organizations often face the dilemma of choosing between cost-efficient smaller models that might lack advanced capabilities and powerful but expensive larger models.

  • Using only large LLMs for all tasks can lead to significant overspending, especially for simpler queries.

  • Conversely, relying solely on smaller models might compromise the quality of answers for complex prompts.

Introducing Arcee Conductor as the Solution:

  • Arcee Conductor addresses this challenge by automating the model selection process on a per-prompt basis.

  • For each prompt, it aims to select the model that delivers the best quality at the lowest cost and latency.

  • This approach eliminates manual decision-making and optimizes both performance and cost; the sketch below shows what this can look like from an application's point of view.
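
As an illustrative sketch only (not taken from this page), the snippet below shows how an application might send prompts through Conductor, assuming it exposes an OpenAI-compatible chat completions endpoint with an automatic routing model. The base URL, the model identifier ("auto"), and the environment variable name are assumptions; see the API section for the exact values.

```python
# Illustrative sketch only: the base URL, model name ("auto"), and the
# CONDUCTOR_API_KEY environment variable are assumptions, not confirmed values.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://conductor.arcee.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["CONDUCTOR_API_KEY"],   # hypothetical credential variable
)

# The application sends every prompt to a single routed model; Conductor decides
# per prompt whether a small language model or a larger LLM should answer it.
response = client.chat.completions.create(
    model="auto",  # assumed identifier for automatic routing
    messages=[{"role": "user", "content": "Draft a two-sentence status update for the team."}],
)
print(response.choices[0].message.content)
```

Because routing happens per prompt, the same client code serves both simple queries (which can be routed to a cheaper small language model) and complex ones (which can be routed to a larger model).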

Key Benefits at a Glance:

  • Automatic Model Selection.

  • Significant Cost Savings.

  • Improved Latency by utilizing the most efficient model.

  • Access to a Wide Variety of Models.

  • Enhanced Performance by matching prompt complexity with model capabilities.
