Legacy Arcee Models

Before training our own models from scratch, Arcee AI was founded as a model post-training company. During this time, we focused on fine-tuning task-specific models, taking open-source models and enhancing them with training techniques pioneered by our research team.

This page serves as a repository of our previously released, and now retired, fine-tuned models.

General Purpose

Reasoning

Coding

Function Calling

Arcee Blitz

  • Description: Arcee-Blitz (24B) is a Mistral-based model distilled from DeepSeek, designed to be both fast and efficient. We view it as a practical “workhorse” model that can tackle a range of tasks without the overhead of larger architectures.

    • #Parameters: 24B

    • Base Model: Mistral-Small-24B-Instruct-2501

    • Open-source and available on Hugging Face under the Apache-2.0 license: arcee-ai/Arcee-Blitz (see the loading sketch after this list)

  • Top Use Cases:

    • General-purpose task handling

    • Business communication

    • Automated document processing for mid-scale applications
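Since the weights are published on Hugging Face, the model can be loaded like any other causal language model. The following is a minimal sketch using the standard `transformers` text-generation API; it assumes `transformers` and `torch` are installed and that your hardware can accommodate a 24B-parameter checkpoint.

```python
# Minimal sketch: loading Arcee-Blitz from Hugging Face with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Arcee-Blitz"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",    # spread layers across available devices
    torch_dtype="auto",   # use the checkpoint's native precision
)

# Mistral-style instruct models expect a chat template.
messages = [
    {"role": "user", "content": "Summarize this quarterly sales update in two sentences."}
]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```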
