Arcee Small Language Models
Arcee AI started as a model training company, assembling a world-class team of researchers who pioneered model post-training techniques and open source libraries for efficient model training, Model Merging, and Model Distillation. We have applied these research advances to leading open source models to create state-of-the-art small language models (SLMs).
Arcee defines an SLM as a model that can run efficiently on a single GPU instance, allowing you to deploy it in your own environment. Our models range from 150M to 72B parameters and are optimized for cost efficiency and performance while maintaining high accuracy.
Arcee SLMs are fine-tuned to be task-specific, offering general-purpose, reasoning, coding, function-calling, and vision models. Below is a high-level overview of our models:
General Purpose
Arcee Blitz
Virtuoso Small
Virtuoso Medium
Virtuoso Large
Reasoning
Maestro
Coding
Coder Large
Function Calling
Caller Large
See the documentation for each model for more detailed information.
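Since these models are meant to run in your own environment, a deployed SLM is typically reached over an HTTP inference endpoint. The sketch below shows how such a request could be assembled, assuming an OpenAI-compatible chat-completions endpoint; the endpoint URL and the model ID `virtuoso-small` are illustrative placeholders, not Arcee's documented API.

```python
import json

# Hypothetical endpoint for a self-hosted SLM; assumes an
# OpenAI-compatible /v1/chat/completions route (an assumption,
# not Arcee's documented interface).
ENDPOINT = "http://localhost:8000/v1/chat/completions"


def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completions request payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


# Placeholder model ID for illustration only.
payload = build_chat_request("virtuoso-small", "Summarize model merging in one sentence.")
body = json.dumps(payload)
# To send: POST `body` to ENDPOINT with Content-Type: application/json,
# using urllib.request or any HTTP client.
```

Because the request shape follows the widely adopted chat-completions convention, the same payload works with most self-hosted serving stacks that expose an OpenAI-compatible API.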