
Many in business think the winners in the AI model market have already been chosen: Big Tech (Google, Meta, Microsoft, and to a lesser extent Amazon), along with a few chosen labs, especially OpenAI and Anthropic.
But tiny, 30-person model developer Arcee AI disagrees. The company has just released a general-purpose, open-source (Apache-licensed) base model called Trinity, and Arcee says that at 400B parameters, it is one of the largest base models ever developed and released by a US company.
Arcee says Trinity compares favorably to Meta’s Llama 4 Maverick 400B and to Z.ai’s GLM-4.5, an open-source model from China’s Tsinghua University, according to benchmark tests run on the base (lightly trained) versions.

Like other state-of-the-art (SOTA) models, Trinity is designed to handle a wide range of documents and tasks as an assistant. Despite its size, however, it is not yet a true SOTA competitor, as it currently supports only text.
More models are in the works: a vision model is in development, and voice-to-voice is on the way, CTO Lucas Atkins (pictured above, left) told TechCrunch. By comparison, Meta’s Llama 4 Maverick is already multimodal, supporting both text and images.
But before adding more modalities to its lineup, Arcee says, it wanted a base LLM that would appeal to its target customers: developers and enterprises. The startup wants to persuade US companies of all sizes not to choose China’s open models.
“At the end of the day, the winners in this game, and the only way to win at the application layer, is to have the best open-source model,” Atkins said. “To win the hearts and minds of developers, you have to give them the best.”
Benchmarks suggest that the Trinity base model, released while further training is still underway, holds its own and, in some cases, slightly edges out Llama in tests of writing, math, knowledge, and reasoning.
The progress Arcee has made so far as an AI-lab competitor is impressive. The large Trinity model follows two smaller predecessors released in December: the 26B-parameter Trinity Mini, a post-trained reasoning model for applications ranging from web apps to agents, and the 6B-parameter Trinity Nano, an experimental model designed to push the limits of small but still conversational models.
The kicker is that Arcee trained them all in six months for $20 million, using 2,048 Nvidia Blackwell B300 GPUs. That came out of the roughly $50 million the company has raised to date, said founder and CEO Mark McQuade (pictured above, right).
That kind of money was “a lot for us,” said Atkins, who led the training effort. Still, he acknowledged, it is small compared to what the big labs are currently spending.
The six months of work “took a lot out of us,” said Atkins, whose pre-LLM career was in automotive audio engineering. “We are a very hungry young startup. We have a lot of talented, smart young researchers who, given the opportunity to spend that much money and train a model of this size, we believed could do something. And they did, through sleepless nights and long hours.”
McQuade, formerly of Hugging Face’s business side, says Arcee didn’t set out to become the newest US AI lab: the company initially customized models for big customers like SK Telecom.
“We were just doing post-training. So we’d take others’ base work: we’d take a Llama model, we’d take a Mistral model, we’d take a Qwen model that was open source, and we’d train it to be better” for the customer’s use, he said, including doing reinforcement training.
But as its customer list grew, Atkins said, owning its own base model became increasingly important, and McQuade worried about relying on other companies. At the same time, the best open-source models were coming from China, which US businesses were wary of using, or barred from using.
It was a daunting prospect. “I think there are fewer than 20 companies in the world that have ever trained and released a model” at the size and scale Arcee attempted, McQuade said.
The company started small, first trying its hand at a 4.5B-parameter model developed in partnership with training-data company DatologyAI. That project’s success inspired the bigger effort.
But if the US already has Llama, why does it need another open-source model? Atkins says that by choosing the Apache open-source license, the startup is committing to always keeping its models open source. That follows Meta CEO Mark Zuckerberg indicating last year that his company may not always make all of its most advanced models open source.
Llama can only loosely be considered open source because it uses a Meta-controlled license with commercial and usage caveats, he says. That has led some in the open-source community to argue that Llama is not open source at all.
“Arcee exists because the US needs an open, Apache-licensed frontier model that can compete at today’s frontier,” McQuade said.
All of the Trinity models, large and small, are free to download. The largest comes in three flavors. Trinity Large Preview is an instruction-tuned model, meaning it is trained to follow human instructions rather than just predict the next word, which makes it suitable for chat. Trinity Large Base is the base model without that further training.
Then there’s TrueBase, a model with no instruction data or post-training at all, so businesses or researchers who want to customize it don’t have to untrain existing data, rules, or behaviors.
Arcee AI says it will eventually offer a hosted version of the model at competitive API prices. That release is about six weeks away, as the startup continues further training of the model.
API prices for Trinity Mini are $0.045/$0.15, and a free tier is also available. In the meantime, the company continues to sell training and customization services.
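The article doesn’t state the units for those prices, but LLM APIs typically quote dollars per million input and output tokens. Under that assumption (which is ours, not Arcee’s), a request’s cost could be estimated with a sketch like this:

```python
def trinity_mini_cost(input_tokens: int, output_tokens: int,
                      in_price: float = 0.045, out_price: float = 0.15) -> float:
    """Estimate the dollar cost of one API call, assuming prices
    of $0.045 / $0.15 per million input / output tokens."""
    return (input_tokens / 1_000_000) * in_price + (output_tokens / 1_000_000) * out_price

# Example: a chat turn with 2,000 input tokens and 500 output tokens
cost = trinity_mini_cost(2_000, 500)
print(f"${cost:.6f}")  # a small fraction of a cent
```

At these rates, even a million-token workload would cost well under a dollar, which is in line with the budget tier of hosted model APIs.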