  1. Arcee's new, open source Trinity-Large-Thinking is the rare ...

    2 days ago · One answer: Arcee, a San Francisco-based lab, which this week released Trinity-Large-Thinking, a 399-billion-parameter text-only reasoning model released under the uncompromisingly …

  2. Arcee AI | Trinity-Large-Thinking: Scaling an Open Source ...

    5 days ago · On many axes, it is the strongest open model ever released outside of China. It is the result of the last two months spent improving and scaling our SFT and RL pipeline so it could meet the size …

  3. Arcee AI Releases Trinity Large Thinking: An Apache 2.0 Open ...

    4 days ago · Technically, it is a sparse Mixture-of-Experts (MoE) model with 400 billion total parameters. However, its architecture is designed for inference efficiency; it activates only 13 billion parameters …

  4. Arcee AI releases Trinity-Large-Thinking, a 399B-parameter ...

    3 days ago · Arcee AI releases Trinity-Large-Thinking, a 399B-parameter text-only reasoning model under an Apache 2.0 license, allowing full customization and commercial use — The baton of open …

  5. Arcee AI releases Trinity-Large-Thinking, a 399B-parameter ...

    3 days ago · Arcee AI releases Trinity-Large-Thinking, a 399B-parameter MoE AI model under an Apache 2.0 license, allowing full customization and commercial use — The baton of open source AI …

  6. Arcee aims to reboot U.S. open source AI with new Trinity ...

    Dec 2, 2025 · Today, Arcee AI announced the release of Trinity Mini and Trinity Nano Preview, the first two models in its new “Trinity” family—an open-weight MoE model suite fully trained in the United …

  7. Trinity-Large-Thinking | Arcee AI Documentation

    5 days ago · Trinity-Large-Thinking is a reasoning-optimized variant of Arcee AI's Trinity-Large family — a 398B-parameter sparse Mixture-of-Experts (MoE) model with approximately 13B active parameters …
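Several of these results, including the documentation entry above, describe Trinity-Large-Thinking as a sparse Mixture-of-Experts model with roughly 400B total parameters but only about 13B active per token. The sketch below illustrates, in generic terms, how top-k routing in a sparse MoE layer touches only a small fraction of the layer's parameters for each token; it is a minimal illustration assuming PyTorch, with placeholder dimensions and expert counts, and not a description of Arcee's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Generic top-k sparse MoE layer: each token is routed to top_k of n_experts.
    Sizes below are placeholders for readability, not Trinity's real configuration."""

    def __init__(self, d_model=1024, d_ff=4096, n_experts=64, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.GELU(),
                nn.Linear(d_ff, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x):
        # x: (num_tokens, d_model)
        scores = self.router(x)                       # (num_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)          # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so the parameters
        # actually exercised per token are a small fraction of the layer total.
        for slot in range(self.top_k):
            for e in idx[:, slot].unique():
                mask = idx[:, slot] == e
                out[mask] += weights[mask, slot:slot + 1] * self.experts[int(e)](x[mask])
        return out


if __name__ == "__main__":
    layer = SparseMoELayer()
    tokens = torch.randn(8, 1024)
    print(layer(tokens).shape)  # torch.Size([8, 1024])
```

With, say, 64 experts and top-2 routing, each token passes through only 2 of the 64 expert feed-forward blocks; this is the general mechanism behind "hundreds of billions of total parameters, ~13B active" figures like those quoted above.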
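Since the results note the model is released under Apache 2.0 with open weights, a conventional way to experiment with such a checkpoint is through Hugging Face transformers. The snippet below is a hypothetical usage sketch: the repository ID is a placeholder assumption rather than a model ID confirmed by any of the results above, and serving a ~400B-parameter MoE would in practice require multi-GPU or quantized setups.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository ID -- an assumption for illustration, not a confirmed
# model ID from the articles above.
model_id = "arcee-ai/Trinity-Large-Thinking"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard across available GPUs (requires `accelerate`)
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

prompt = "Explain briefly why sparse MoE models are cheaper to serve than dense models of the same size."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```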