12.16.2025

Nvidia debuts Nemotron 3 with hybrid MoE and Mamba-Transformer to drive efficient agentic AI

To build the Nemotron 3 models, Nvidia leaned into a hybrid Mamba-Transformer architecture paired with a mixture-of-experts (MoE) design to improve scalability and efficiency. In a press release, the company said this architecture also gives enterprises greater openness and stronger performance when building multi-agent autonomous systems.
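The release doesn't spell out the architecture's internals, but a minimal sketch can illustrate the two ideas it combines: interleaving linear-time state-space (Mamba-style) layers with occasional attention layers, and replacing dense feed-forward layers with sparsely routed experts so capacity grows without growing per-token compute. The toy PyTorch code below is an assumption-laden illustration, not Nemotron 3's implementation; the simplified state-space mixer, the one-attention-layer-in-four ratio, and all dimensions are hypothetical.

```python
# Toy sketch of a hybrid Mamba-Transformer stack with MoE feed-forward
# layers. NOT Nvidia's Nemotron 3 code; every design choice here is an
# illustrative assumption.
import torch
import torch.nn as nn


class SimpleSSMMixer(nn.Module):
    """Stand-in for a Mamba-style state-space layer: a gated linear
    recurrence over the sequence, O(n) in length vs attention's O(n^2)."""
    def __init__(self, dim: int):
        super().__init__()
        self.in_proj = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        self.decay = nn.Parameter(torch.full((dim,), 0.9))  # per-channel decay
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x):                        # x: (batch, seq, dim)
        u = self.in_proj(x)
        state = torch.zeros_like(u[:, 0])
        outs = []
        for t in range(u.size(1)):               # linear scan over time steps
            state = self.decay * state + u[:, t]
            outs.append(state)
        h = torch.stack(outs, dim=1)
        return self.out_proj(h * torch.sigmoid(self.gate(x)))


class MoEFeedForward(nn.Module):
    """Top-k routed MoE MLP: each token activates only k of the experts,
    so parameter count scales up while per-token compute stays flat."""
    def __init__(self, dim: int, num_experts: int = 4, k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.router = nn.Linear(dim, num_experts)
        self.k = k

    def forward(self, x):                        # x: (batch, seq, dim)
        weights, idx = self.router(x).topk(self.k, dim=-1)
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = (idx[..., slot] == e).unsqueeze(-1)  # tokens routed to expert e
                out = out + mask * weights[..., slot:slot + 1] * expert(x)
        return out


class HybridBlock(nn.Module):
    """One block: an SSM or attention sequence mixer, then an MoE MLP."""
    def __init__(self, dim: int, use_attention: bool):
        super().__init__()
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.use_attention = use_attention
        self.mixer = (nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
                      if use_attention else SimpleSSMMixer(dim))
        self.moe = MoEFeedForward(dim)

    def forward(self, x):
        h = self.norm1(x)
        if self.use_attention:
            h, _ = self.mixer(h, h, h, need_weights=False)
        else:
            h = self.mixer(h)
        x = x + h                                # residual around the mixer
        return x + self.moe(self.norm2(x))       # residual around the MoE MLP


# Mostly-SSM stack with an attention layer every fourth block (assumed ratio).
model = nn.Sequential(*[HybridBlock(dim=64, use_attention=(i % 4 == 3))
                        for i in range(8)])
tokens = torch.randn(2, 16, 64)                  # (batch, seq, dim)
print(model(tokens).shape)                       # torch.Size([2, 16, 64])
```

The efficiency claim falls out of the structure: the SSM layers keep sequence mixing linear in context length, while the router ensures only a fraction of the MoE parameters fire per token.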