Mixture-of-Experts AI News & Updates

Reflection AI Raises $2B to Build Open-Source Frontier Models as U.S. Answer to DeepSeek

Reflection, founded by former Google DeepMind researchers, raised $2 billion at an $8 billion valuation to build open-source frontier AI models as an American alternative to Chinese labs like DeepSeek. The startup, backed by major investors including Nvidia and Sequoia, plans to release a frontier language model next year, trained on tens of trillions of tokens using a mixture-of-experts (MoE) architecture. The company aims to serve enterprises and governments seeking sovereign AI solutions, releasing model weights publicly while keeping its training infrastructure proprietary.

DeepSeek Updates Prover V2 for Advanced Mathematical Reasoning

Chinese AI lab DeepSeek has released Prover V2, an upgraded version of its mathematics-focused AI model, built on its V3 base model, which has 671 billion parameters and uses a mixture-of-experts architecture. The company, which previously offered Prover for formal theorem proving and mathematical reasoning, is reportedly considering raising outside funding for the first time while continuing to update its model lineup.
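
For context, "formal theorem proving" here means producing proofs that a proof assistant can verify mechanically; DeepSeek's Prover models target the Lean language. Below is a toy example of the kind of goal such a model is asked to close. The statement and proof are illustrative only, not taken from DeepSeek's benchmarks:

```lean
-- A toy Lean 4 goal of the kind a prover model is asked to close:
-- the model emits the proof script, and Lean's kernel checks it.
theorem add_comm_example (a b : Nat) : a + b = b + a := by
  exact Nat.add_comm a b
```

The appeal of this setting for AI is that correctness is machine-checkable: a generated proof either passes the Lean kernel or it does not, with no human grading required.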

Alibaba Launches Qwen3 Models with Advanced Reasoning Capabilities

Alibaba has released Qwen3, a family of AI models ranging from 0.6 billion to 235 billion parameters, claiming performance competitive with top models from Google and OpenAI. The models feature hybrid reasoning, letting them switch between quick answers and longer step-by-step "thinking", support 119 languages, and in the largest variants use a mixture-of-experts (MoE) architecture for computational efficiency.
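
The efficiency claim comes from sparse activation: an MoE layer holds many expert feed-forward blocks but routes each token through only a few of them, so per-token compute tracks the handful of active experts rather than the total parameter count. A minimal sketch of top-k routing follows; this is toy NumPy code under assumed shapes and names, not Alibaba's implementation:

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, k=2):
    """Toy top-k MoE layer: each token is routed to k of n experts.

    x         : (d,) activation for one token
    gate_w    : (n_experts, d) router weights
    expert_ws : list of n_experts (d, d) expert weight matrices
    Only the k selected experts run, so per-token compute scales
    with k rather than with the total number of parameters.
    """
    logits = gate_w @ x                        # one routing score per expert
    top = np.argsort(logits)[-k:]              # indices of the k best experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                   # softmax over selected experts only
    return sum(w * (expert_ws[i] @ x) for w, i in zip(weights, top))

# Illustrative usage with random weights.
rng = np.random.default_rng(0)
d, n_experts = 16, 8
x = rng.standard_normal(d)
gate_w = rng.standard_normal((n_experts, d))
expert_ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
print(moe_forward(x, gate_w, expert_ws).shape)  # (16,)
```

With k = 2 of 8 experts active, roughly a quarter of the expert parameters participate in any one token's forward pass, which is the mechanism behind headline figures like "235 billion parameters, far fewer active per token."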

Meta Launches Advanced Llama 4 AI Models with Multimodal Capabilities and Trillion-Parameter Variant

Meta has released its new Llama 4 family of AI models, including Scout, Maverick, and the unreleased trillion-parameter-scale Behemoth, featuring multimodal capabilities and a more efficient mixture-of-experts architecture. The models bring improvements in reasoning, coding, and document processing with expanded context windows, and Meta has also tuned them to refuse fewer controversial questions and to be more politically balanced.