April 24, 2026 News
Google Commits Up to $40B to Anthropic Amid Escalating AI Compute Race
Google plans to invest up to $40 billion in Anthropic, with $10 billion committed immediately at a $350 billion valuation and $30 billion contingent on performance targets. The investment includes providing 5 gigawatts of computing capacity over five years, following Anthropic's release of its most powerful model, Mythos, which has significant cybersecurity applications but restricted access due to misuse concerns. The deal is part of an intensifying competition for AI compute resources, with Anthropic securing multiple infrastructure partnerships, including Amazon commitments totaling up to $100 billion in compute capacity.
Skynet Chance (+0.04%): The release of Mythos, with significant cybersecurity applications and acknowledged misuse potential serious enough to warrant access restrictions, suggests advancement in dual-use AI capabilities. The massive compute investments ($40B from Google, $100B total with Amazon) enable scaling of potentially dangerous models faster than safety mechanisms can be developed.
Skynet Date (-1 days): The unprecedented scale of compute commitments (over 8.5 gigawatts combined from Google and Amazon deals) dramatically accelerates the timeline for training and deploying frontier models. This infrastructure race suggests dangerous capabilities could emerge sooner than previously anticipated, as compute bottlenecks are rapidly being removed.
AGI Progress (+0.03%): Mythos represents Anthropic's most powerful model to date, indicating continued scaling success in AI capabilities. The massive compute investments ($40B from Google alone) signal confidence that scaling laws continue to yield improvements, providing infrastructure to pursue AGI-level capabilities.
AGI Date (-1 days): The combination of 8.5+ gigawatts of secured compute capacity and multi-year commitments removes major infrastructure constraints that previously limited AGI research timelines. These deals suggest leading AI labs expect to need—and can now access—the computational resources for AGI-scale training runs within the next 3-5 years.
DeepSeek Releases V4 Models With 1.6 Trillion Parameters, Approaching Frontier Performance at Lower Cost
Chinese AI lab DeepSeek has released preview versions of its V4 large language models, including V4 Pro with 1.6 trillion parameters, making it the largest open-weight model available. The models reportedly close the gap with leading frontier models on reasoning benchmarks while offering significantly lower pricing, though they trail state-of-the-art models by approximately 3-6 months in knowledge tests. The release comes amid U.S. accusations that China is stealing American AI intellectual property through proxy accounts.
Skynet Chance (+0.04%): The release of increasingly capable open-weight models with competitive performance reduces barriers to accessing advanced AI capabilities, potentially enabling more actors (including malicious ones) to deploy powerful AI systems without robust safety controls. The geopolitical tensions and accusations of IP theft suggest a competitive race that may prioritize capability advancement over safety alignment.
Skynet Date (-1 days): The rapid development cycle (closing a 3-6 month gap with frontier models) and significantly lower costs accelerate the diffusion of near-frontier AI capabilities globally. This democratization of powerful AI, while beneficial in some ways, speeds up the timeline for potential misuse or loss-of-control scenarios by expanding the number of entities with access to advanced models.
AGI Progress (+0.04%): The architectural improvements enabling a 1.6 trillion parameter model with efficient mixture-of-experts design and 1 million token context windows represent significant technical progress in scaling AI systems. Performance approaching frontier models on reasoning tasks and coding benchmarks demonstrates continued advancement toward more general capabilities, even if knowledge retention lags slightly.
AGI Date (-1 days): The accelerated pace of competitive releases, with open-weight models rapidly closing the gap to frontier systems within months rather than years, indicates faster overall progress toward AGI. The combination of massive scale, improved efficiency, and dramatically lower costs ($0.14 vs. much higher frontier pricing) suggests the field is advancing more quickly than previously expected, potentially shortening AGI timelines.
Meta Commits to Millions of Amazon's Graviton AI CPUs in Major Cloud Deal
Meta has signed a deal with AWS to use millions of Amazon's homegrown Graviton ARM-based CPUs for AI workloads, particularly for inference and AI agent tasks. This marks a shift from GPU-dominated training workloads to CPU-intensive inference needs driven by AI agents performing real-time reasoning and multi-step coordination. The deal redirects Meta's spending back to AWS from competitors like Google Cloud, while showcasing Amazon's custom chip strategy against Nvidia's competing ARM-based AI CPUs.
Skynet Chance (+0.01%): The deal accelerates deployment of AI agents at scale through specialized infrastructure, enabling more autonomous AI systems to handle complex multi-step tasks. However, these are CPU-based inference systems rather than fundamental capability breakthroughs, representing incremental scaling rather than new architectural risk.
Skynet Date (+0 days): The availability of millions of specialized CPUs for AI inference removes infrastructure bottlenecks for deploying AI agents at scale, but this represents optimization of existing capabilities rather than fundamental acceleration, leaving the overall timeline essentially unchanged.
AGI Progress (+0.01%): The shift toward specialized infrastructure for AI agents performing real-time reasoning and multi-step coordination demonstrates practical progress in making AI systems more autonomous and capable. The massive scale of deployment (millions of chips) indicates maturation of inference-stage AI capabilities beyond pure model training.
AGI Date (+0 days): Large-scale infrastructure investment specifically designed for AI agent workloads addresses a deployment bottleneck for more sophisticated AI systems, though without materially shifting the practical timeline toward AGI. The deal signals major tech companies are preparing infrastructure for next-generation autonomous AI at scale.