Qualcomm Enters AI Data Center Race with New AI200 and AI250 Chips

Qualcomm has officially stepped into the high-stakes world of artificial intelligence (AI) data centers, unveiling its new line of AI accelerator chips—the AI200 and AI250—and accompanying rack-scale servers. The move signals a major push by the company to challenge the dominance of Nvidia and AMD, whose chips currently power most AI-driven computing infrastructure worldwide. Following the announcement, Qualcomm’s shares surged by more than 15%, underscoring investor excitement as the company diversifies beyond its hallmark mobile and wireless semiconductor business.

From Mobile Processors to AI Powerhouses

Historically recognized for smartphone chipsets, Qualcomm is retooling its expertise for the data center era. Both the AI200, scheduled for release in 2026, and the AI250, expected in 2027, draw heavily from the company’s proprietary Hexagon neural processing unit (NPU) technology—a key component in its mobile processors. A third-generation chip is already in the pipeline for 2028, with Qualcomm planning to maintain an annual release cadence.

Each AI chip can be deployed as part of a full, liquid-cooled server rack or sold as standalone hardware. The company’s full rack systems can house multiple accelerators working together, comparable to Nvidia’s massive GPU clusters that power advanced AI models like OpenAI’s GPT. Qualcomm’s design allows for scalable deployment and flexibility; customers can purchase the full stack, select components, or integrate Qualcomm chips into their own custom-built systems.

Focus on Inference Efficiency and Cost Reduction

Qualcomm’s AI200 and AI250 are optimized for inference—the process of running AI models—rather than training them. This strategic focus caters to cloud providers seeking efficient, cost-effective hardware for real-world AI tasks. The company claims its new systems offer advantages in both power consumption and total cost of ownership, crucial factors for data centers grappling with spiraling energy costs.

The AI200 rack reportedly operates at around 160 kilowatts of power—similar to high-end Nvidia GPU setups—but promises better energy efficiency. The company also highlighted innovative memory handling as a differentiator, with its AI cards supporting up to 768 gigabytes of memory. The AI250 will raise the bar further, featuring ten times the memory bandwidth of the AI200 to meet the growing demand for faster AI model execution.

Durga Malladi, Qualcomm’s general manager for data center and edge solutions, noted that the company’s modular design approach means even competitors like Nvidia and AMD could become customers for specific components, such as Qualcomm CPUs. “We aim to put our customers in a position where they can take the whole system or mix and match as they please,” Malladi explained.

Competing in a Trillion-Dollar Market

The AI data center market represents one of the fastest-growing sectors in technology, with an estimated $6.7 trillion in global data center investments projected by 2030. Currently, Nvidia controls more than 90% of the AI semiconductor market, driven largely by the explosive demand for generative AI tools. AMD has emerged as Nvidia’s main challenger, recently striking deals with companies like OpenAI to supply GPUs.

Qualcomm’s entry adds a formidable new player to the mix. Its AI200 and AI250 servers aim to court hyperscalers such as Amazon, Google, and Microsoft—companies that are both major AI hardware consumers and developers of their own chips. The competition will be steep, but Qualcomm believes its track record in chip innovation and efficiency can carve out a profitable niche.

Strategic Shift and Future Outlook

This new venture also marks a strategic transformation for Qualcomm, as it looks to reduce dependence on its smartphone business, which accounted for roughly $6.3 billion of its $10.4 billion third-quarter revenue. Entering the AI infrastructure segment could open fresh revenue channels and reduce volatility tied to mobile markets.

Qualcomm’s re-entry into the data center space comes after an earlier, short-lived attempt in 2017 with its Centriq platform, which struggled against Intel and AMD at the time. The company has since refined its approach with products like the AI 100 Ultra card and partnerships such as its collaboration with Saudi Arabia’s Humain, which is set to deploy up to 200 megawatts of Qualcomm’s AI inferencing systems.

If Qualcomm delivers on its performance and efficiency claims, the AI200 and AI250 could help redefine the balance of power in an industry currently centered on Nvidia and AMD. For now, the world’s major cloud providers and AI developers will be watching closely as the company’s data center ambitions unfold.
