
OpenAI, Oracle to power Stargate site with Nvidia AI chips


OpenAI and Oracle are preparing to equip a massive new data center in Texas with tens of thousands of Nvidia AI chips in the coming months.

This facility is the first step in their $100 billion Stargate project, aimed at expanding AI infrastructure on a large scale, according to Bloomberg.

The data center in Abilene, Texas, is set to house 64,000 Nvidia GB200 chips by the end of 2026, according to a source familiar with the plans.

The rollout will happen in phases, starting with 16,000 chips expected to be deployed by this summer.

The planned shipments highlight the immense computing power dedicated to just the early phases of a single data center.

This also underscores the scale of the Stargate venture, announced by OpenAI, SoftBank, and Oracle at a White House event in January.

OpenAI has said that Stargate could eventually expand to as many as ten locations.

An OpenAI spokesperson confirmed that the company is collaborating with Oracle on the design and delivery of the Abilene data center, with Oracle responsible for acquiring and operating the supercomputer being built there.

Oracle did not respond to requests for comment, and Nvidia declined to comment.

Stargate is part of a growing competition among tech giants to secure Nvidia’s latest chips, essential for training and deploying generative AI models.

Recently, Elon Musk’s xAI signed a $5 billion deal with Dell Technologies for AI servers to power a supercomputer in Memphis.

Meta has said it planned to amass computing power equivalent to 600,000 Nvidia H100 chips by the end of 2024.

Meanwhile, AI cloud provider CoreWeave reported having over 250,000 Nvidia GPUs across 32 data centers, according to its recent public offering filings.

Nvidia has not disclosed the price of the GB200, but CEO Jensen Huang previously stated that the less powerful B200 chip costs between $30,000 and $40,000 each.