Google Cloud Next 2024 opened with a series of significant announcements, chief among them the unveiling of the company's new Axion processor.
Axion is Google's first Arm-based CPU designed specifically for data centers, built on Arm's Neoverse V2 architecture.
According to Google, Axion delivers 30% better performance than the fastest general-purpose Arm-based instances currently available in the cloud, and 50% better performance than comparable current-generation x86-based VMs.
Google also claims Axion is 60% more energy efficient than comparable x86-based VMs. The chip is already in use in services such as BigTable and Google Earth Engine, with plans to roll it out to additional services over time.
Axion puts Google in direct competition with Amazon, which has long led Arm-based CPU development for data centers. Amazon Web Services started the trend with the Graviton processor in 2018 and has since released second and third generations.
Other chipmakers have been making similar moves: NVIDIA announced its Arm-based Grace CPU for data centers in 2021, and companies such as Ampere have also gained ground in this space.
Axion is not Google's first custom processor, though the company's earlier efforts focused mainly on consumer products. The original Arm-based Tensor chip debuted in the Pixel 6 and 6 Pro smartphones in late 2021, and updated versions have powered subsequent Pixel phones.
Before that, Google developed the Tensor Processing Unit (TPU) for its data centers, deploying the chips internally as early as 2015 before announcing them publicly in 2016 and making them available to third parties in 2018.
Arm-based processors are typically prized for their lower cost and better energy efficiency. Google's announcement happened to coincide with a warning from Arm CEO Rene Haas about the energy consumption of AI models.
Haas pointed to the enormous electricity demands of models such as ChatGPT, arguing that greater efficiency is needed to sustain the current pace of AI advances.
He warned that by the end of the decade, AI data centers could consume as much as 20% to 25% of US power requirements, up from an estimated 4% or less today, a trajectory he called unsustainable.