Microsoft is preparing for the next wave of AI by building interconnected datacenter hubs designed to support extremely large models. The goal of this ambitious project is to train AI models with hundreds of trillions of parameters by linking far-flung facilities over dedicated high-bandwidth networks.

A milestone in this endeavor came in October, when Microsoft linked its datacenter campus in Mount Pleasant, Wisconsin, with a facility in Atlanta, Georgia. The end goal? To distribute AI workloads across multiple datacenters, using networking technology that today is typically reserved for high-performance computing.

These are not average datacenters. Dubbed ‘Fairwater’ clusters, the facilities use advanced liquid cooling designed to cut water and energy consumption. In Atlanta, Microsoft plans to deploy Nvidia’s latest rack-scale GPU systems, giving it a large pool of accelerators to serve a wide range of AI workloads.

Linking the sites lets Microsoft spread a single training job across facilities and choose locations based on cost, climate, and energy availability. Microsoft has not said exactly what hardware bridges the sites, though speculation points to high-capacity routers from Cisco or Nvidia’s long-haul networking gear.
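To make the idea of cross-datacenter training more concrete, here is a minimal, illustrative sketch of hierarchical gradient averaging in PyTorch: each site synchronizes over its fast local fabric every step, while the comparatively slow wide-area link is used less often. The two-site layout, process-group structure, and sync interval are hypothetical; Microsoft's actual software stack for the Wisconsin–Atlanta link has not been disclosed.

```python
import torch
import torch.distributed as dist


def build_groups(world_size: int, ranks_per_site: int):
    """One process group per site, plus a group containing each site's leader."""
    site_groups, leader_ranks = [], []
    for start in range(0, world_size, ranks_per_site):
        ranks = list(range(start, start + ranks_per_site))
        site_groups.append(dist.new_group(ranks=ranks))  # local, low-latency fabric
        leader_ranks.append(start)                       # first rank of each site
    leader_group = dist.new_group(ranks=leader_ranks)    # the wide-area hop
    return site_groups, leader_group


def sync_gradients(model, step, site_group, leader_group, site_leader_rank,
                   is_leader, num_sites, cross_site_every=8):
    """Average gradients within a site every step, but across sites only
    occasionally, since the long-haul link is the expensive part."""
    site_size = dist.get_world_size(group=site_group)
    for p in model.parameters():
        if p.grad is None:
            continue
        # Fast path: all-reduce over the local fabric.
        dist.all_reduce(p.grad, op=dist.ReduceOp.SUM, group=site_group)
        p.grad /= site_size
        if step % cross_site_every == 0:
            # Slow path: site leaders average over the wide-area link...
            if is_leader:
                dist.all_reduce(p.grad, op=dist.ReduceOp.SUM, group=leader_group)
                p.grad /= num_sites
            # ...then fan the result back out within each site.
            dist.broadcast(p.grad, src=site_leader_rank, group=site_group)


if __name__ == "__main__":
    # Launched with e.g. `torchrun --nproc-per-node=4 geo_train.py`;
    # splitting the ranks into two "sites" here is purely illustrative.
    dist.init_process_group(backend="gloo")
    world, rank = dist.get_world_size(), dist.get_rank()
    ranks_per_site = world // 2
    site_groups, leader_group = build_groups(world, ranks_per_site)
    site_id = rank // ranks_per_site
    site_group = site_groups[site_id]
    site_leader_rank = site_id * ranks_per_site
    model = torch.nn.Linear(8, 8)
    # ... inside the training loop, after loss.backward():
    # sync_gradients(model, step, site_group, leader_group, site_leader_rank,
    #                rank == site_leader_rank, num_sites=2)
```

The design choice the sketch illustrates is simply that traffic over the inter-site link is minimized relative to traffic inside each facility, which is why the bridging network's bandwidth and latency matter so much for this approach.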

The link underscores Microsoft’s commitment to seamless AI training across vast geographical distances, and marks a promising step toward tightly integrated global AI infrastructure.