
Global AI computing will use ‘multiple NYCs’ worth of power by 2026, says founder

AI’s Power Hunger: Chip Startup Predicts Data Center Boom

The future of artificial intelligence (AI) hinges on massive computing power. And while the open-source community buzzes about the latest large language models (LLMs), one startup argues the true bottleneck isn’t just training these AI behemoths – it’s deploying them in the real world.

Thomas Graham, co-founder of optical computing startup Lightmatter, believes the next stage in AI’s evolution will bring a dramatic rise in demand for data centers powerful enough to run these deployed models.

“Think of training as research and development, and deploying these models as the actual product launch,” Graham explained at a recent Bloomberg Intelligence conference. “To run these large models effectively, you’re going to need massive computational resources.”

This prediction comes as tech giants race to build ever-larger data centers to keep up with AI’s compute-intensive demands. Nvidia, a leading force in the AI chip market, has repeatedly emphasized the need for increasingly sophisticated “inference” capabilities – the process of running already-trained AI models.

Lightmatter is developing a potentially game-changing technology: an optical chip that can link massive numbers of processors together through high-speed fiber optics. This approach offers significant speed and energy efficiency advantages over traditional copper-based connections.

"Imagine replacing all the copper wiring in a data center with fiber optics. It dramatically increases the bandwidth and makes the entire system much more efficient," Graham explained.

Lightmatter is working with prominent tech firms on new data center designs, including partnerships with semiconductor foundries. While specifics are under wraps, the implications are clear: Lightmatter’s technology could be integrated into custom chips designed by the likes of Google, Amazon, and Microsoft for their massive AI data centers.

The scale of this future infrastructure is staggering. Graham predicts that by 2026, AI data centers alone will require 40 gigawatts of power – the equivalent of eight New York Cities.
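That figure is easy to sanity-check: New York City’s average electrical demand is on the order of 5 gigawatts (a rough, assumed figure – peak summer load runs higher), so 40 GW ÷ ~5 GW per city ≈ 8 cities.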

For Lightmatter, the potential rewards are vast. The company recently secured a $400 million investment, valuing it at a hefty $4.4 billion.

But Graham recognizes that the AI landscape can shift rapidly. “If researchers discover a dramatically more efficient AI algorithm that performs better and reaches artificial general intelligence (AGI) faster,” he concedes, “that could alter the course of data center development entirely.”

Until then, however, the race is on to build the infrastructure capable of powering the next generation of AI, a race Lightmatter is determined to win.
