Here at Halcyon we're attuned to how both the volume and frequency of newly published energy information demand an alerts and notifications system. After Halcyon's platform ingests documents from federal, state, and regional entities, we can send alerts about new filings from specific entities (like the California Public Utilities Commission or the Federal Energy Regulatory Commission) and about specific new types of documents (such as proposed regulatory decisions, or the signing of a power plant interconnection service agreement).
There is a healthy debate about the impacts of artificial intelligence on infrastructure, the global power grid, and business productivity. We at Halcyon welcome the discussion, which has been earnest, data-driven, thoughtful, and constructive. This week, I thought it worth reviewing the state of AI's challenges, its impact on productivity, and why we need to remember that building AI infrastructure is not the same as building on top of it.
If I look back at what I wrote two months ago, I see little change to our core thesis that AI will play a role in significantly accelerating the deployment of the climate infrastructure we need at global scale. But I also see much more healthy, thoughtful questioning about just how much of an impact AI will have economy-wide, and whether the investment it requires today will pay off in the future.
Sequoia Capital published its latest analysis last month, calling the artificial intelligence industry's future revenue a $600 billion question. David Cahn, a partner at Sequoia, uses a handy formula to arrive at that figure: Nvidia's annualized run-rate revenue from data center products ($150 billion), doubled to reflect the total cost of the data centers being built around its hardware ($300 billion), doubled again to reflect a 50% gross margin for the end users of Nvidia's graphics processing units, be they Amazon Web Services or a small-team startup ($600 billion).
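Spelled out, that back-of-the-envelope arithmetic runs: $150 billion (Nvidia's data center run rate) × 2 (the full cost of the data centers built around those chips) = $300 billion, × 2 again (so that end users can earn a 50% gross margin on what they spend) = $600 billion in end-user AI revenue that must eventually materialize.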
Also last month, Goldman Sachs estimated that investment in developing and running AI technology will reach approximately a trillion dollars. Its head of global equity research argues that for AI to earn an adequate return on that investment, "it must be able to solve complex problems, which it isn't built to do." In his view, AI is not the internet, which was a "truly life-changing invention" that "enabled low-cost solutions to disrupt high-cost solutions even in its infancy."
Both Goldman and Sequoia see challenges ahead for AI in multiple dimensions. Sequoia says that AI data centers will lack pricing power and will depreciate rapidly (as all hardware does). Goldman's work also includes an interview with MIT professor Daron Acemoglu, an expert on work and productivity, who is skeptical that AI is a category creator that will bring forth new tasks and products. Goldman also spoke with Brian Janous of Cloverleaf Infrastructure, who highlights the challenge of building clean power infrastructure fast enough, and at sufficient scale, to keep AI's power needs from ballooning into a carbon bubble.
Indeed, both Google and Microsoft have seen their emissions reduction plans set back, at least for now, by the reality of soaring power demand as their hyperscale activities continue hyperscaling. Google's emissions are up 48% since 2019, and Microsoft's are up 30% since 2020. Both companies are world leaders in procuring zero-emissions power for their operations – but at the moment, their efforts are not keeping pace.
We are hopeful that we can solve many of AI's infrastructure challenges through a strong and coordinated policy effort, and through innovation within AI itself – in particular, innovation that reduces power demand while still delivering superior compute.
In terms of the broadest business impacts, looking back at what has already happened is more instructive than prognosticating about what might. Exactly 37 years ago, in a book review discussing the US industrial economy, the economist Robert Solow observed that "you can see the computer age everywhere but in the productivity statistics." He elaborated: the authors of the book he reviewed "are somewhat embarrassed by the fact that what everyone feels to have been a technological revolution, a drastic change in our productive lives" has corresponded with slowing productivity growth, not its acceleration.
Solow's argument was mathematical, and therefore inarguable in the moment. But with 37 years of hindsight, we can ask different questions of his time, and think about analogous questions today. When he wrote, computers were still largely business tools, and there were fewer than 50 million personal computers in existence. The revolutions to come, via the consumer internet, could not yet have emerged from such a small installed base, nor could meaningful network effects take hold. (Those of a certain age will remember McKinsey's infamous estimate that the total mobile phone market would reach just 900,000 subscribers.)
The revolution from computing was not just about faster productivity - it was about doing more, including creating a $5.2 trillion information technology industry on top of which many other businesses could build. AI could well be the same - a deep layer, revolutionary to some but invisible to others, that changes how we build, and eventually what we build.
And as for overbuilding? If today's AI infrastructure ends up overbuilt, that is a cost to the infrastructure builders and a benefit to those who build on top of it. Sequoia's Cahn says: "Founders and company builders will continue to build in AI - and they will be more likely to succeed, because they will benefit both from lower costs and from learnings accrued during this period of experimentation."
We may well end up with a period of overbuilt AI infrastructure. But if that occurs, it does not mean a period of overbuilding with AI. In fact, it will be the opposite: building more with less, and for more purposes too.
Comments or questions? We’d love to hear from you - sayhi@halcyon.eco, or find us on LinkedIn and Twitter.