What will it take to move the global economy from a higher-carbon to a lower-carbon state?
Humankind has been burning hydrocarbons on an industrial scale for almost three centuries. Today we consume billions of tons of oil, gas, and coal, emitting nearly 40 billion tons of stored carbon as carbon dioxide each year in the process. Globally, we have made headway in blunting the emissions trajectories of some sectors, such as electric power (now 30% renewable) and passenger transport via electric vehicles. We are only beginning to make progress in many others: heavy industrial processes, the built environment, heavy-duty transport, and food and agriculture.
Energy transition (changing our energy sources) will be a decades-long process. Decarbonization (changing our industrial processes and business models, and creating entire new industries along the way) will take decades as well. It will cost hundreds of trillions of dollars. It will require the sustained, and sustainable, creation of assets and infrastructure at a scale not yet seen in even the most industrialized economies.
And for all of today’s successes at a global level, energy transition and decarbonization needs are often highly local and specific. Some economies have made no progress at all, and even established industries such as wind and solar power can suffocate under the weight of planning, permitting, and long development timelines. Accelerating this progress everywhere is the biggest challenge of our lifetime. And it deserves the biggest technology development of our current moment: artificial intelligence.
Artificial intelligence, and more specifically the large language models (LLMs) and generative AI that millions of people now use to interact with the world’s knowledge, solve problems, and produce new ideas, is a new general-purpose technology. That designation implies certain attributes: it supports new industries, it is a low-cost input that changes the cost structures of existing economic activity, and it leads to the development of major new infrastructure.
In Halcyon’s world, large language models are a key element of processing a large and expanding body of energy transition information, much of it almost aggressively opaque even to experts. Proceedings, rulemakings, and dockets of regulatory activity work according to their own logic, with their own style, and for their own purposes. Once we have taken this unstructured data and given it structure through indexing, classification, and ranking, we use LLMs as a bridge between the database and human language. Or, to put it another way: we work to encode knowledge into software; LLMs allow us to ask it questions, and knowledge speaks back to us.
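As a purely illustrative sketch of that pipeline (toy docket data, a simple keyword scorer standing in for real indexing, classification, and ranking, and the actual LLM call omitted), it looks something like this:

```python
from collections import Counter

# Toy stand-ins for regulatory filings; real dockets are far longer and
# far more opaque. All IDs and text here are hypothetical.
DOCKETS = {
    "GA-2024-017": "Interconnection request for a new data center load",
    "CA-2023-842": "Rulemaking on rooftop solar net metering compensation",
    "TX-2024-003": "Permitting proceeding for a 500 MW wind farm expansion",
}

def tokenize(text: str) -> list[str]:
    """Crude word tokenizer; a placeholder for real text processing."""
    return [w.strip(".,?!").lower() for w in text.split()]

def rank(query: str, docs: dict[str, str]) -> list[str]:
    """Order dockets by keyword overlap with the query -- a heavily
    simplified version of the indexing/classification/ranking step."""
    q = Counter(tokenize(query))
    scores = {doc_id: sum(q[w] for w in tokenize(text))
              for doc_id, text in docs.items()}
    return sorted(scores, key=scores.get, reverse=True)

def build_prompt(query: str, docs: dict[str, str], top_k: int = 2) -> str:
    """Assemble the structured context an LLM would answer from;
    the model call itself is omitted in this sketch."""
    context = "\n".join(f"[{d}] {docs[d]}" for d in rank(query, docs)[:top_k])
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Which dockets involve data center interconnection?", DOCKETS))
```

The point of the sketch is the division of labor: conventional software structures and ranks the documents, and the LLM only enters at the last step, answering in human language from that structured context.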
LLMs, in other words, make our work possible in a way it was not before, allowing us to do with software what would have been extremely expensive (or impossible) to do with humans alone, no matter how expert. For Halcyon, then, LLMs satisfy two of the general-purpose technology attributes: supporting new industries, and doing so with a changed cost structure.
But there is a third attribute of general-purpose technologies that AI also meets, and it is one far bigger than us: the development of major new infrastructure. AI is certainly driving that, at global scale. It requires massive amounts of specialized compute capacity, which in turn requires data centers to house that capacity and electricity to power it. Nvidia gives us a glimpse of the growth to come: its data center revenue tripled in the last fiscal year to more than $47 billion.
We can also see it in the capital expenditure projections of Google and Microsoft, each approaching $50 billion this year. And we can see it in the expectations of major US power utilities such as Georgia Power, which has seen 25 new interconnection requests from industrial electricity customers in the past year, eight of them for data centers.
That level of energy-intensive capex can be found elsewhere in global business, but on the supply side of energy, not the demand side. $50 billion a year is what Saudi Aramco, the world’s largest oil producer, invested last year in energy supply, not in energy demand.
Generative AI’s energy requirements, fortunately, are primarily electrical, and its biggest proponents aim to run as clean as possible, signing multi-year, multi-gigawatt, multi-billion-dollar renewable energy supply agreements. Supplying data center hyperscalers (and other energy-intensive industries) with massive new clean power resources, and doing so as quickly and efficiently as possible, is imperative.
And that is where AI can help itself as it helps us all decarbonize faster. Faster, more efficient, and more intuitive engagement with the vast amount of data needed to plan and permit new energy assets can and should serve global decarbonization needs, including those of the companies that provide this new general-purpose technology themselves.