MSU's Fission and Fusion system was designed to address long-standing issues like power inflation, where progression becomes meaningless, and value deflation, where older content is rendered worthless. MSU supports a cyclical economy, where value can move in both directions, shaped by player behavior rather than forced scarcity.
Why MSU Isn’t Just Another Web3 Game
Most Web3 games start with a token and try to build a world around it. MSU starts with one of the most iconic gaming IPs of the past two decades. With over 280 million registered accounts, more than $5 billion in lifetime revenue, and millions of active users today, MapleStory already has what most Web3 projects are still trying to build.
We’re not trying to attract crypto-native speculators. We’re inviting real gamers into an ecosystem that feels familiar and rewarding. What they’ll find isn’t a marketplace first, but a game world where digital assets hold meaning, value, and permanence. The systems they interact with are fairer, extensible, and rooted in economic integrity.
This is the bridge between Web2 and Web3 gaming. MSU isn’t theorizing about it — we’re building it.
Design Is What Persists
The mistake wasn’t blockchain. It was the belief that technology alone could deliver fairness.
Resilient economies don’t depend on decentralization for its own sake. They depend on design. Scarcity should be deliberate. Value must circulate with purpose. And incentives should be rooted in what players do, not just what they hold.
At MSU, we didn’t set out to disrupt. We set out to last. The system isn’t designed to reward fast flips or early speculation. It’s designed to reflect effort, creativity, and time spent in the world we’re building.
This is the future of tokenomics: not a chase for volatility, but a search for meaning. Not decentralization as ideology, but design as principle.
Games First. Not Finance First.
MSU is not a play-to-earn game. We’re not trying to engineer short-term financial yield. We’re focused on preserving what has always made MapleStory compelling: long-term character progression, creative expression, and the emotional depth of shared worlds.
Blockchain doesn’t replace those experiences. It enhances them, making ownership real and persistence possible.
If Web3 gaming wants to move beyond buzzwords and into culture, it must put the game back at the center.
Web3 doesn’t need more whitepapers. It needs evidence.
We need proof that game economies can be sustainable. That blockchain can strengthen play, not distract from it. That participation can be meaningful without becoming purely financial.
MSU may not be the only model, but it is a functioning one. And that’s already more than most.
The silver lining isn’t just that Web3 gaming still has potential. It’s that we’re learning, through design, what it takes to turn that potential into reality.
[Leaders’ Note] A Silver Lining for Web3 Gaming: Where MapleStory Universe’s Tokenomics Could Take Us
MapleStory Universe · Apr 24, 2025
[Leaders’ Note] is a Medium corner where MSU leaders share their insights on the project and the broader industry. Explore and engage with their perspectives as they shape the future of gaming.
[Written by Keith Kim — Head of Strategy of MapleStory Universe, Nexpace]
The past two years have been turbulent for Web3 gaming. Initial optimism quickly gave way to disillusionment as many projects collapsed under the weight of unstable economies, premature launches, and unsustainable incentives. Today, the space is asking: what does a viable future look like?
At MapleStory Universe (MSU), we believe the answer lies in building real economies, not hype cycles. Blockchain isn’t a gimmick or an overlay. It’s a tool to re-architect value systems by aligning players, developers, and creators around sustainable participation rather than extraction.
Our tokenomics model was designed from the ground up to serve that purpose. Not just for MSU, but as a framework other projects might learn from as well.
A Legacy of Virtual Trade
For over two decades, MapleStory has supported one of gaming’s most active in-game economies. In 2023 alone, global peer-to-peer trade across its servers surpassed $1 billion USD, a figure few Web3 games could dream of.
This wasn’t just cosmetic trading. It reflected one of the most profitable free-to-play monetization models in gaming history. MapleStory built value around cosmetics, progression, and socially driven demand. It proved that digital items could hold meaning and long-term worth when deeply integrated into gameplay and culture.
Still, the limits of Web2 infrastructure were clear. Items could be issued in infinite supply, which forced developers to artificially introduce scarcity through sinks or transaction limits. High-value goods lost their worth as they changed hands, and control over value creation remained centralized with the publisher.
These weren’t failures in design, but constraints of the systems available at the time. Blockchain presents a chance to rethink those constraints by changing the foundation of digital economies.
Learning from Web3’s Early Missteps
The challenges Web3 gaming faced didn’t stem from a lack of ambition. They came from skipping the fundamentals. Many projects launched tokens before they had playable content. Economic incentives often encouraged extraction rather than engagement. When the speculative momentum faded, users left just as quickly.
Token and NFT supplies were rarely capped or balanced with meaningful sinks. In many cases, the assets had little or no gameplay relevance. What looked exciting on day one often turned out to be economically unsound.
MSU chose a different approach. We began with a game that already works, then crafted tokenomics to enhance the player experience rather than distort it.
A New Model at MapleStory Universe
MSU introduces two key mechanics, Fission and Fusion, that govern how in-game items are issued and reclaimed.
Fission is the process through which NFTs are created. Players use NXPC, our fixed-supply ecosystem token, to redeem items from predefined baskets. Each basket corresponds to a specific item type and includes a maximum supply. The total NXPC required to redeem an entire basket equals the number of items it contains multiplied by each item’s fixed redemption price.
New baskets can be added as the game expands and introduces new item types. Redemption conditions are governed by protocol logic, not market fluctuations or arbitrary developer tweaks.
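To make the Fission mechanics concrete, here is a minimal sketch of how a basket could be modelled. The class, the field names, and the figures in the example are illustrative assumptions rather than MSU's actual protocol code; the only rules taken from the text are the fixed per-item redemption price, the capped supply per basket, and a total NXPC requirement equal to item count times that price.

```python
# Illustrative sketch of a Fission basket (hypothetical names and numbers,
# not MSU's actual protocol code). Rules taken from the text: each basket
# covers one item type, has a capped supply, and charges a fixed NXPC
# redemption price per item.

class FissionBasket:
    def __init__(self, item_type: str, max_supply: int, redemption_price_nxpc: float):
        self.item_type = item_type
        self.max_supply = max_supply                          # hard cap on NFTs this basket can issue
        self.redemption_price_nxpc = redemption_price_nxpc    # fixed price, not market-driven
        self.issued = 0                                       # NFTs minted from this basket so far

    def total_nxpc_capacity(self) -> float:
        # Total NXPC needed to mint the entire basket: item count * fixed price.
        return self.max_supply * self.redemption_price_nxpc

    def fission(self, nxpc_paid: float, quantity: int = 1) -> int:
        # Mint `quantity` NFTs if supply remains and the fixed price is paid in full.
        cost = quantity * self.redemption_price_nxpc
        if self.issued + quantity > self.max_supply:
            raise ValueError("basket supply exhausted")
        if nxpc_paid < cost:
            raise ValueError(f"need {cost} NXPC, got {nxpc_paid}")
        self.issued += quantity
        return quantity


# Example with made-up figures: a 10,000-item basket priced at 5 NXPC per item.
sword_basket = FissionBasket("Flame Sword", max_supply=10_000, redemption_price_nxpc=5.0)
print(sword_basket.total_nxpc_capacity())   # 50000.0 NXPC if the basket is fully minted
sword_basket.fission(nxpc_paid=5.0)         # mints one NFT
```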
Fusion allows players to return NFTs to the system. Rather than burning assets individually, users contribute the same type of NFT into a shared basket. Once the basket is full, NXPC is proportionally redistributed to contributors. This helps anchor NFT values to NXPC utility and maintains consistency between what players spend and what they can recover.
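Fusion can be sketched in the same spirit: players deposit NFTs of one type into a shared basket, and once it is full the corresponding NXPC is paid out pro rata. Again, the structure and numbers below are assumptions for illustration; the text specifies only pooled contributions and proportional redistribution on completion.

```python
# Illustrative Fusion sketch (hypothetical structure, not MSU's protocol code).
# Rule from the text: same-type NFTs go into a shared basket, and when the
# basket is full the corresponding NXPC is redistributed proportionally.
from collections import defaultdict

class FusionBasket:
    def __init__(self, item_type: str, capacity: int, nxpc_pool: float):
        self.item_type = item_type
        self.capacity = capacity               # NFTs needed before payout triggers
        self.nxpc_pool = nxpc_pool             # NXPC released when the basket completes
        self.contributions = defaultdict(int)  # player -> NFTs deposited

    def contribute(self, player: str, quantity: int = 1):
        deposited = sum(self.contributions.values())
        if deposited + quantity > self.capacity:
            raise ValueError("basket would overflow")
        self.contributions[player] += quantity
        if deposited + quantity == self.capacity:
            return self._payout()
        return None

    def _payout(self) -> dict:
        # Proportional redistribution: each contributor's share of the pool
        # matches their share of the deposited NFTs.
        return {
            player: self.nxpc_pool * count / self.capacity
            for player, count in self.contributions.items()
        }


# Example with made-up numbers: a 4-item basket backed by 20 NXPC.
basket = FusionBasket("Flame Sword", capacity=4, nxpc_pool=20.0)
basket.contribute("alice", 3)
print(basket.contribute("bob", 1))   # {'alice': 15.0, 'bob': 5.0}
```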
How to write, according to the bestselling novelist of all time
Agatha Christie teaches a new writing course—with help from AI
May 8th 2025
EVERYONE HAS a book inside them, or so the saying goes. In this day and age, those who want help coaxing the story out can receive instruction online from some of the world’s most popular authors. Lee Child and Harlan Coben, who have sold hundreds of millions of books between them, teach thriller writing; Jojo Moyes offers tips on romance yarns. And now Agatha Christie, the world’s bestselling writer of fiction, with more than 2bn copies sold, is instructing viewers in the art of the whodunnit—even though she died in 1976.
Christie’s course is the result not of recently unearthed archival footage, but artificial intelligence. BBC Maestro, an online education platform, brought the idea to the Christie family, which still controls 36% of Agatha Christie Ltd (AMC Networks, an entertainment giant, owns the rest). They consented to bring the “Queen of Crime” back to life, to teach the mysterious flair of her style.
A team of almost 100—including Christie scholars as well as AI specialists—worked on the project. Vivien Keene, an actor, provided a stand-in for the author; Christie’s face was mapped on top. Crucially, Ms Keene’s eerily credible performance employs only Christie’s words: a tapestry of extracts from her own writings, notebooks and interviews.
In this way, the creator of Hercule Poirot and Miss Marple shares handy writing tips, such as the neatest ways to dispatch fictional victims. Firearms bring ballistic complications. Be wary of poisons, as each works in a unique way. Novice authors can “always rely on a dull blow to the head”.
Many of Christie’s writing rules concern playing fair. She practised misdirection and laid “false clues” alongside true ones, but insisted that her plots do not cheat or hide key evidence: “I never deceive my readers.” In sections devoted to plot and setting, she explains how to plant key clues “in plain sight” and plan events with detailed “maps and diagrams”. She advises viewers to watch and listen to strangers on buses or in shops and to spice up motives for murder with a love triangle.
Some of the most engaging sections come from “An Autobiography”, published posthumously in 1977: Poirot’s origins among the Belgian refugees who reached Devon during the first world war, or fond memories of her charismatic, feckless brother Monty, who had “broken the laws of a lot of countries” and provided the inspiration for many of Christie’s “wayward young male figures”.
By relying on Christie’s own words, BBC Maestro hopes to avoid charges of creepy pedagogical deepfakery. At the same time, it is that focus on quotation which limits the course’s value as a creative-writing toolbox. The woman born Agatha Miller in 1890 speaks from her own time and place. She tells wannabe writers to use snowstorms to isolate murder scenes (as they bring down telephone wires) and cites the clue-generating value of railway timetables, ink stains and cut-up newspapers. These charming details are irrelevant to modern scribblers.
Yet anachronism is not the course’s biggest flaw: it is that it lacks vitality. Christie enjoyed a richer life than learners will glean from this prim phantom: she was a wartime nurse (hence her deep knowledge of toxins), thwarted opera singer, keen surfer and archaeological expert who joined her second husband on digs in Iraq. Furthermore, her juiciest mysteries smash crime-writing rules. The narrator does it; the detective does it; all the suspects do it. Sometimes there’s no detective: in “The Hollow” (1946) Christie regretted that Poirot appeared at all. With its working-class antihero and gothic darkness, “Endless Night” (1967) shatters every Christie cliché. This high-tech, retrofitted version of the author feels smaller and flatter than the ingenious original. ■
The language that changed the world
Around half of the world’s population speaks a descendant of Proto-Indo-European. Most know little about it
May 8th 2025
Dante did like a category. Famously, he split sinners into different circles in Hell. Those who committed milder misdemeanours (the violent; tyrants) went in the outer realms; more serious sinners (bankers, naturally) he tortured farther in.
But he also categorised other things. Because Dante—a linguistic Darwin—held a heretical theory. Europe’s languages, he thought, had not been made in a single moment, in a biblical Babel, but had evolved. With the care of a Victorian naturalist, he thus split Europe’s languages into families by their words for yes: thus those who lived in southern France and said “Oc” spoke a “Langue d’oc”.
The truth of Dante’s Hell is debatable, but, linguistically, he was spot-on. Four centuries later, a British judge and polyglot called Sir William Jones arrived in Calcutta and was struck by the similarities between Latin and Sanskrit words for terms such as house (domus, dam) and god (deus, deva). Clearly they had “sprung from some common source, which, perhaps, no longer exists”. The world’s languages were not a babel but a brotherhood.
“Proto”, a new book by Laura Spinney, a journalist who has written for this newspaper, offers a biography of that brotherhood—or rather its parent. For Jones’s “common source” now has a name: “Proto-Indo-European” (PIE). It was first spoken by as few as a few dozen people around the Black Sea and then, roughly 5,000 years ago, spread with rapidity “from Ireland to India”. Today, its offshoots include Irish and Hindi—and more or less everything in between. Almost half of the world’s population speaks a descendant of it.
PIE is long dead, but traces of it remain caught, like insects in amber, in modern languages, allowing academics to bring it back to life. Scholars know its speakers (perhaps) had the wheel (*kwekwlos, in PIE’s odd transcription, seen in English “circle” and “wheel” itself); and fields (*h2egros—“agriculture”) and drank mead (*medhu). They know speakers found visitors irritating: the PIE word “*ghostis” gives English not just “guest” but “ghost”, and Latin its word for “host” but also “enemy”. (Everyone has had a guest like that.) Its ancient wordscapes are enlightening, if at times puzzling. A hero was “one who urinated standing up”—which feels like a low bar.
This book is at its best on the language: to learn that English “mother”, Latin mater and Sanskrit mata share a root provokes a pleasing etymological “Ah!” Its (lengthy) agricultural sections are drier and contain too many mentions of the word “goat”. Topics, like guests, can outstay their welcome. But logophiles will enjoy getting to know a little more about their *meh2ter (mother) tongue. ■
Stability, Kindleberger argued, is a global public good that must be provided. It is not a naturally occurring equilibrium. The leading economy—a “hegemon”, as later thinkers would term it—can capture some of the benefits of this stability for itself, and push the system in a direction favourable to its interests. However, it needs to take on the burden of providing, among other things, an open market for goods, countercyclical finance and the role of lender of last resort. President Donald Trump now appears to reject this thinking altogether. He demands that allies pay for military protection and views a trade deficit as straightforward evidence of being ripped off. Members of his administration have mooted charging countries for the privilege of lending to the American government. All told, he simply does not see the gains that emerge from global stability as being worth their cost.
Hélène Rey of the London Business School identifies a “New Kindleberger Gap”. This time a “self-destructing hegemon” is, she says, uninterested in providing global public goods, while an ascendant one (Ms Rey refers to the European Union, but China is another candidate) lacks the ability. The Fed’s swap lines lie at the heart of her concerns. These offer central banks in allied countries, including the European Central Bank, the Bank of England and the Bank of Japan, access to dollars in exchange for their own currency. They should help forestall any crisis that bids up the price of dollar borrowing, but are just the sort of burden-sharing to which Mr Trump normally objects. In an attempt to bolster their position, sophisticated policymakers are talking in Trumpian terms. “The reason we do it is it’s really good for US consumers,” Jerome Powell, chairman of the Fed, has said.
What are the contingency options? Despite Mr Powell’s assurance that America will continue to offer swap lines, Ms Rey suggests that European central banks should encourage commercial lenders to reduce exposure to dollar assets, build up precautionary dollar reserves and play a part in turning the euro into an international currency. Robert McCauley of Boston University advocates the creation of a “dollar coalition of the willing”, pointing out that the central banks which would normally receive swap lines from America already have $1.9trn-worth of dollar-reserve assets, which they could agree to pool in advance of a crisis. That amount is far more than they borrowed from the Fed during either the global financial crisis of 2007-09 or the early stages of the covid-19 pandemic. In the short term, such actions may help cement the role of the dollar, as central banks build up reserves. In the longer term, however, it may mean that American monetary hegemony becomes a subject fit only for historical economics. ■
What happens when a hegemon falls?
Why economists are turning to a 50-year-old book on the Depression
May 8th 2025
The “Kindleberger Spiral”, a graph of world trade between 1929 and 1933, looks like water circling a drain, or a small animal curling up into a ball. It was produced by Charles Kindleberger, an economic historian, in “The World in Depression”, a book published in 1973, and has recently enjoyed a new lease of life as a demonstration of the self-harm that protectionism inflicts. From month to month, Kindleberger charted how the global economy turned in on itself throughout the late-1920s and 1930s, spiralling towards disaster. Another idea from his work—the “Kindleberger gap”, referring to a leadership void—is also proving helpful.
Kindleberger had a front-row seat for the Depression. As a graduate student completing his thesis in the 1930s, he worked at the US Treasury for Harry Dexter White, chief architect of the post-second-world-war system of fixed exchange rates. Graduation led to a job at the New York Federal Reserve. After the second world war, during which he worked at the Office of Strategic Services, a precursor to the CIA, he moved to the State Department, where he helped shape the Marshall Plan, America’s programme for the reconstruction of Europe. In time he found his way to academia—he had probably had enough excitement, his biographer speculates—becoming one of the first members of the economics department at the Massachusetts Institute of Technology.
At MIT, Kindleberger was something of a pre-war figure in a post-war world. He was not a mathematical-model builder in the mould of Paul Samuelson and Robert Solow, two supremely talented colleagues. Instead, he followed a methodology he called historical economics, not economic history. “It is better, I believe, to err on the side of an artistic feel for the relationships and the data,” he wrote. Despite this, in 2009, it was his work to which Larry Summers turned as he co-ordinated America’s response to the financial crisis while director of the National Economic Council.
“The World in Depression” answers fundamental questions: “How and where the Depression originated, why it spread so widely and why it went so deep and lasted so long.” The book starts with the venomous diplomacy of first-world-war debts and reparations, travels through the stockmarket crash of 1929, the turn to protectionism, subsequent bank failures and the seemingly never-ending economic slump, until it concludes with German rearmament—stopping short of the second world war.
Kindleberger’s conclusion is that the Depression was such a disaster because the global economy lacked a leading nation to stabilise it. “Britain could not and America would not,” he wrote. Britain, which under the gold standard was the dominant economic as well as military power, was exhausted by the first world war. America was isolationist, protectionist and overrun by hard-money thinking, which insisted on balanced budgets and a gold peg. France was too small to stabilise the world but big enough to destabilise it, he wrote, as when the country attached conditions to bail-outs or dug in its heels over German reparations. The Kindleberger gap refers to this void of economic leadership.
Warren Buffett has created a $348bn question for his successor
Berkshire Hathaway’s next CEO has huge shoes to fill—and a mountain of cash to invest
May 8th 2025
SURPRISING PEOPLE at the age of 94 is no mean feat. But that is what Warren Buffett did at the end of Berkshire Hathaway’s annual shareholder meeting on May 3rd, when he announced that he would be stepping down as the company’s chief executive at the end of the year. Mr Buffett gave most of Berkshire’s directors no advance notice of his decision. Nor did he tell Greg Abel, his presumptive successor.
Berkshire Hathaway was a textile-maker when Mr Buffett bought it in 1965. In the years that followed he turned it into an immense insurance firm and a sprawling conglomerate with interests in everything from energy to sweets. He employed a value-investing strategy, seeking out companies that appeared cheap relative to their intrinsic value. From 1965 to the end of last year, Berkshire’s market value increased by more than 5,500,000%, with a compounded annual return of almost 20%. The total return of the S&P 500 index over the same period was 39,000%.
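As a quick arithmetic check (mine, not the article's), a roughly 55,000-fold increase compounded over about 60 years does work out to nearly 20% a year, against roughly 10% for the S&P 500's 390-fold total return:

```python
# Back-of-envelope check of the compounding claims above (assuming a roughly
# 60-year window, 1965 to end-2024; percentages rounded as in the article).
def cagr(multiple: float, years: int = 60) -> float:
    return multiple ** (1 / years) - 1

berkshire_multiple = 1 + 5_500_000 / 100   # +5,500,000%  -> about 55,001x
sp500_multiple = 1 + 39_000 / 100          # +39,000%     -> about 391x

print(f"Berkshire: {cagr(berkshire_multiple):.1%}")   # ~20%
print(f"S&P 500:   {cagr(sp500_multiple):.1%}")       # ~10.5%
```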
Today Berkshire has a market capitalisation of $1.1trn, having dropped a bit on Mr Buffett’s announcement. Mr Abel has been with the company for a quarter of a century. He has run its non-insurance operations—such as its energy, railway and retail businesses—since 2018. His next challenge goes beyond filling Mr Buffett’s shoes as an “oracle”. Berkshire’s strategy is becoming more difficult to pull off.
Over the past year, Mr Buffett has aggressively sold stocks, including a large chunk of Berkshire’s stake in Apple. Now, for the first time in two decades, Berkshire owns more cash than listed equities. At the end of March it held $348bn in cash and short-term American government debt, more than twice the amount it held at the close of 2023. The firm’s Treasuries account for 5% of the outstanding market. If Berkshire were a country, it would be the tenth-largest holder of American government debt, above India, Switzerland and Taiwan.
Mr Buffett’s decision to withdraw from the stockmarket has so far benefited the company. Its stock is up by 14% this year, while the S&P 500 is down by 4%. The problem, for Messrs Buffett and Abel, is working out what to do with the enormous pile of cash. Lately Mr Buffett has griped that there is not much out there to buy at a reasonable price. Even after the recent market tumult, valuations of listed companies are high relative to their historical levels.
One option would be to expand more aggressively overseas. In recent years Mr Buffett has made successful bets abroad. For example, he has poured billions of dollars into several of Japan’s trading conglomerates, such as Mitsubishi and Sumitomo. Mr Abel might note that among companies worth over $5bn and with price-to-earnings ratios below ten—suggesting they are cheaply valued—80% by value are domiciled outside America.
Another option would be to stray from value investing in the hope of finding firms worthy of capital allocation. That seems unlikely, at least for now. Such a move would transform Berkshire’s culture and risk the ire of Mr Buffett’s admirers. After 25 years at the firm, Mr Abel is unlikely to pull an immediate handbrake turn.
Absent a change of strategy, Berkshire will have to wait for a market downturn. Mr Buffett has a history of spotting such opportunities. He snapped up a large stake in Wells Fargo, an American bank, during a slump in 1990. He invested in companies including Johnson & Johnson and Kraft Foods (and Wells Fargo again) following the global financial crisis of 2007-09. The list goes on. Berkshire’s shareholders must hope that Mr Abel has the same vision. ■
If models become more efficient still, there are yet more uses to which they can be put. In recent months, several AI labs have launched “Deep Research” tools, combining reasoning models with the ability to search the web for information and set themselves follow-up tasks. The tools are one of the first mainstream examples of what the AI industry calls “agents”, quasi-autonomous AI systems that can carry out many tasks sequentially. And because it takes them between five and 30 minutes to give a response, running such an agent uses more energy than asking a simple query.
Such efficiency gains leave some wary of the Jevons paradox popping up in other industries. Lynn Kaack, who leads the AI and Climate Technology Policy Group at the Hertie School in Berlin, worries that, by increasing efficiency and reducing costs in areas like shipping, AI will incentivise companies to increase their activity.
Those concerned about the trajectory of AI’s environmental costs are looking for ways to alter it. Mr Gamazaychikov, for instance, hopes that his effort to rank various AI models will allow users and businesses to find the most efficient one for any given task, rather than always using the “best”.
But the closed nature of the biggest labs complicates things. OpenAI, for instance, gives away access to its top-tier models below cost, according to Sam Altman, its boss; Google and Amazon charge less for access to their own AI systems than the cost of the electricity alone, insiders claim. That means users have less motivation to hunt for the most efficient model than they would if they had to pay the true cost of their use. And greater transparency around efficiency and emissions may not result in meaningful behavioural change: after all, there is little evidence to show that growing awareness of the carbon cost of flying has stopped people taking flights.
Coming clean
Many observers think that the best way forward is through tighter regulation, both of AI itself and of the energy it consumes. The first has had limited success in Europe—from the summer of 2026, developers of “high risk” AI will need to tell regulators about the energy it consumes—and is struggling to get off the ground almost everywhere else. In America the Trump administration’s bonfire of red tape means voluntary efficiency drives are more likely than new regulations.
That said, trying to regulate the development of AI specifically is not the only option: broader policies meant to motivate emissions cuts, such as carbon pricing, can help too. Arguably the most important change will come from speeding up the transition to clean energy, and boosting the amount available so that demand for greener AI does not gobble up the low-carbon electricity also needed to decarbonise other sectors, from transportation to construction. Figuring out how to do that shouldn’t require Deep Research. ■
Put to good use
Focusing on the energy impact of training models, however, may be a distraction. Boris Gamazaychikov, who is in charge of AI sustainability at Salesforce, a software company, compares it to trying to estimate the carbon footprint of a flight by including the impact of building the plane itself. Not only is that construction cost tiny compared with the fuel used over a typical lifetime in service, it’s also impossible to calculate the per-passenger impact until the aircraft is finally retired.
Instead, he says, it is best to focus on the energy impact of using AI, a process called inference. Brent Thill of Jefferies, an analyst, estimates that this stage accounts for 96% of the overall energy consumed in data centres used by the AI industry. Mr Gamazaychikov is trying to put hard numbers on that side of the industry, working with HuggingFace, an AI cloud provider, to systematically test the efficiency of hundreds of AI models. The results show the difficulty of generalising: the difference between the most and least power-hungry models is more than 60,000-fold.
Some of that difference arises from the AI models’ varying purposes. The most efficient model tested, called BERT-tiny, draws just 0.06 watt-hours (Wh) per task—about a second’s worth on an exercise bike—but is useful only for simple text-manipulation tasks. Even the least power-hungry image-generation model tested, by contrast, requires 3,000 times as much electricity to produce a single image.
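Put together, those two figures imply roughly 180 Wh for a single image from even the thriftiest image model tested, against 0.06 Wh for a simple text task; the conversion below is my own illustration, not part of the benchmark.

```python
# Combining the two figures quoted above; the laptop comparison at the end is
# an illustrative assumption, not part of the HuggingFace testing.
bert_tiny_wh = 0.06                       # Wh per simple text task
image_model_wh = bert_tiny_wh * 3_000     # least power-hungry image model tested
print(image_model_wh)                     # 180.0 Wh, i.e. 0.18 kWh per image
print(image_model_wh / 60)                # ~3 hours of running a 60 W laptop
```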
All the same, says Sasha Luccioni of HuggingFace, concrete figures are not always available. Her company could test only the models it could download and run on its own hardware. “OpenAI has not released a single metric about ChatGPT,” Ms Luccioni says, even though such data exist.
Another difficulty in calculating energy use is the fact that AI models are rapidly evolving. The release of DeepSeek V3 in December, a top-tier AI model made by a lab spun off from a Chinese hedge fund, initially looked like good news for those concerned about the industry’s energy use. A raft of improvements meant that the final training run was more than ten times faster than that of Meta’s Llama 3.3 model just a few weeks earlier, with a roughly proportionate reduction in power used. Inference also became less power-hungry.
In January, as the implications of that improvement became clear, the stock prices of chipmakers crashed. But Satya Nadella, the boss of Microsoft, predicted the upset would be brief, citing the Jevons paradox, a 19th-century observation that the rising efficiency of steam engines opened up new economic uses for the technology and thereby raised demand for coal.
For AI, the rebound effect arrived in the form of “reasoning” models, including DeepSeek’s follow-up model, R1. If normal chatbots exhibit what Daniel Kahneman, a psychologist and Nobel economics laureate, called “type one” thinking—prioritising speedy responses—reasoning models display “type two”: structured replies that attempt to break a problem into its constituent parts, solve it with a variety of approaches, and check their answer is correct before settling on it as the final response.
Training a reasoning model is not much harder than training a normal AI system, especially if you have pre-existing models to learn from. But running it requires significantly more power, since the “reasoning” step, in which the problem is thought through before a final answer is reached, takes longer. The efficiency improvements DeepSeek pioneered in V3 were more than eaten up by the extra thinking time used by R1 a couple of months later.
Focusing on the energy impact of training models, however, may be a distraction. Boris Gamazaychikov, who is in charge of AI sustainability at Salesforce, a software company, compares it to trying to estimate the carbon footprint of a flight by including the impact of building the plane itself. Not only is that construction cost tiny compared with the fuel used over a typical lifetime in service, it’s also impossible to calculate the per-passenger impact until the aircraft is finally retired.
Instead, he says, it is best to focus on the energy impact of using AI, a process called inference. Brent Thill of Jefferies, an analyst, estimates that this stage accounts for 96% of the overall energy consumed in data centres used by the AI industry. Mr Gamazaychikov is trying to put hard numbers on that side of the industry, working with HuggingFace, an AI cloud provider, to systematically test the efficiency of hundreds of AI models. The results show the difficulty of generalising: the difference between the most and least power-hungry models is more than 60,000-fold.
Some of that difference arises from the AI models’ varying purposes. The most efficient model tested, called BERT-tiny, draws just 0.06 watt-hours (Wh) per task—about a second’s worth on an exercise bike—but is useful only for simple text-manipulation tasks. Even the least power-hungry image-generation model tested, by contrast, requires 3,000 times as much electricity to produce a single image.
All the same, says Sasha Luccioni of HuggingFace, concrete figures are not always available. Her company could test only the models it could download and run on its own hardware. “OpenAI has not released a single metric about ChatGPT,” Ms Luccioni says, even though such data exist.
Another difficulty in calculating energy use is the fact that AI models are rapidly evolving. The release of DeepSeek V3 in December, a top-tier AI model made by a lab spun off from a Chinese hedge fund, initially looked like good news for those concerned about the industry’s energy use. A raft of improvements meant that the final training run was more than ten times faster than that of Meta’s Llama 3.3 model just a few weeks earlier, with a roughly proportionate reduction in power used. Inference also became less power-hungry.
In January, as the implications of that improvement became clear, the stock prices of chipmakers crashed. But Satya Nadella, the boss of Microsoft, predicted the upset would be brief, citing the Jevons paradox, a 19th-century observation that the rising efficiency of steam engines opened up new economic uses for the technology and thereby raised demand for coal.
The tricky task of calculating AI’s energy use
Making models less thirsty may not lessen their environmental impact
April 9th 2025
A fifth of all electricity used in Ireland is spent powering the country’s data centres, more than is used by its urban homes. With one data centre for every 42,000-odd people, Ireland has one of the highest per-person concentrations of computing power in the world. Loudoun County, just outside Washington, DC, beats it: its 443,000 residents rub shoulders with scores of data centres—more than the next six biggest clusters in America combined. In 2022 their peak energy usage was almost 3 gigawatts (GW), a power draw that, if maintained year round, would approach Ireland’s total annual consumption.
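The comparison with Ireland follows from simple arithmetic. A rough check, in which Ireland’s total annual electricity consumption is an outside assumption rather than a figure from the article:

```python
# What a sustained 3 GW draw amounts to over a year.
peak_draw_gw = 3.0              # Loudoun County's 2022 peak, per the article
hours_per_year = 24 * 365

annual_twh = peak_draw_gw * hours_per_year / 1_000   # GW x hours = GWh; /1,000 gives TWh
print(f"~{annual_twh:.1f} TWh per year")              # ~26.3 TWh

# Ireland's total annual electricity consumption is on the order of 30 TWh
# (an assumption for scale, not a figure from the article), hence "approaches".
```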
Around 1.5% of global electricity is spent on powering data centres. Most of that is for storing and processing data for everything from streaming video to financial transactions. But artificial intelligence (AI) will make up much of future data-centre demand. By 2038 Dominion, a power company, expects the data centres in Loudoun County alone to need more than 13GW. The International Energy Agency, a forecaster, estimates that global data-centre power demand could increase by between 128% and 203% by 2030, mostly because of AI-related energy consumption.
Big tech is confident that the environmental benefits justify the costs. “AI is going to be one of the main drivers of solutions to the climate situation,” says Demis Hassabis, the boss of Google DeepMind. Others disagree. This week’s special section explores the arguments in detail. It examines the ways in which AI can help clean up some of the most polluting industries, including energy production and heavy industry, and discusses the possibility of moving data centres off Earth altogether. It also asks why AI’s energy footprint is so hard to quantify, and what its true environmental impact might be.
Tech firms are generally unwilling to share information about their AI models. One indirect way to estimate the environmental impact of building and deploying AI models, therefore, is to look at the firms’ self-reported carbon emissions. Google’s greenhouse-gas emissions rose by almost half between 2019 and 2023, according to the search giant, primarily because of increases in the energy consumption of data centres and supply-chain emissions. Microsoft’s emissions jumped by roughly a third in 2023, compared with three years earlier, partly due to its own focus on AI.
Another approach to estimating AI’s environmental footprint is to add up the energy use of the infrastructure used to build the models themselves. Meta’s Llama 3.1, a large language model (LLM), for example, was trained on Nvidia chips that can each draw 700 watts of power, around half that of a fancy kettle, and the training run used those chips for a cumulative 39.3m chip-hours. The resulting energy consumption, 27.5 gigawatt-hours (GWh), is enough to supply 7,500 homes with a year’s worth of power.
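The 27.5 GWh figure can be reconstructed directly from the chip specifications quoted above; the per-home comparison assumes a typical household uses a few thousand kilowatt-hours a year.

```python
# Reconstructing the training-energy figure from the quoted chip specs.
watts_per_chip = 700       # peak draw of each Nvidia chip, per the article
chip_hours = 39.3e6        # cumulative chip-hours of training

energy_gwh = watts_per_chip * chip_hours / 1e9    # W x hours = Wh; /1e9 gives GWh
print(f"Training energy: ~{energy_gwh:.1f} GWh")  # ~27.5 GWh

homes = 7_500
print(f"Per home: ~{energy_gwh * 1e6 / homes:,.0f} kWh")
# About 3,700 kWh per home, a plausible year of household electricity use,
# which is where the comparison comes from.
```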
Tech companies, perhaps unsurprisingly, are keen to argue that this energy bill is not nearly as outlandish as it might appear. The immediate climate impact of the final Llama 3.3 training run, Meta estimates, amounts to 11,390 tonnes of CO2, about the same as 60 fully loaded return flights between London and New York. Those, at least, are the emissions of the power grid that supplied the company’s data centre. But Meta argues that, since electrons are fungible, if enough renewable energy is bought on the opposite side of the country—or even at another time altogether—the true emissions fall to zero.
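The zero-emissions claim rests on market-based carbon accounting, in which renewable-energy purchases elsewhere on the grid are netted off against consumption. A minimal sketch of the two conventions, using the 11,390-tonne figure above; the full-coverage assumption is illustrative, not a company disclosure.

```python
# Location-based v market-based emissions accounting, in miniature.
location_based_t = 11_390    # tonnes of CO2 attributed to the supplying grid (from the article)

# Market-based accounting nets off renewable energy bought elsewhere on the grid,
# even at other times. With certificates covering 100% of consumption (an
# illustrative assumption), the reported figure falls to zero, which is Meta's argument.
renewable_coverage = 1.0
market_based_t = location_based_t * (1 - renewable_coverage)

print(f"Location-based: {location_based_t:,} t CO2")
print(f"Market-based:   {market_based_t:,.0f} t CO2")
print(f"Per return flight (60 flights): ~{location_based_t / 60:.0f} t CO2")
```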
China’s economic policymaking has its own weaknesses, of course; some are mirror images of America’s. Its economy is threatened by deflation, not inflation. The country’s consumer prices declined by 0.1% in February, compared with a year earlier. And its policymakers are, if anything, too rigid in their goals and too slow to change course. Only in September last year did they turn decisively to the goal of boosting consumption to help the economy weather a long-running property slump and the forthcoming trade war.
That war has arrived with a speed and ferocity China did not anticipate. According to Goldman, a 50% hike in American tariffs (roughly the scenario China faced before it retaliated) would have cut the country’s GDP by about 1.5%. A hike of 125% will reduce it by 2.2% this year. The first 50 points, in other words, hurt more than the second or third. Exorbitant tariffs kill trade and you cannot kill the same trade twice.
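Goldman’s two estimates imply sharply diminishing marginal damage, as a quick subtraction shows (the GDP figures are the bank’s, quoted above):

```python
# Marginal GDP damage per tranche of tariffs, from Goldman's estimates above.
gdp_hit_at_50 = 1.5     # % of GDP lost under a 50% tariff
gdp_hit_at_125 = 2.2    # % of GDP lost under a 125% tariff

print(f"First 50 points: -{gdp_hit_at_50}% of GDP")
print(f"Next 75 points:  -{gdp_hit_at_125 - gdp_hit_at_50:.1f}% of GDP")
# Once tariffs are high enough to choke off most trade, raising them further
# destroys little additional trade: you cannot kill the same trade twice.
```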
These calculations cannot, however, take full account of the damage to confidence and financial-risk appetite. China’s stockmarket plummeted on April 7th, after the government chose to retaliate against Mr Trump. The country’s “national team” of state-directed banks and investment funds was obliged to step in to stabilise prices. China’s leaders have also announced that they are ready to do more to stimulate the economy if required, by cutting interest rates and bank-reserve requirements, as well as by selling more government bonds.
They will have to issue a lot of them to offset the tariff shock. Barclays, yet another bank, calculates that China would need up to 7.5trn yuan (over $1trn, or 5% of this year’s GDP) of extra stimulus on top of the easing of 2.4trn yuan it announced in March. Even that would only get growth to about 4%. To hit the government’s target of “around” 5%, the 7.5trn yuan would have to be closer to 12trn (or 9% of GDP).
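A back-of-the-envelope check shows how the yuan figures and the GDP shares reconcile; the implied GDP is arithmetic from the quoted numbers, not a Barclays estimate.

```python
# Reconciling Barclays's yuan figures with the quoted shares of GDP.
print(f"Implied GDP: ~{7.5 / 0.05:.0f}trn yuan")   # 7.5trn at ~5% of GDP -> ~150trn
print(f"Implied GDP: ~{12.0 / 0.09:.0f}trn yuan")  # 12trn at ~9% of GDP -> ~133trn
# Both are consistent, after rounding, with China's nominal GDP of roughly
# 130trn-140trn yuan this year.
```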
Offshore bonanza
Another survival strategy for Chinese exporters is to retreat upstream, out of the direct reach of American tariffs. They can sell parts and components to trading partners in neighbouring countries, where they can be incorporated into finished products for export to America. On the face of it, the incentive to pursue this strategy will be overwhelmingly strong if China remains stuck with American tariffs of over 100% while countries including Thailand and Vietnam face levies of only 10%.
One problem is that this strategy is no secret to the trade warriors in the White House. Peter Navarro, Mr Trump’s trade adviser, recently accused Vietnam of acting as a “colony” for Chinese manufacturers. “They slap a made-in-Vietnam label” on a Chinese good “and send it here to evade the tariffs”, he complained to Fox News. Vietnam could jeopardise its own access to the American market if it does not distance itself from China.
Chinese manufacturers may have doubts of their own. Even if their Asian neighbours can now seal a “bespoke” deal with Mr Trump, it could easily come unstuck in the months and years ahead. The United States-Mexico-Canada (USMCA) trade agreement has not held fast, even though Mr Trump himself signed it. What if a country’s trade surpluses with America fail to narrow in a year or two, due to larger macroeconomic forces outside the country’s direct control? Could the reciprocal tariffs return? The post-war trading rules that America helped enshrine once offered convincing answers to these doubts. They gave exporters the certainty they required to serve the world’s biggest market. That certainty has now gone for good.
No bell sounded in the world’s busiest ports when America’s tariffs came into effect. Cargo kept moving. But make no mistake, the death knell of the post-war trading order has been rung. ■
Between consumers and their Calvins
China may throw some more punches of its own. It has already placed several firms, including PVH, the owner of Calvin Klein, on its list of “unreliable entities” that warrant government scrutiny and restrictions. It could now follow through and hamstring their business. It has also severed some American dronemakers from their Chinese suppliers, and curtailed exports to America of a variety of critical metals. On April 8th a list of other possible responses was posted online by several well-connected commentators. China could suspend all co-operation with America on fentanyl, for example. It could also ban imports of American poultry and other agricultural products, such as soyabeans and sorghum, which mainly come from Republican states.
China may impose restrictions on American services, too. A paper published this week by the Ministry of Commerce was at pains to point out that Uncle Sam runs a surplus with China in services trade (although it is far smaller than America’s deficit in goods trade). If China were to follow the same crude formula that America used to calculate its original reciprocal tariffs, China would be entitled to impose a levy of 28% on American services. China could also probe the intellectual property held by American firms, which may constitute monopolies earning excess profits, according to one influential blogger.
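America’s original reciprocal tariffs were reportedly set at half the bilateral deficit-to-imports ratio, subject to a 10% floor. A hedged sketch of that rule applied from China’s side of the services ledger; the 56% ratio below is simply what a 28% result implies, not an official statistic.

```python
# The reciprocal-tariff formula as widely reported: half of (bilateral deficit / imports),
# subject to a 10% floor. Here applied, illustratively, to services trade from China's side.
def reciprocal_tariff(deficit, imports, floor=0.10):
    return max(floor, (deficit / imports) / 2)

# A 28% outcome implies a services deficit equal to about 56% of services imports.
deficit_to_imports = 0.56
print(f"Implied levy: {reciprocal_tariff(deficit_to_imports, 1.0):.0%}")   # 28%
```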
Such retaliation would make a deal with Mr Trump less likely. He seems keen to isolate China by talking to everyone else first. But from China’s point of view, talks with America’s president offer plenty of risk for little reward. America wants to “decouple” from China and contain its economic rise, whatever happens to the balance of trade. Commercial relations between the two superpowers may be at a “cyclical” low—but they are also in secular decline.
Any gains China won through talks might then be whittled away over time. The country’s leaders also have a lot to lose if discussions go awry. No adviser to Xi Jinping, China’s ruler, would risk exposing him to the kind of public humiliation meted out to Volodymyr Zelensky, Ukraine’s president, in February. A trade war is bearable. An Oval Office circus is not.
If the two superpowers do continue to fight, who will back down first? Mr Trump inherited a stretched stockmarket, but a strong economy. America’s latest job figures beat forecasts; household balance-sheets are robust. The president has done his best to squander that legacy. Before the tariff delay, JPMorgan Chase, a bank, suggested America had a 60% chance of falling into a recession and a 40% chance of taking the world economy down with it.
Those odds have presumably dropped a bit. But the tariffs that remain will still raise prices, eroding household purchasing power and, possibly, delaying any interest-rate cuts from the Federal Reserve. For over a third of products that America buys abroad, China is the dominant supplier, meeting 70% or more of America’s foreign demand, according to Goldman Sachs, another bank. The trade war will more than double the price of these goods.
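The “more than double” claim follows from the tariff rate itself if the cost is passed through to buyers; full pass-through is an assumption, and the 125% rate is the scenario quoted earlier.

```python
# Full pass-through of a 125% tariff more than doubles the landed price.
import_price = 100.0   # pre-tariff price, arbitrary units
tariff_rate = 1.25     # 125%

landed_price = import_price * (1 + tariff_rate)
print(f"Price multiplier: {landed_price / import_price:.2f}x")   # 2.25x
# Where China supplies 70% or more of American imports of a product, buyers have
# little scope to switch suppliers, so most of the increase reaches consumers.
```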
Even before inflation rises, uncertainty has spiked. And that can be equally damaging to investment and spending. A daily index of trade-policy uncertainty, calculated by Dario Caldara of the Federal Reserve and others, has been over twice as high as its previous record, reached during Mr Trump’s first trade war. The president’s supporters point out that tariffs have been a consistent preoccupation of his since the 1980s. But he seems to pursue uncertainty with equal conviction. He is a mercantilist, yes, but a mercurialist above all.