We’re all used to Elon Musk’s big dreams for his companies. Tesla is working to move the world to sustainable energy and to deliver real-world artificial intelligence in the form of humanoid robots and driverless vehicles. SpaceX is attempting to make humanity multi-planetary by reaching Mars, and Neuralink aims to revolutionise healthcare through a generalised mind interface.
Musk’s latest posts are on another level. He says he has been thinking about the fastest way to bring one terawatt of computing online.
To get a sense of the size of this proposal, current global compute capacity is likely somewhere between 1 and 10 zettaFLOPS (10²¹ and 10²² FLOPS). This is dominated by the data centers, cloud services, and AI infrastructure located in the U.S., China, and Europe.
If you assume 10¹¹ to 10¹² FLOPS/W, then a terawatt of computing translates into 10²³ to 10²⁴ FLOPS (100 zettaFLOPS to 1 yottaFLOPS). This is 10 to 1,000 times the estimated global computing capacity of 1 to 10 zettaFLOPS (10²¹ to 10²² FLOPS).
It would require a massive expansion of power infrastructure: roughly 20 times the power drawn by today’s data centers, which themselves already consume on the order of 1% of global electricity. It is a great thought exercise for today, but not a practical reality.
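If you want to sanity-check those figures, here is a quick back-of-the-envelope sketch in Python. The efficiency range and the global-capacity estimate are the rough assumptions quoted above, not hard data:

```python
# Back-of-the-envelope scale check using the figures quoted above.
POWER_W = 1e12                       # 1 TW of compute power
EFF_LOW, EFF_HIGH = 1e11, 1e12       # assumed efficiency range, FLOPS per watt

flops_low = POWER_W * EFF_LOW        # 1e23 FLOPS (100 zettaFLOPS)
flops_high = POWER_W * EFF_HIGH      # 1e24 FLOPS (1 yottaFLOPS)

# Rough estimate of today's total global compute capacity.
GLOBAL_LOW, GLOBAL_HIGH = 1e21, 1e22

print(f"1 TW of compute: {flops_low:.0e} to {flops_high:.0e} FLOPS")
print(f"Multiple of global capacity: {flops_low / GLOBAL_HIGH:.0f}x to {flops_high / GLOBAL_LOW:.0f}x")
```

Running it gives the 10x to 1,000x multiple of today’s estimated global capacity mentioned above.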
Musk offered an estimate suggesting the power required would be roughly equivalent to all of the electrical power America produces today. A check with Grok’s help suggests that this estimate may even be a little low.
One terawatt of compute requires 1 TW of power and delivers 10²³ to 10²⁴ FLOPS (100 zettaFLOPS up to 1 yottaFLOPS). The U.S. is projected to generate around 4,200 TWh in 2025, which is equivalent to an average power output of 0.48 TW.
So 1 TW is about 2.1 times the average U.S. electrical power output, and 77% of its installed capacity (1.3 TW).
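The conversion from annual generation to average power is simple enough to script. A small sketch using the 4,200 TWh and 1.3 TW figures above:

```python
# Convert annual U.S. generation into average power and compare with the 1 TW target.
US_GENERATION_TWH_PER_YEAR = 4200   # projected 2025 U.S. generation (figure from above)
HOURS_PER_YEAR = 8760
INSTALLED_CAPACITY_TW = 1.3         # rough U.S. installed generating capacity
TARGET_TW = 1.0

avg_output_tw = US_GENERATION_TWH_PER_YEAR / HOURS_PER_YEAR   # TWh/year over h/year gives TW

print(f"Average U.S. output: {avg_output_tw:.2f} TW")
print(f"1 TW is {TARGET_TW / avg_output_tw:.1f}x the average output")
print(f"... and {TARGET_TW / INSTALLED_CAPACITY_TW:.0%} of installed capacity")
```

That reproduces the 0.48 TW, 2.1x, and 77% numbers quoted above.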
I’ve been thinking about the fastest possible way to bring a Terawatt of computing online.
This is roughly equivalent to the electrical power produced by America today.
— gorklon rust (@elonmusk) May 12, 2025
Musk’s pinned post of May 12, 2025, is a response to a Jesse Peltan post that appears to have triggered this line of thinking. Musk says that the world will need a lot more solar power to achieve this.
Energy harnessed could increase by a billionfold with space solar power if we reach Kardashev II. Another billionfold would be possible if our galaxy’s energy is harnessed.
The Kardashev scale, introduced by Nikolai Kardashev, measures a civilisation’s technological progress by its energy consumption: Type I harnesses planetary energy (4×10¹² W), Type II stellar energy (4×10²⁶ W), and Type III galactic energy (4×10³⁷ W).
This post envisions humanity moving from Type I towards Type III, increasing energy consumption through two successive billionfold jumps using solar and space-based technology.
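Taking the canonical Kardashev figures literally, the jumps between types are, if anything, even larger than a “billionfold”. A rough order-of-magnitude sketch using the numbers above:

```python
# Order-of-magnitude jumps between Kardashev types, using the canonical figures above.
TYPE_I_W = 4e12      # planetary-scale energy use
TYPE_II_W = 4e26     # stellar-scale (roughly one star's output)
TYPE_III_W = 4e37    # galactic-scale

print(f"Type I  -> Type II : ~{TYPE_II_W / TYPE_I_W:.0e}x")
print(f"Type II -> Type III: ~{TYPE_III_W / TYPE_II_W:.0e}x")
print(f"Type I  -> Type III: ~{TYPE_III_W / TYPE_I_W:.0e}x")
```

Musk’s “billionfold” is shorthand; the canonical scale puts the full Type I to Type III jump at around 10²⁵ times.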
Then energy harnessed will increase perhaps a billionfold if we make it to Kardashev II, with space solar power, and another… https://t.co/0cHovopB9l
— gorklon rust (@elonmusk) May 12, 2025
Musk provides another reference point to help our tiny brains get around this ridiculously large figure, suggesting that the energy produced by around 10 Starships would be similar to the energy level required to deliver 1 terawatt of computational power.
We haven’t even asked what you would do with all that compute power. AI is a hungry beast and, so far, its ability to improve with more compute shows no sign of running out under the scaling laws; the sophistication and capability of your AI is ultimately limited by your budget (and your power supply). Imagine a network of large-scale data centers strategically located around the world, perhaps in areas with abundant renewable energy such as solar or geothermal. What would it cost you? Let’s imagine for a moment what running 1 TW of compute would cost.
Operating 1 terawatt of compute (10²³ to 10²⁴ FLOPS) would cost between $7.4 trillion and $12.9 trillion annually, with a midpoint of roughly $10 trillion/year. Electricity alone: $700 billion to $1.14 trillion/year (about $911 billion at $0.08/kWh, assuming a PUE of 1.3).
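The electricity figure is easy to reproduce. A minimal sketch, assuming the same 1.3 PUE and $0.08/kWh (both assumptions; real industrial rates vary widely):

```python
# Annual electricity bill for 1 TW of IT load under the stated assumptions.
IT_LOAD_W = 1e12            # 1 TW of compute
PUE = 1.3                   # assumed power usage effectiveness
PRICE_PER_KWH = 0.08        # assumed electricity price, USD/kWh
HOURS_PER_YEAR = 8760

annual_kwh = IT_LOAD_W * PUE / 1000 * HOURS_PER_YEAR   # watts -> kW -> kWh over a year
annual_cost = annual_kwh * PRICE_PER_KWH

print(f"Energy drawn: {annual_kwh:.2e} kWh/year")
print(f"Electricity bill: ~${annual_cost / 1e9:.0f} billion/year")
```

That lands on roughly $911 billion per year for electricity alone.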
Let’s assume the compute is GPU-based (e.g. NVIDIA H100s in 2025 or equivalent). An H100 costs about $30,000 and consumes 700 W (2023 pricing; assume roughly $25,000 in 2025 due to market scaling).
To reach 1 TW, you would need around 1.43 billion graphics cards (10¹² W / 700 W). That is roughly a 1,000x increase over Nvidia’s largest orders to date, which are measured in the hundreds of thousands of units. It is fun to think about but not practical.
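The GPU arithmetic, assuming 700 W per card and a notional $25,000 price per unit (the hardware total below isn’t from Musk’s post, it’s just count multiplied by that assumed price):

```python
# GPU count needed to soak up 1 TW, assuming H100-class cards at 700 W each.
IT_LOAD_W = 1e12
GPU_POWER_W = 700
GPU_PRICE_USD = 25_000      # assumed 2025 price per card (assumption, not a quoted figure)

gpu_count = IT_LOAD_W / GPU_POWER_W
hardware_cost = gpu_count * GPU_PRICE_USD

print(f"GPUs required: {gpu_count:.2e} (~{gpu_count / 1e9:.2f} billion cards)")
print(f"Hardware alone: ~${hardware_cost / 1e12:.0f} trillion")
```

Even before electricity, the cards themselves would run to tens of trillions of dollars under these assumptions.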
Sometimes it’s fun to imagine.