
Published in AI & Technology

Building Data Centres in Space: This Smells Like a Bubble

By Iqbal Abdullah
Founder and CEO of LaLoka Labs
Blue Origin has filed for 51,600 satellites. SpaceX wants a million. The AI power crunch is pushing data centres into orbit. From where I sit, running an AI company in Tokyo, this has the classic shape of a bubble. Small businesses should be looking at the ground, not the sky.

Space Was on the Morning News

On the morning of March 31st, I was watching NHK's morning news programme when a feature came on about US companies racing to build data centres in space.

My first reaction was "that is interesting." My second was "I know this smell."

What Is Happening

Let me lay out the facts.

  1. Blue Origin (Jeff Bezos's space company) filed with the US Federal Communications Commission (FCC) this month to deploy up to 51,600 satellites. The system would use solar power for computation and beam data back to the ground.

  2. SpaceX filed plans in January with the ITU (International Telecommunication Union) for up to one million satellites. They already have over 6,500 Starlink satellites in orbit.

  3. A semiconductor-backed startup called Lumen Orbit launched a test satellite carrying NVIDIA GPUs and demonstrated AI inference processing in space (TechCrunch). Google has also published research on space-based computing.

The backdrop is straightforward. The explosion in AI demand is driving a surge in data centre power consumption. According to the IEA's Electricity 2024 report, global data centre electricity consumption is projected to more than double from roughly 460 TWh in 2022 to over 1,000 TWh by 2030. The US already operates over 5,000 data centres. Japan has about 200. The scale difference is staggering.

To put 1,000 TWh in perspective: the entire Tokyo metropolitan area consumes about 65 TWh per year, so 1,000 TWh is roughly 15 years' worth of Tokyo's electricity. At a typical household's few megawatt-hours per year, it is enough to power hundreds of millions of homes.
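The scale comparisons above are simple arithmetic, and worth sanity-checking. A minimal sketch using the figures quoted in the text (the IEA projection and the approximate Tokyo consumption):

```python
# Sanity-check the scale comparisons using figures from the article.
projected_twh_2030 = 1_000   # IEA projection: global data centre use by 2030, TWh/year
consumption_twh_2022 = 460   # IEA estimate for 2022, TWh/year
tokyo_twh = 65               # approx. annual consumption, Tokyo metropolitan area

growth_factor = projected_twh_2030 / consumption_twh_2022
tokyo_years = projected_twh_2030 / tokyo_twh

print(f"Growth 2022 -> 2030: {growth_factor:.1f}x")          # more than double
print(f"Years of Tokyo's electricity: {tokyo_years:.0f}")    # roughly 15
```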

Even in the US, community opposition is growing. Residents are pushing back against noise, rising electricity costs and water usage, and states like Virginia and Georgia have seen local governments impose moratoriums on new data centre construction (Washington Post). Companies outside the US, including in China, are also exploring space-based data centres. The AI race is expanding to orbit. Literally.

The pitch from companies is simple: in space, solar power is available 24 hours a day, there is no land to fight over, and no neighbours to complain.

What This Smell Is

I am not going to sit here and claim AI does not deliver business value. I run an AI-powered competitive analysis and content generation service. Of course it does. I have written about the importance of data sovereignty and infrastructure investment in the context of Japan's AI strategy.

But space data centres have the classic pattern of a bubble. In 2000, during the dot-com boom, Global Crossing poured billions of dollars into laying undersea fibre optic cables, convinced that demand would catch up. It did not. The company went bankrupt. The structure is the same: massive infrastructure investment based on the conviction that demand will inevitably follow, and then it does not follow fast enough.

  1. Unproven technology, massive capital expenditure. The cost of launching and operating 51,600 satellites is astronomical (literally). Large-scale space-based computing has not been proven to work. And as DeepSeek demonstrated, you can build high-performance AI models without pouring billions into infrastructure. NVIDIA's stock dropped 17% when the market realised that.

  2. Physics is a wall. Even LEO (low Earth orbit) satellites add 20 to 40 milliseconds of latency compared to ground-based data centres. That rules out many latency-sensitive, real-time AI inference workloads. Space has no air, so heat dissipation relies entirely on radiation, which limits computing density. These are not engineering problems waiting for a clever solution. They are physical constraints.

  3. Space debris as an externalised cost. According to the ESA Space Debris Office, there are over 36,000 tracked debris objects in orbit and an estimated 130 million fragments larger than 1 mm. A future with over 100,000 satellites makes the Kessler syndrome a very real concern. None of the companies filing these applications are pricing this environmental cost into their business plans.
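The latency point can be grounded in a back-of-envelope calculation. Below is a minimal sketch; the 550 km altitude is an assumption (typical of Starlink's shells), not a figure from the article, and it gives only the speed-of-light floor, straight up and down:

```python
# Back-of-envelope: the speed-of-light floor on LEO round-trip latency.
# Assumes a ~550 km orbital altitude (typical Starlink shell) and a
# straight vertical path, i.e. the absolute best case.
C_KM_PER_S = 299_792.458  # speed of light in vacuum

altitude_km = 550
one_way_ms = altitude_km / C_KM_PER_S * 1_000
round_trip_ms = 2 * one_way_ms

print(f"Physical floor (vertical round trip): {round_trip_ms:.1f} ms")
```

The floor comes out under 4 ms, yet measured LEO latencies sit in the tens of milliseconds, because real paths add slant ranges, inter-satellite hops, and terrestrial routing. The gap between the physical minimum and the practical figure is exactly why latency is a structural constraint, not a tuning problem.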

I previously wrote about Japan's AI potential and quoted Nvidia's CEO advising countries to "own your data, build your own AI factories." Of course Nvidia would say that. They are the one company winning big from this AI boom. But even so, your AI factory does not need to be in space.

Small Businesses Should Be Looking at the Ground

The disconnect between billions of dollars spent on orbital satellites and the reality of small businesses using AI tools that cost USD 200 to 500 a month is enormous. Not many people are even using AI tools daily yet. We have not finished the job on the ground, and we are already reaching for space.

As I noted in my write-up of the McKinsey report, 78% of companies that have adopted AI are already seeing results. Those results did not fall from the sky. They came from practical, grounded use of specific tools.

What matters for small businesses right now is not getting distracted by the infrastructure arms race.

  1. Use the tools within reach. Generative AI for content creation, competitive analysis, workflow automation. Tools that cost a few hundred dollars a month exist today. With a vibe coding approach, you can automate business processes without needing to write code yourself.

  2. Prepare for a correction. Between 2026 and 2027, large-scale infrastructure bets are the most likely to unravel first. When capital tightens, space data centre investments will be the first to stall. Meanwhile, the practical AI tools market will likely strengthen, because it is built on actual demand.

  3. Get your own data in order. Japan's National Institute of Informatics (NII) built llm-jp-3-172b-instruct3 in the open, with over 1,900 researchers collaborating through the GENIAC project. Japan's advantage is not giant infrastructure. It is high-quality data and community. As the evolution of local LLMs shows, you do not need space. You can run practical AI processing on your own PC.

Building data centres in space is technically fascinating. But assessing whether it makes business sense is a different question entirely.

I am on the ground. And there is no shortage of problems to solve down here.
