How Many More Data Centers Do We Need?

Welcome to the Real Estate Espresso Podcast, your morning shot of what’s new in the world of real estate investing. I’m your host, Victor Menasce.

Today, we’re looking at the evolution of computing, and there are some lessons that, quite frankly, have been forgotten, at least in my view.

When I first started my training in electrical engineering, I was using a large mainframe computer at the university. Most of the students had to book time in one of the terminal rooms to gain access to that mainframe. They were limited to 60 minutes at a time, in 30-minute chunks.

So I purchased my own terminal, which I had in my bedroom at home, and I got a second phone line, and it gave me unlimited access to the computer. I also purchased one of the first personal computers. It had a tiny screen, and it broke that link that tied me to the university mainframe.

Fast-forward to today, and we all know about the distributed nature of computing that has pushed the majority of processing literally into the palm of your hand. That doesn’t mean that data centers have disappeared. Today, those traditional data centers provide a lot of the background that cannot be practically processed on a personal device.

Now, if we look at the state of artificial intelligence today, the vast majority of computing is in fact being done in the data center.

I believe the Apple strategy is worth further examination, because if Apple is correct, the demand for data centers is not going to be eliminated, but it will be reduced. Apple is betting that most user-facing AI should originate on the device, with the cloud as overflow for harder tasks. Apple is embedding AI acceleration in all of its new hardware, even if the software can't take full advantage of it today. That's a smart strategy for Apple specifically.

It doesn't imply that the AI data center becomes unimportant. It implies that the stack bifurcates: devices at the edge for cheap, private, frequent inference, and the cloud for heavy, shared, or continuously improving intelligence. Apple has invested less in data centers because its economic objective is not necessarily to win the cloud layer outright. It's to make the device the default place where AI happens, and to use the data center only when the device runs out of capacity.

Demand for new data centers may not rise in a straight line forever. In fact, one of the more practical ways to reduce demand is to move some of that intelligence back to the edge, right to the consumer device itself.

Today, we're training people to believe that every AI query has to travel halfway around the world to a hyperscale server farm to get processed in a giant warehouse of GPUs and then come back with an answer. That model is convenient for the platform owner, but it's not necessarily efficient for society or for the end user. It concentrates cost, power consumption, latency, and infrastructure risk into a handful of very large facilities.

So if every new phone, laptop, tablet, pair of glasses, appliance, vehicle, and security device ships with a capable neural engine, a meaningful percentage of inference can happen locally.

I mean, think about it. Even today, I buy two different types of security cameras: those that include AI and those that don't. On the AI-enabled cameras, recognition of features within an image is done on the camera itself. The camera determines whether the image contains a car, a dog, or a person. That's not going to the video server. And for sure, that image is not being sent to a data center in Louisiana to be analyzed.

That means the simplest, most frequent tasks like voice recognition, translation, image enhancement, scheduling, personal automation, any of these routine tasks can happen on the device without a round trip into the cloud.
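The split described above amounts to a simple routing rule. Here's a minimal sketch of edge-first routing with cloud overflow; the task names, token limit, and function are all hypothetical illustrations, not any vendor's actual API:

```python
# Edge-first inference routing: routine tasks stay on the device,
# and only heavy or unfamiliar work overflows to the data center.
# All names and thresholds here are illustrative assumptions.

ROUTINE_TASKS = {"voice_recognition", "translation",
                 "image_enhancement", "scheduling"}

def route_request(task: str, tokens: int, edge_token_limit: int = 2048) -> str:
    """Decide where an inference request should run."""
    if task in ROUTINE_TASKS and tokens <= edge_token_limit:
        return "edge"   # handled on-device, no round trip to the cloud
    return "cloud"      # overflow to the data center

print(route_request("translation", 300))       # edge
print(route_request("model_training", 10**6))  # cloud
```

The design choice mirrors the argument in this episode: the default is local, and the cloud is the exception rather than the rule.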

So this isn't going to eliminate data centers. No question, they're still going to be essential for model training, for large-scale coordination, for enterprise workloads, and for storage. But not every problem requires a supercomputer. We should stop using a power plant to light a candle.

From an infrastructure standpoint, that matters. Data centers consume enormous capital, huge quantities of electricity, water for cooling in many cases, and they require utility upgrades with long lead times. If edge devices could absorb 20, 30, or 50% of routine AI processing, the pressure to build new data centers would decrease materially.
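The back-of-envelope math behind that claim is easy to run. This sketch assumes, purely for illustration, that 60% of data-center AI load is routine inference that could move to the edge; the figures are examples, not data from the episode:

```python
# Back-of-envelope estimate: how much data-center AI load remains
# after edge devices absorb a share of the routine work.
# The 60% routine share below is an illustrative assumption.

def remaining_dc_load(routine_share: float, edge_absorption: float) -> float:
    """Fraction of today's data-center AI load left after edge offloading."""
    return 1.0 - routine_share * edge_absorption

for absorb in (0.20, 0.30, 0.50):
    remaining = remaining_dc_load(0.60, absorb)
    print(f"edge absorbs {absorb:.0%} of routine work -> "
          f"{remaining:.0%} of load remains")
```

Even at the low end of that range, the reduction compounds across every facility that no longer needs to be built.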

A better design principle is distributed intelligence: put enough compute where the data is created. That reduces latency, it reduces bandwidth demand, it improves privacy, which is super important, and it creates a more resilient system overall. As with real estate, good system design is about placing the right capacity in the right location, not just building more because demand appears to be growing.

These workloads can segment, just like workloads amongst humans can segment. There are some tasks that you would absolutely entrust to a capable intern that don't require the most experienced people in your organization. Sure, the company's guru can do it, but that might not be the most appropriate use of their time. If the guru is the constraint, you could go out and try to hire more gurus, or you could find a way to segment the workload to make the guru less of a bottleneck in the organization.

This is really the same thing. That’s how we should be thinking about data centers. These are the gurus. They will aggregate the collective wisdom in many specific domains. They will be the best pilots, the best race car drivers. They will be the legal scholars, the medical specialists. The deepest training will happen using the supercomputer, but the routine inference tasks can be done locally. And that’s going to reduce the demand on the gurus dramatically.

So when people tell you that we need a thousand new data centers in the United States, you might question the validity of that thesis.

As you think about that, have an awesome rest of your day. Go make some great things happen, and we’ll talk to you again tomorrow.

Stay connected and discover more about my work in real estate by visiting and following me on various platforms:

Real Estate Espresso Podcast:

Y Street Capital: