Competition Without Knowing What Winning Is


Eli Moshe


When I worked at NetApp, the company had already grown into a major player. At that point, the market referred to it as a “two-horse race” — EMC and NetApp. The competition was clear: market share in the data storage space. The goal was well defined — to dominate the world’s storage infrastructure. Later, the landscape shifted. New companies emerged with diverse topologies like software-defined storage, which allowed customers to install storage software on any server and create scalable grids. Suddenly, the competition wasn’t so obvious. With 7–9 serious contenders in the market, customers were choosing topologies before they chose vendors — software-based, server-based, SAN, NAS, etc.

After several sessions this week at Datacloud Global Congress 2025, hearing from cloud providers, chip makers, large data center operators, and system manufacturers, one thing is certain: We are on the verge of massive change, driven primarily by AI. Some competitions are clearly defined:

• Microchip companies racing to produce faster, more power-efficient GPUs that take up less space and cost less.

• Hardware vendors developing cooling systems for racks expected to reach 2MW each.

• Energy innovators pushing to power 5G-enabled server farms.

But when it comes to AI and the cloud, what exactly is the competition?

• They are all investing heavily in AI, offering a wide range of services, products, and platforms to support the development and management of AI.

• They all provide powerful AI infrastructure, such as GPUs and storage.

• They all offer integration with foundation models and LLMs.

BUT WHAT IS THE COMPETITION ABOUT, AND WHAT CAN BE CALLED A WIN?

Is there a direct rivalry between cloud providers in the AI space? Is there meaningful competition between cloud providers and data center operators? Does it even exist?

On one hand, cloud providers are the biggest customers of data center companies. We count on them to keep outsourcing rather than building their own infrastructure, so they can focus on their own (somewhat undefined) race. On the other hand, data center companies are now offering GPU-as-a-Service to meet growing demand — as many enterprises move away from costly cloud-based GPU usage back to dedicated data centers or even on-prem infrastructure. So far, cloud providers haven’t clearly defined what they’re competing on in the AI era, nor have they shared any coherent strategies. They seem to agree on just one thing: this is a three-horse race, and there will be only one long-term winner. For us — the data center industry — things are more straightforward:

• The demand will increase.

• The technological requirements are clear.

• The specific solutions each customer will need? Not yet clear — because the cloud providers and enterprise AI consumers haven’t yet defined their strategies.

That’s why we keep building — with maximum flexibility to adapt to whatever unfolds.

And one thing is very clear to us: We are taking the biggest risk. We’re building data centers at $30–40 million per megawatt purely based on the belief that demand will arrive.