Arista trumpeted its role in this project for a year. Good thing business is otherwise solid
Big Tech's plans to spend hundreds of billions of dollars on infrastructure during 2025 are often taken as evidence of near-endless demand for artificial intelligence, but networking vendor Arista has just revealed that a large AI build has "stalled" for lack of funds and hardware.
This time last year, Arista revealed it had won five "large" deals to build networks underpinning AI infrastructure, four of them with its Ethernet products and one using InfiniBand. In May, CEO Jayshree Ullal told investors the four Ethernet deals were moving from trials to pilots, adding: "We expect production in the range of 10,000 to 100,000 GPUs in 2025."
In July the supplier offered a little more detail about the four Ethernet deals, which it described as involving "a large tier-two cloud provider," "a large data center customer," and "a large automotive manufacturer."
In November, the biz said three of the five big deals were "going well." Of the two others, one was "starting," and the other was "moving slower than we expected."
Of that fifth one, "they may get back on their feet," the chief exec said at the time. "In 2025, they're awaiting new GPUs, and they've got some challenges on power cooling, etc. We're a little bit stalled, which may be why we're being careful about predicting how they'll do. They may step in nicely in the second half of 2025."
It sounds like that customer didn't get the GPUs it needed.
On Tuesday this week, during Arista's earnings call with Wall Street to discuss its fourth quarter of 2024, Ullal revealed that the troubled customer, whom she described as "not a cloud titan," remains "a little bit stalled."
"They are awaiting new GPUs and some funding, too, I think. I hope they'll come back next year. But for this year, we won't talk about them."
The identity of the stalled AI build isn't known, but Ullal had previously said two of the five big wins "could be classified as specialty providers." So maybe the mystery customer operates in a niche that somehow exempts it from investor ardor for AI.
Or maybe it just couldn't get to the head of the queue to acquire (say) Nvidia accelerators, and its backers took their cash off the table until GPUs become available.
Whatever happened, this is a rare example of an AI bust amid the current boom. We await signs of a wider malaise.
Arista's financial results for Q4 were otherwise upbeat.
Our sibling publication The Next Platform has covered them in detail here, but the headline figures were quarterly revenue growth of 6.6 percent to $1.93 billion, up 25.3 percent year-on-year from the fourth quarter of 2023; annual revenue growth of 19.5 percent to reach $7 billion; and full-year net income rising 37 percent to $2.85 billion.
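For readers who want to sanity-check the growth math, here's a quick sketch of the prior-period revenues those percentages imply. The implied figures are approximations derived from the numbers above, not values Arista reported:

```python
# Back-of-envelope check of the growth figures in Arista's Q4 2024 results.
# Prior-period values are implied by dividing out the stated growth rates;
# they are approximations, not reported numbers.
q4_2024_revenue = 1.93   # $ billions
qoq_growth = 0.066       # sequential growth vs Q3 2024
yoy_growth = 0.253       # growth vs Q4 2023
fy_2024_revenue = 7.0    # $ billions
fy_growth = 0.195        # growth vs full-year 2023

implied_q3_2024 = q4_2024_revenue / (1 + qoq_growth)   # ≈ $1.81B
implied_q4_2023 = q4_2024_revenue / (1 + yoy_growth)   # ≈ $1.54B
implied_fy_2023 = fy_2024_revenue / (1 + fy_growth)    # ≈ $5.86B

print(f"Implied Q3 2024 revenue: ${implied_q3_2024:.2f}B")
print(f"Implied Q4 2023 revenue: ${implied_q4_2023:.2f}B")
print(f"Implied FY 2023 revenue: ${implied_fy_2023:.2f}B")
```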
Microsoft and Meta now account for 20 percent and 14.6 percent of revenue, respectively, but are just two of over 10,000 customers who have installed a cumulative 100 million ports across Arista's life, we're told. "A lot" of customers are installing 400 and 800 gigabit Ethernet, a welcome shift to newer tech.
Senior veep and chief platform officer John McCool said the company has already worked to reduce its dependence on China, meaning tariffs shouldn't be a problem.
Ullal was pleased with recent AI-related revenue and offered her view that networkers like Arista will soon have more opportunities in the field.
"If you look at how we have classically approached GPUs and collective libraries ... we've largely looked at it as two separate building blocks. There's the vendor who provides the GPU, and then there's us who provides the scale-out networking," she said.
"But when you look at Stargate and projects like this, I think you'll start to see more of a vertical rack integration, where the processor, the scale-up, the scale-out, and all of the software to provide a single point of control and visibility starts to come more and more together."
The CEO doesn't think that will happen this year. "But definitely, in '26 and '27, you're going to see a new class of AI accelerators for a new class of training and inference, which is extremely different than the current, more pluggable Lego type of version," she said. That's good for Arista because those new accelerators will create more traffic, which means a need for 1.6T Ethernet.