Lambda on the hunt for 'another $800M' to fuel its GPU cloud

Why sell shovels when you can rent them


In the AI gold rush, if you can't be the one selling the GPUs then the next best thing could be to rent them out. This week, we learned that Lambda is seeking $800 million in funding to do just that.

Founded in 2012, San Jose-based Lambda is no stranger to accelerated computing, having got its start building systems specifically for machine-learning R&D. It later expanded into colocation services before launching a GPU cloud service in 2018.

According to a Financial Times report, citing people familiar with the matter, term sheets for the impending funding round are expected by mid-July and JPMorgan is helping to coordinate the affair. The capital would be used to purchase additional Nvidia GPUs and associated network infrastructure and software, and to hire staff.

Leasing large quantities of GPUs, particularly to customers training custom models, has become a lucrative business in recent years. As our sibling site The Next Platform recently determined, a cluster of 16,000 H100s costing $1.5 billion to purchase, deploy, and network would generate roughly $5.27 billion in revenue over the course of four years.
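
As a back-of-the-envelope check on those numbers: renting out all 16,000 H100s around the clock for four years works out to roughly 561 million GPU-hours, so $5.27 billion implies a blended rate of about $9.40 per GPU-hour. The sketch below simply back-solves that from the article's own figures; the 100 percent utilization assumption, and the omission of power, staff, and financing costs, is ours for illustration rather than The Next Platform's actual model.

    # Back-of-the-envelope check of The Next Platform's figures quoted above.
    # Only inputs from the article are used; full utilization is an assumption
    # made purely to back out an implied hourly rental rate.

    GPUS = 16_000             # H100 cluster size from the article
    CAPEX = 1.5e9             # purchase, deployment, and networking cost, USD
    REVENUE = 5.27e9          # projected four-year rental revenue, USD
    YEARS = 4
    HOURS = YEARS * 365 * 24  # total hours in the rental period

    gpu_hours = GPUS * HOURS
    implied_rate = REVENUE / gpu_hours          # assumes every GPU is rented every hour
    gross_margin = (REVENUE - CAPEX) / REVENUE  # ignores operating and financing costs

    print(f"GPU-hours over {YEARS} years: {gpu_hours:,.0f}")
    print(f"Implied rate at full utilization: ${implied_rate:.2f}/GPU-hour")
    print(f"Revenue multiple on capex: {REVENUE / CAPEX:.1f}x "
          f"(gross margin ~{gross_margin:.0%}, before operating costs)")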

Of course, to play this game you still need a lot of capital to buy the GPUs in the first place — something that, so far, Lambda has had no shortage of success doing. Back in April, it secured half a billion dollars in debt financing to purchase "tens of thousands" of Nvidia's fastest accelerators, which served as collateral for the loan.

The debt financing came on top of the $320 million series-C funding round it announced back in February, the majority of which is also going toward Nvidia GPUs. The Register reached out to Lambda for comment regarding the reported funding round.

While $800 million may sound like a lot of capital for an upstart biz, it's far from the biggest figure we've seen in this field lately. Many AI outfits have seen their valuations skyrocket as hype over neural networks reaches new heights.

In May, CoreWeave, another bit barn peddling cut-rate GPU rentals, scored $1.1 billion in a series-C round. That same month, the cloud provider used its enormous collection of GPUs as collateral for a $7.5 billion loan backed by Blackstone, BlackRock, and others.

Meanwhile, similar operations such as Voltage Park and TensorWave have looked to recreate Lambda and CoreWeave's successes. However, it's not just AI infrastructure vendors that have seen their valuations take off.

Back in May, so-called data foundry Scale AI, which provides high-quality datasets used in AI training, saw its valuation touch $14 billion after VC firms and AI heavyweights, such as Nvidia, Amazon, and Meta, injected $1 billion into the firm.

Some feel we're heading into bubble-bursting territory. That's either wishful thinking – because the hype may only just be getting started – or a shallow acknowledgment that all good things eventually come to an end. Nvidia's market cap dipped by some $500 billion last week, and even that proved largely inconsequential: The GPU titan's stock price is holding steady today, up nearly eight percent this past month and more than 150 percent over the past half-year, and its market cap is sitting pretty north of $3 trillion. ®
