
CoreWeave secures $221M investment for its leading GPU-focused cloud compute services

NYC-based startup CoreWeave has raised $221 million in a Series B funding round led by Magnetar Capital, with participation from Nvidia, former GitHub CEO Nat Friedman and former Apple exec Daniel Gross. CoreWeave began as an Ethereum mining venture and has since pivoted to general-purpose cloud computing.

The tranche, which values CoreWeave at $2 billion pre-money and brings the company’s total raised to $371 million, will be used to support CoreWeave’s U.S.-based data center expansion with the opening of two new centers this year, CEO Mike Intrator said. CoreWeave currently operates five in North America.

CoreWeave was founded in 2017 by Intrator, Brian Venturo and Brannin McBee to address what they saw as “a void” in the cloud market. Venturo, a hobbyist Ethereum miner, cheaply acquired GPUs from insolvent cryptocurrency mining farms, choosing Nvidia hardware for the increased memory (hence Nvidia’s investment in CoreWeave, presumably).

Initially, CoreWeave was focused exclusively on cryptocurrency applications. But it pivoted within the last several years to general-purpose computing as well as generative AI technologies, like text-generating AI models.

Fast-forward to today and CoreWeave provides access to over a dozen SKUs of Nvidia GPUs in the cloud, including H100s, A100s, A40s and RTX A6000s, for use cases like AI and machine learning, visual effects and rendering, batch processing and pixel streaming.

“Our clients include generative AI companies, like Tarteel AI and Anlatan, the creators of NovelAI, and we’ve supported a range of open source AI and machine learning projects like EleutherAI and Stability AI’s Stable Diffusion,” Intrator told TechCrunch in an email interview. “We also work with a number of notable VFX and animation studios such as Spire Animation, and partner closely with 3D streaming and ‘metaverse’ companies such as PureWeb.”

It’s tough for any cloud provider to compete with the incumbents in the space — i.e., Google, Amazon and Microsoft. For perspective, AWS made $80.1 billion in revenue last year, while Azure and Google Cloud made $75.3 billion and $26.28 billion, respectively.

Those figures are multiples above CoreWeave’s valuation, obviously, let alone its war chest.

To drive the point home, according to a Statista report from the fourth quarter of 2022, AWS had a 32% market share, Azure had a 23% share and Google Cloud had a 10% share.

That’s not to say it’s impossible for a smaller player to succeed. There are success stories like Paperspace, Scaleway and DigitalOcean (despite its ups and downs), as well as newer entrants like Clever Cloud and Vultr.

CoreWeave would seem to be evidence of this, too. The startup managed to secure funding even coming off of a rough quarter for the cloud infrastructure market. As my colleague Ron Miller wrote, companies looked for ways to cut back on spending in an uncertain economy, slowing the market to 21% growth — a precipitous drop from the 36% growth in the year prior.

“We have over 1,000 customers across our four key verticals — machine learning and AI, batch processing, pixel streaming and visual effects and rendering,” Intrator said.

CoreWeave makes the case that the dominant cloud providers — Google Cloud, Azure and AWS — have failed to meet the demand for generative AI in particular with their “legacy cloud infrastructure.” Them’s fighting words, to be sure, especially as AWS launches a dedicated service for serving text-generating models. But in Intrator’s eyes, the incumbents aren’t set up to meet the demand of thousands of new AI companies clamoring for GPUs — at least not at CoreWeave’s (ostensibly lower) prices.

CoreWeave claims its hardware for inference — i.e., serving AI models — is industry leading, able to “autoscale” within three seconds. It also touts its newer instance products, which include Nvidia’s HGX H100 server platform.

“For a while now, technology decision-makers have faced the increasingly complex — and costly — task of deploying their highly specialized compute tasks supporting modern AI and machine learning applications to more generalized cloud computing providers,” Intrator said. “CoreWeave recognizes this demand will require deep investment in scalable and attainable capacity for the next generation of innovative AI firms.”

Beyond infrastructure, CoreWeave attempts to differentiate itself with offerings like its accelerator program, which launched in late October. (Intrator says it has over 30 members.) The accelerator — which operates on an open-ended basis, with no deadlines — provides companies with compute credits, as well as discounts and hardware resources on the CoreWeave cloud.

Intrator says that the new tranche will lead to more efforts like this.

“With the emergence of CoreWeave and this new investment, it can service more companies with even more customized solutions that can outperform legacy cloud providers,” he added. “While large language models and deep learning image generation technologies have been around for a while, their prominent place in the public eye is driving an intense scramble to secure processing power for ever more powerful applications.”

It’ll also be put toward expanding CoreWeave’s team. The company employs “just over” 115 people now — up 150% in the last 12 months — thanks in part to its acquisition of the cloud rendering platform Conductor Technologies in January, and Intrator says that the plan is to keep hiring “throughout the year.”

The question is, of course, whether CoreWeave can maintain its impressive momentum — particularly if the generative AI bubble bursts anytime soon. For what it’s worth, Friedman and Gross seem convinced by the strategy. They sent this statement via email:

AI is the new electricity, and CoreWeave is building the grid for the new economy. We’ve had the pleasure of working for Apple and Microsoft; investing in breakout companies like Stripe, Figma, and Airtable; and with that, we can confidently say that the tempo and pace that CoreWeave moves at is unprecedented. Every day is a sprint for victory, and it shows in the quality and quantity of their customers. AI inference demand is about to explode, and CoreWeave has spent years preparing the infrastructure and culture to scale for this moment.

There’s some reason for optimism. According to a recent survey by ESG, 59% of companies plan to spend more on public cloud apps in 2023 while 56% expect that their public cloud infrastructure services spending will increase.

Source: TechCrunch
