Cloudflare R2 Storage, the distributed object storage service that eliminates egress fees, provides essential infrastructure for leading generative artificial intelligence (AI) companies. These partnerships help AI infrastructure companies avoid vendor lock-in and make training generative AI models more accessible and affordable.
Generative AI relies on powerful GPUs for efficient parallel processing. Cloudflare Workers AI lets users deploy machine learning models on serverless GPUs within Cloudflare’s global network. As generative AI startups face GPU capacity shortages at individual cloud providers, Cloudflare introduces flexibility by offering R2 as an object store for training data. This allows customers to choose the best GPU cloud provider based on capacity, cost, and performance, without paying data transfer fees to move their training data.
"At CoreWeave, we build infrastructure for compute-intensive use cases, empowering businesses to transform how we engage with technology through Generative AI and LLMs. This specialization unlocks access to the scale and variety of GPUs that our clients require, on an infrastructure purpose-built for performance, agility, and efficiency that AI workloads rely on. By partnering with Cloudflare’s R2 Storage, we're able to further alleviate data lock-in driven by ballooning egress fees on the hyperscalers and empower multi-cloud for businesses that benefit from it in a meaningful way."
Max Hjelm
Vice President, Sales, CoreWeave