
RunPod
Founded Year: 2022
Stage: Seed VC | Alive
Total Raised: $20.25M
Last Raised: $20M | 2 yrs ago
Mosaic Score: -63 points in the past 30 days (The Mosaic Score is an algorithm that measures the overall financial health and market potential of private companies.)
About RunPod
RunPod provides cloud-based graphics processing unit (GPU) computing services in the artificial intelligence (AI) sector. The company offers GPU instances, serverless deployment for AI workloads, and infrastructure for training and deploying AI models. RunPod's services include inference, AI model training, and processing compute-heavy tasks. It was founded in 2022 and is based in Moorestown, New Jersey.
RunPod's Products & Differentiators
Serverless
Event-driven GPU compute that spins up in seconds, runs your container or model once, then disappears. No nodes to babysit, no over-provisioning.
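To make that workflow concrete, here is a minimal sketch of what such a serverless worker can look like with RunPod's Python SDK. The handler name, payload fields, and the placeholder "model" logic are illustrative assumptions, not RunPod's documented example.

```python
"""Minimal sketch of a RunPod serverless worker, assuming the `runpod` Python SDK.

The input fields and the fake model call below are placeholders for illustration.
"""
import runpod


def handler(event):
    # RunPod passes the request payload to the handler under event["input"].
    prompt = event["input"].get("prompt", "")

    # Placeholder for real work (e.g., running inference on a model loaded at startup).
    result = prompt.upper()

    # Whatever the handler returns is serialized back to the caller.
    return {"output": result}


# Register the handler. Workers spin up on demand and shut down when the
# request queue drains, which is the "spins up, runs, then disappears"
# behavior described above.
runpod.serverless.start({"handler": handler})
```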
Expert Collections containing RunPod
Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.
RunPod is included in 1 Expert Collection: Artificial Intelligence (AI).
Artificial Intelligence (AI)
20,894 items
Latest RunPod News
Oct 16, 2025
Runpod Launches Public Endpoints for AI Model Access
By Spencer Hulse | Published on October 15, 2025

Runpod is a cloud platform specializing in GPU-based infrastructure for artificial intelligence applications. The company provides on-demand access to graphics processing units across dozens of global data center regions, and its services have attracted a growing user base by focusing on the needs of AI developers. By mid-2024, Runpod's platform had over 100,000 developers using its GPU instances and serverless endpoints; the user base has since tripled, with Runpod reporting more than 500,000 developers on the platform by mid-2025. The startup is backed by major tech investors, including Intel Capital and Dell Technologies Capital, which co-led a $20 million seed funding round in 2024.

Runpod's main offerings prior to 2025 were GPU Cloud, which lets users spin up GPU-powered virtual machines for training or other compute tasks, and Serverless deployment, which lets developers create scalable API endpoints for their AI models without managing servers. These features are designed to simplify AI development: a developer can launch a container with a chosen model and get a live API for inference in production within minutes. This focus on developer experience, making infrastructure setup so straightforward that users can "set and go," has been central to Runpod's strategy. According to co-founder and CEO Zhen Lu, enabling rapid iteration for developers is what "matters most" in unlocking value from AI projects.

Public Endpoints Launch in 2025

On August 6, 2025, Runpod announced the launch of Public Endpoints, expanding its platform to offer ready-to-use AI models accessible via API. Public Endpoints provide instant access to advanced AI models through simple HTTP requests, with an API playground available for testing through the Runpod Hub. Instead of requiring users to set up their own model servers, Runpod hosts a selection of popular models that developers can invoke on demand.

Public Endpoints span multiple AI domains. ByteDance, the company behind TikTok, partnered with Runpod to showcase its generative video model Seedance 1.0 Pro running on the platform at the launch event. Another ByteDance model, Seedream 3.0, provides advanced image generation and is also included in the endpoint lineup. The service features large language models and speech recognition tools as well: a 70-billion-parameter Llama 2 variant (dubbed Deep Cogito v2) is offered, and OpenAI's Whisper speech-to-text model is among the available endpoints. All of these endpoints are billed on a usage-based model, allowing developers to tap into powerful AI models without provisioning their own servers. Runpod also offered free credits during the launch period to encourage users to experiment with every available endpoint.

Competition and Industry Context

Runpod's rollout of Public Endpoints comes amid a broader trend of specialized AI infrastructure providers emerging to meet the exploding demand for machine learning and generative AI workloads. Industry observers have noted that general-purpose cloud platforms often struggle with the latency and scaling needs of AI, which has led to a new wave of cloud services built specifically for AI tasks. Runpod is one of several startups in this space.

Its competitors have been attracting significant investment: CoreWeave, a New Jersey-based GPU cloud provider, went public in March 2025 at $40 per share and has since grown to roughly $130 per share. Another rival, Lambda Labs, raised $480 million at a $4 billion valuation to expand its AI-focused cloud platform. This influx of capital underscores the surging demand for GPU-as-a-service; one market analysis forecasts that the GPU cloud market will grow from about $3.3 billion in 2023 to $33.9 billion by 2032.

In this competitive landscape, Runpod's strategy has been to iterate quickly on features developers ask for, such as easier model deployment workflows, in order to stand out. The introduction of Public Endpoints in 2025 aligns with that approach, offering a library of ready-made AI models alongside the option for users to deploy their own. The move also shows how independent cloud platforms can partner with AI research labs (in this case, ByteDance's) to bring advanced AI models to a wider user base. For developers and businesses, services like Runpod's Public Endpoints are making advanced machine learning models more accessible as cloud APIs.
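As a rough illustration of the "simple HTTP requests" access pattern described above, the sketch below calls a hosted model endpoint from Python. The endpoint URL, model path, payload fields, and authentication scheme are assumptions for illustration, not RunPod's documented Public Endpoints API.

```python
"""Illustrative sketch: invoking a hosted model endpoint over HTTP.

The URL pattern, model slug, and payload shape are hypothetical; consult
RunPod's Public Endpoints documentation for the actual API contract.
"""
import os

import requests

API_KEY = os.environ["RUNPOD_API_KEY"]  # account API key (assumed auth scheme)
ENDPOINT_URL = "https://api.runpod.ai/v2/whisper/runsync"  # hypothetical endpoint path

# One request per inference call; billing on Public Endpoints is usage-based.
resp = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"audio_url": "https://example.com/sample.wav"}},
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```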
RunPod Frequently Asked Questions (FAQ)
When was RunPod founded?
RunPod was founded in 2022.
Where is RunPod's headquarters?
RunPod's headquarters is located at 1181 Nixon Drive, Moorestown, New Jersey.
What is RunPod's latest funding round?
RunPod's latest funding round is Seed VC.
How much did RunPod raise?
RunPod raised a total of $20.25M.
Who are the investors of RunPod?
Investors of RunPod include Intel Capital, Dell Technologies Capital, Adam Lewis, Nat Friedman, Julien Chaumond and 4 more.
Who are RunPod's competitors?
Competitors of RunPod include InferX, Crusoe, Fal, DataCrunch, Lambda and 7 more.
What products does RunPod offer?
RunPod's products include Serverless and 2 more.
Compare RunPod to Competitors

Aethir is a decentralized cloud computing infrastructure provider that offers access to GPUs for AI model training, fine-tuning, inference, and cloud gaming. The company primarily serves the artificial intelligence and gaming industries with its distributed cloud solutions. It was founded in 2021 and is based in Singapore, Singapore.

Lambda provides graphics processing unit (GPU) cloud computing services for artificial intelligence (AI) training and inference. Its offerings include on-demand and reserved cloud instances with NVIDIA GPU architectures, as well as private cloud solutions and orchestration tools for managing AI workloads. Lambda serves sectors that require high-performance computing for artificial intelligence, including technology firms, research institutions, and enterprises with AI initiatives. Lambda was formerly known as Lambda Labs. It was founded in 2012 and is based in San Jose, California.

SQream focuses on GPU-accelerated data processing within the technology sector, with an emphasis on big data analytics and machine learning. The company provides tools for processing large datasets, including SQL on GPU for complex queries and machine learning model training. It serves sectors such as finance, telecommunications, manufacturing, advertising, retail, and healthcare. It was founded in 2010 and is based in New York, New York.

Crusoe provides artificial intelligence (AI) cloud computing services in the tech industry. Its offerings include infrastructure for AI exploration, large-scale model training, and scalable AI inference. Its Crusoe Cloud platform is available to developers and enterprises looking for AI computing resources. It was founded in 2018 and is based in Denver, Colorado.
Radium provides enterprise artificial intelligence (AI) cloud solutions, focusing on the lifecycle of artificial intelligence development and deployment. Its offerings include a platform for training, fine-tuning, and deploying AI models, with features that support AI computing. The company's services are aimed at sectors that require AI capabilities, such as research institutions and enterprises interested in generative AI technologies. It was founded in 2019 and is based in Toronto, Canada.
The Cloud Minders specializes in scalable GPU cloud computing solutions for the AI and machine learning sectors. It offers supercomputing services designed to accelerate AI training and optimize AI inference, catering to innovators and researchers. The company was founded in 2021 and is based in Atlanta, Georgia.