Founded Year

2019

Stage

Series C | Alive

Total Raised

$429M

Valuation

$2B

Last Raised

$275M | 8 days ago

Mosaic Score
The Mosaic Score is an algorithm that measures the overall financial health and market potential of private companies.

+123 points in the past 30 days

About d-Matrix

d-Matrix is focused on developing computing platforms for datacenter-scale AI inference. The company offers Corsair™, a solution focused on improving performance for AI inference applications. d-Matrix serves sectors that require large-scale processing capabilities. It was founded in 2019 and is based in Santa Clara, California.

Headquarters Location

5201 Great America Parkway Suite 300

Santa Clara, California, 95054

United States

888-244-1173


ESPs containing d-Matrix

The ESP matrix leverages data and analyst insight to identify and rank leading companies in a given technology landscape.

[ESP matrix: Execution Strength × Market Strength, with quadrants Challenger, Highflier, Outperformer, and Leader]
Enterprise Tech / Semiconductors & HPC

The compute-in-memory (CIM) market (also called processor-in-memory, in-compute memory, or computational memory) combines data storage and computational processing within the same memory unit. Instead of the traditional model where data is transferred between separate processing and storage units, CIM performs computations where the data is stored. These solutions significantly reduce data movement…

d-Matrix is named as a Leader among 15 other companies, including IBM, Cerebras, and Hazelcast.
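The data-movement argument behind CIM can be made concrete with a back-of-envelope model. The sketch below compares the energy spent streaming a model's weights out of DRAM against the energy of the multiply-accumulates themselves; all numbers are illustrative assumptions for the sake of the comparison, not d-Matrix or industry figures.

```python
# Toy energy model of why compute-in-memory (CIM) helps: in a conventional
# accelerator, every weight must be moved from memory to the compute units,
# and moving a byte off-chip typically costs far more energy than one
# multiply-accumulate. All constants below are illustrative assumptions.

PARAMS = 70e9              # e.g. a 70B-parameter model
BYTES_PER_PARAM = 1        # assume 8-bit quantized weights
PJ_PER_BYTE_DRAM = 100.0   # assumed DRAM access energy, pJ per byte
PJ_PER_MAC = 1.0           # assumed multiply-accumulate energy, pJ

# Energy to move all weights once vs. energy to compute with them once
movement_j = PARAMS * BYTES_PER_PARAM * PJ_PER_BYTE_DRAM * 1e-12
compute_j = PARAMS * PJ_PER_MAC * 1e-12   # one MAC per weight per token

print(f"energy moving weights once: {movement_j:.2f} J")   # 7.00 J
print(f"energy for the MACs themselves: {compute_j:.2f} J")  # 0.07 J
```

Under these assumed constants, data movement outweighs compute by about 100×, which is the overhead CIM attacks by performing the computation where the weights already reside.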

d-Matrix's Products & Differentiators

    d-Matrix Corsair

    d-Matrix Corsair is an AI inference accelerator platform that runs generative AI inference significantly faster, with better Perf-TCO, than alternative solutions.


Research containing d-Matrix

Get data-driven expert analysis from the CB Insights Intelligence Unit.

CB Insights Intelligence Analysts have mentioned d-Matrix in 4 CB Insights research briefs, most recently on Jan 28, 2025.

Expert Collections containing d-Matrix

Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.

d-Matrix is included in 2 Expert Collections, including Semiconductors, Chips, and Advanced Electronics.

Semiconductors, Chips, and Advanced Electronics

7,494 items

Companies in the semiconductors & HPC space, including integrated device manufacturers (IDMs), fabless firms, semiconductor production equipment manufacturers, electronic design automation (EDA), advanced semiconductor material companies, and more

Artificial Intelligence (AI)

20,630 items

d-Matrix Patents

d-Matrix has filed 12 patents.

The 3 most popular patent topics include:

  • artificial neural networks
  • natural language processing
  • parallel computing

Application Date: 10/24/2023

Grant Date: 4/8/2025

Title:

Related Topics: Natural language processing, Computational linguistics, Artificial neural networks, Instruction processing, Parallel computing

Status: Grant

Latest d-Matrix News

d-Matrix Raises $275 Million to Power the Age of AI Inference

Nov 13, 2025

Series C led by global consortium values company at $2 billion, accelerates product and customer expansion as demand grows for faster, more efficient data center inference

SANTA CLARA, Calif., November 12, 2025-- d-Matrix, the pioneer in generative AI inference compute for data centers, has closed $275 million in Series C funding, valuing the company at $2 billion and bringing the total raised to date to $450 million. The new capital will advance the company's roadmap, accelerate global expansion and support multiple large-scale deployments of the world's highest performing, most efficient data center inference platform for hyperscale, enterprise, and sovereign customers.

The oversubscribed round attracted leading investment firms across Europe, North America, Asia, and the Middle East. The funding is co-led by a global consortium including BullhoundCapital, Triatomic Capital, and Temasek. The round also includes new participation from the Qatar Investment Authority (QIA) and EDBI, alongside follow-on participation from M12, Microsoft's Venture Fund, as well as Nautilus Venture Partners, Industry Ventures, and Mirae Asset.

d-Matrix's full-stack inference platform combines breakthrough compute-memory integration, high-speed networking, and inference-optimized software to deliver 10× faster performance, 3× lower cost, and 3–5× better energy efficiency than GPU-based systems. Solutions powered by d-Matrix's Corsair™ inference accelerators, JetStream™ NICs and Aviator™ software can produce up to 30K tokens per second at 2ms per token on a Llama 70B model. The platform's compute-dense design allows customers to run up to 100B-parameter models incredibly fast in a single rack. This step-change in performance and efficiency directly addresses growing AI sustainability challenges. By enabling one data center to handle the workload of ten, d-Matrix offers a clear path to reducing global data center energy consumption while enabling enterprises to deliver cost-efficient, profitable AI services without compromise.

"From day one, d-Matrix has been uniquely focused on inference. When we started d-Matrix six years ago, training was seen as AI's biggest challenge, but we knew that a new set of challenges would be coming soon," said Sid Sheth, CEO and co-founder of d-Matrix. "We predicted that when trained models needed to run continuously at scale, the infrastructure wouldn't be ready. We've spent the last six years building the solution: a fundamentally new architecture that enables AI to operate everywhere, all the time. This funding validates that vision as the industry enters the Age of AI Inference."

Investor confidence reflects d-Matrix's differentiated technology, rapid customer growth, and expanding network of global partners — including the recently announced d-Matrix SquadRack™ open standards-based reference architecture with Arista, Broadcom, and Supermicro. A strong product roadmap featuring 3D memory-stacking innovations and a customer-centric go-to-market strategy further establishes d-Matrix as a cornerstone of the new AI infrastructure stack.

Investor Voices

"As the AI industry's focus shifts from training to large-scale inference, the winners will be those who anticipated this transition early and built for it," said Per Roman, Founder of BullhoundCapital. "d-Matrix stands out not only for its technical depth but for its clear strategic vision. The team understood before anyone else that inference would define the economics of AI — and they're executing brilliantly on that insight."

"AI inference is becoming the dominant cost in production AI systems, and d-Matrix has cracked the code on delivering both performance and sustainable economics at scale," said Jeff Huber, General Partner at Triatomic Capital. "Their digital in-memory compute architecture is purpose-built for low-latency, high-throughput inference workloads that matter most. With Sid, Sudeep, and their world-class team, plus an exceptional ecosystem of partners, d-Matrix is redefining what's economically possible in AI infrastructure."

"The explosion in AI inference demand shows us that efficiency and scalability can be key contributors to revenue capture and profitability for hyperscalers and AI factories," said Michael Stewart, Managing Partner at M12, Microsoft's Venture Fund. "d-Matrix is the first AI chip startup to address contemporary unit economics in LLM inference for models of a range of sizes that are growing the fastest, with differentiated elements in the in-memory product architecture that will sustain the TCO benefits with leading latency and throughput."

Morgan Stanley served as the exclusive placement agent, and Wilson Sonsini Goodrich & Rosati served as legal counsel to d-Matrix.

Key Facts

  • Founded: 2019 | HQ: Santa Clara, CA
  • Global Offices: Toronto (Canada); Sydney (Australia); Bangalore (India); Belgrade (Serbia)
  • Founders: Sid Sheth (CEO), Sudeep Bhoja (CTO)
  • Core Products: Corsair inference accelerators, JetStream networking accelerators, Aviator software stack
  • Employees: 250+ worldwide
  • Series C Funding: $275 million | Total Funding: $450 million | Valuation: $2B

About d-Matrix

d-Matrix is pioneering accelerated computing for AI inference, breaking through the limits of latency, cost and energy. Its Corsair accelerators, JetStream networking, and Aviator software deliver fast, sustainable AI inference at data center scale. The terms d-Matrix, JetStream, Corsair and Aviator are trademarks and/or registered trademarks of d-Matrix, Inc. in the U.S. and other countries. All rights reserved.
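The press release's headline numbers ("up to 30K tokens per second at 2ms per token") imply a batching story that is easy to check with arithmetic. The sketch below assumes 2 ms is the inter-token latency of a single stream; that interpretation is our assumption, not stated in the source.

```python
# Sanity-check the quoted throughput figures: 30K tokens/s aggregate at
# 2 ms per token. If 2 ms is per-stream inter-token latency, each stream
# produces 500 tokens/s, so the aggregate implies ~60 concurrent streams.

tokens_per_sec_total = 30_000
latency_s_per_token = 0.002

per_stream_rate = 1 / latency_s_per_token              # tokens/s per stream
implied_streams = tokens_per_sec_total / per_stream_rate

print(f"per-stream rate: {per_stream_rate:.0f} tokens/s")      # 500
print(f"implied concurrent streams: {implied_streams:.0f}")    # 60
```

Under this reading, the quoted figures describe roughly 60 simultaneous low-latency streams per system rather than a single 30K-tokens/s stream.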

d-Matrix Frequently Asked Questions (FAQ)

  • When was d-Matrix founded?

    d-Matrix was founded in 2019.

  • Where is d-Matrix's headquarters?

    d-Matrix's headquarters is located at 5201 Great America Parkway Suite 300, Santa Clara, California.

  • What is d-Matrix's latest funding round?

    d-Matrix's latest funding round is Series C.

  • How much did d-Matrix raise?

    d-Matrix raised a total of $429M.

  • Who are the investors of d-Matrix?

    Investors of d-Matrix include Nautilus Venture Partners, M12, Triatomic Capital, Industry Ventures, Temasek and 25 more.

  • Who are d-Matrix's competitors?

    Competitors of d-Matrix include EdgeCortix, Deepwave, Groq, FuriosaAI, Positron and 7 more.

  • What products does d-Matrix offer?

    d-Matrix's products include d-Matrix Corsair.


Compare d-Matrix to Competitors

FuriosaAI

FuriosaAI focuses on designing AI accelerators for the AI and computing industry. The company provides AI accelerators for large language models, multimodal applications, and computer vision, with a focus on AI inference performance in data centers. FuriosaAI serves sectors that require AI processing capabilities, including the enterprise and cloud computing industries. It was founded in 2017 and is based in Seoul, South Korea.

Groq

Groq operates as an AI inference technology company within the semiconductor and cloud computing sectors. The company provides computation services for AI models, ensuring compatibility and efficiency for various applications. Groq's products are designed for both cloud and on-premises AI solutions. It was founded in 2016 and is based in Mountain View, California.

Tenstorrent

Tenstorrent is a computing company specializing in hardware focused on artificial intelligence (AI) within the technology sector. The company offers computing systems for the development and testing of AI models, including desktop workstations and rack-mounted servers powered by its Wormhole processors. Tenstorrent also provides an open-source software platform, TT-Metalium, for customers to customize and run AI models. It was founded in 2016 and is based in Toronto, Canada.

Mythic

Mythic is an analog computing company that specializes in artificial intelligence (AI) acceleration technology. Its products include the M1076 Analog Matrix Processor and M.2 key cards, which provide power-efficient AI inference for edge devices and servers. Mythic primarily serves sectors that require real-time analytics and data throughput, such as smarter cities and spaces, drones and aerospace, and augmented reality (AR) or virtual reality (VR) applications. Mythic was formerly known as Isocline Engineering. It was founded in 2012 and is based in Austin, Texas.

Hailo

Hailo develops AI processors for edge devices within the artificial intelligence and semiconductor industries. The company offers AI accelerators and vision processors for deep learning applications, image enhancement, and video analytics on edge devices. Hailo's products serve sectors such as automotive, security, industrial automation, and retail. It was founded in 2017 and is based in Tel Aviv, Israel.

Fractile

Fractile develops chips for AI model inference, focusing on large language models. The company aims to reduce the time and cost of processing by addressing bottlenecks in existing hardware through the integration of computation and memory. Fractile was formerly known as Neu Edge. It was founded in 2022 and is based in Newbury, United Kingdom.

