d-Matrix Debuts with $44 Million and an AI Solution

By: Mary Jander


A four-year-old startup named d-Matrix has scored $44 million in funding for a silicon solution tailored to massive artificial intelligence (AI) workloads. Investors include M12 (Microsoft’s venture fund) and Marvell Technology (Nasdaq: MRVL).

Understanding the problem d-Matrix addresses calls for a bit of background on what AI experts call transformers, the neural network models used to enable AI inferencing. In contrast with older AI technologies, transformers can recognize deep contextual subtleties, such as those that arise when a sentence is translated from one language to another, or when facial recognition is applied in computer vision applications.

Transformers, introduced around 2017 by engineers at Google, represent a breakthrough in AI technology. But there’s a price to pay. To run properly, transformers require massive amounts of computation and memory, which in turn soak up lots of energy. To cope, companies such as NVIDIA (Nasdaq: NVDA) are applying significant resources to finding solutions that combine software and silicon to make transformer-based AI more efficient.
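To see where all that computation goes, consider the attention operation at the heart of every transformer. The minimal NumPy sketch below (a generic illustration, not d-Matrix's or NVIDIA's implementation) shows why costs climb quickly: the score matrix grows quadratically with sequence length, and every layer repeats this over many attention heads.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V
    d = Q.shape[-1]
    # The (n x n) score matrix is quadratic in sequence length n --
    # a major source of the compute and memory demands of transformers.
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# A toy sequence of 8 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
Q = rng.standard_normal((8, 4))
K = rng.standard_normal((8, 4))
V = rng.standard_normal((8, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

In a production model this runs across dozens of layers and heads over sequences of thousands of tokens, which is why inference hardware efficiency matters so much.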

Enter d-Matrix, which has designed a way to significantly boost the performance of AI inferencing without sacrificing energy efficiency. Using a silicon platform that combines chiplets (tiny processing components designed to fit together in a single integrated circuit) with software, d-Matrix claims to improve AI performance anywhere from 5 to 100 times.

An AI Tool for the Composable Datacenter

There are a couple of innovations that power d-Matrix technology. First, the company's platform uses digital in-memory computing (DIMC) to speed up transformer workloads. The company claims to have found a way to use static random access memory (SRAM) not just for storage but also for computation.

“SRAM is digital memory, and we augment that with the ability to compute,” said d-Matrix CEO and co-founder Sid Sheth (ex-Inphi, NetLogic Microsystems, and Intel). “How do you make [SRAM] a computer, not just a storage element? That is essentially what we did.”

A related advancement is that the in-memory computing is all digital, whereas up to now similar in-memory computing solutions have typically required some measure of analog processing, which lowers efficiency. “We’ve developed a path-breaking compute architecture that is all-digital, making it practical to implement while advancing the AI compute efficiency far past the memory wall it has hit today,” said CEO Sheth.
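The idea can be illustrated with a toy model. In a conventional design, weights are fetched from memory to a separate compute unit for every operation; in an in-memory design, the multiply-accumulate happens where the weights already sit. The class below is a purely conceptual sketch of that distinction (an assumption for illustration, not d-Matrix's actual architecture).

```python
import numpy as np

class InMemoryMACArray:
    """Toy model of digital in-memory computing: a weight matrix stays
    resident in the (simulated) SRAM array, and multiply-accumulate is
    performed in place rather than after moving weights to a separate
    compute unit. Conceptual only -- not d-Matrix's design."""

    def __init__(self, weights):
        # "Program" the weights into the memory array once, up front.
        self.stored = np.asarray(weights)

    def mac(self, activations):
        # Each memory column contributes a multiply-accumulate against
        # the broadcast input activations; no weight movement occurs.
        return self.stored @ np.asarray(activations)

array = InMemoryMACArray([[1, 2], [3, 4]])
print(array.mac([10, 1]))  # [12 34]
```

The payoff of avoiding that data movement is what the "memory wall" refers to: in modern inference workloads, shuttling weights between memory and compute often costs more energy than the arithmetic itself.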

These technical features are aimed squarely at the datacenter, an area that Sheth and d-Matrix co-founder and CTO Sudeep Bhoja (ex-Inphi, Broadcom, Big Bear Networks) believe is underserved. There is a need, they say, for AI inference processing among enterprises that deploy so-called composable datacenters, which rely on software-defined hardware elements for flexibility and scalability.

Source: d-Matrix

No Product Yet

d-Matrix’s platform variations remain proofs of concept. But potential customers include cloud hyperscalers, which use transformers and AI inference for search and content moderation.

There is also demand among telcos for sophisticated subscriber-service systems for emerging 5G services, said the d-Matrix founders. And traditional enterprise customers of companies such as Hewlett Packard Enterprise (NYSE: HPE), Dell (NYSE: DELL), IBM (NYSE: IBM), and Cisco (Nasdaq: CSCO) could eventually add d-Matrix products to their datacenter wares, Sheth said.

d-Matrix expects to launch its first product, code-named Corsair, in the second half of 2023. Meanwhile, the startup continues work on its two foundation prototypes, nicknamed Nighthawk and Jayhawk, which provide the basic in-memory computing platform as well as a custom interconnect for d-Matrix chiplets.

Timing will be essential to success. The company faces competition from NVIDIA, Qualcomm (Nasdaq: QCOM), and several startups, including Groq and Tenstorrent, among others.

d-Matrix, founded in 2019, is headquartered in Santa Clara, Calif., with offices in Sydney, Australia, and Bangalore, India. It has roughly 50 employees. The fresh funding from Playground Global, M12 (Microsoft’s venture fund), SK Hynix, Nautilus Venture Partners, Marvell Technology, and Entrada Ventures will be used in part to grow the team as well as fulfill product plans.

Bottom line? With a cutting-edge technology in an area where demand is growing, d-Matrix is a company to watch.