Businesses and governments are looking for tools to run AI locally to reduce cloud infrastructure costs and build sovereign capabilities. Quadric, a chip IP startup founded by veterans of early Bitcoin mining company 21E6, is using its on-device inference technology to expand beyond cars and into laptops and industrial devices to drive that change.
That expansion is already paying off.
CEO Veerbhan Kheterpal (pictured above, center) told TechCrunch in an interview that Quadric’s licensing revenue has increased from about $4 million in 2024 to between $15 million and $20 million in 2025. The San Francisco-based company, which has offices in Pune, India, is targeting up to $35 million this year as it builds a royalty-driven on-device AI business. That growth has lifted the company’s valuation to between $270 million and $300 million, up from about $100 million at its Series B in 2022, Kheterpal said.
It also helped attract investors to the company. Last week, Quadric announced a $30 million Series C round led by the ACCELERATE Fund managed by BEENEXT Capital Management, bringing its total raised to $72 million. Kheterpal told TechCrunch that the funding comes as investors and chipmakers look for ways to push more AI workloads from centralized cloud infrastructure to devices and local servers.
From cars to everything
Quadric started in the automotive space, where on-device AI can power real-time capabilities such as driver assistance. Kheterpal said the proliferation of transformer-based models in 2023 pushed inference into “everything” and caused a radical inflection in the business over the past 18 months as more companies look to run AI locally rather than rely on the cloud.
“Nvidia is a powerful platform for data center AI,” said Kheterpal. “We were looking at building a similar CUDA-like or programmable infrastructure for on-device AI.”
Unlike Nvidia, Quadric doesn’t manufacture chips itself. Instead, it licenses programmable AI processor IP, what Kheterpal described as a “blueprint” that customers can build into their own silicon, along with a software stack and toolchain to run models, including vision and voice, on the device.
The startup’s customers span printers, cars and AI laptops, and include Kyocera and Japanese auto supplier Denso, which makes chips for Toyota vehicles. The first products based on Quadric’s technology are expected to ship this year, starting with laptops, Kheterpal told TechCrunch.
Quadric is also looking beyond traditional commercial deployments to markets exploring “sovereign AI” strategies that reduce dependence on U.S.-based infrastructure, Kheterpal said. The startup is developing customers in India and Malaysia and counts Moglix CEO Rahul Garg as a strategic investor to help shape its approach to India’s AI “sovereignty” push. Quadric employs nearly 70 people worldwide, including approximately 40 in the United States and 10 in India.
Kheterpal said the push is being driven by the rising costs of centralized AI infrastructure and the challenges many countries face in building hyperscale data centers, alongside growing interest in “distributed AI” setups that run inference on office laptops or small on-premises servers rather than relying on cloud-based services for every query.
The World Economic Forum noted this shift in a recent article as AI inference moves away from purely centralized architectures and closer to the user. Similarly, EY said in a November report that sovereign AI approaches are gaining momentum as policymakers and industry groups push for domestic AI capabilities across compute, models and data, rather than relying entirely on foreign infrastructure.
The challenge for chipmakers is that AI models are evolving faster than hardware design cycles, Kheterpal said. He argued that customers need programmable processor IP that can absorb new models through software updates, rather than requiring costly redesigns each time architectures shift, as they did from earlier vision-focused networks to today’s transformer-based systems.
Quadric pitches itself as an alternative to chip vendors such as Qualcomm, which typically keep their AI acceleration inside their own processors, and IP suppliers such as Synopsys and Cadence, which sell neural processing engine blocks. Kheterpal said Qualcomm’s approach can lock customers into its silicon, while the engine blocks from traditional IP suppliers are difficult for many customers to program.
Quadric’s programmable approach lets customers support new AI models through software updates rather than hardware redesigns, an advantage in an industry where chip development can take years but model architecture transitions can now happen in months.
Still, Quadric is early in its build-out: it has signed only a handful of customers so far, and much of its long-term revenue depends on converting today’s licensing deals into high-volume shipments and recurring royalties.
