Multiverse Computing raises $215M to compress AI models by 95%

Spanish startup Multiverse Computing has raised a staggering €189 million (~$215 million) in a Series B round, driven by excitement around its breakthrough in artificial intelligence model compression — a technology it calls CompactifAI.

CompactifAI is a quantum-computing-inspired compression method that can shrink large language models (LLMs) by up to 95% without sacrificing performance, the company claims. The result: leaner, faster models that are dramatically cheaper to run, and small enough to fit on laptops, mobile phones, drones, or even a Raspberry Pi.

Quantum Concepts, Classical Gains

Co-founded by Román Orús, a physics professor at the Donostia International Physics Center, Multiverse leverages tensor networks — mathematical structures originally developed for simulating quantum systems. These networks allow complex models to be represented in significantly smaller forms while preserving most of their functional capacity.
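To make the idea concrete, the sketch below shows the general principle in Python using a truncated SVD, the simplest relative of the tensor-network factorizations described above. It is an illustration under assumed matrix sizes and ranks, not Multiverse's CompactifAI method, and every name in it is hypothetical.

    # Illustrative only: low-rank factorization of a single weight matrix.
    # This is NOT CompactifAI; sizes, rank, and names are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for one model weight matrix with hidden low-rank structure plus noise.
    W = rng.standard_normal((1024, 50)) @ rng.standard_normal((50, 1024))
    W += 0.01 * rng.standard_normal((1024, 1024))

    # Factorize W ~ A @ B, keeping only the top-r singular directions.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    r = 64                    # retained rank: the compression knob
    A = U[:, :r] * s[:r]      # shape (1024, r)
    B = Vt[:r, :]             # shape (r, 1024)

    saved = 1 - (A.size + B.size) / W.size
    print(f"parameters removed: {saved:.1%}")   # 87.5% with these sizes

    # Inference uses the small factors directly: x @ W is approximated by
    # (x @ A) @ B, which needs less memory and fewer multiplications.
    x = rng.standard_normal((1, 1024))
    err = np.linalg.norm(x @ W - (x @ A) @ B) / np.linalg.norm(x @ W)
    print(f"relative output error at rank {r}: {err:.3f}")

Tensor networks such as matrix product operators push the same trade-off further, splitting a large weight tensor into a chain of small cores so that far fewer parameters reproduce most of the original mapping.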

Multiverse applies this compression to popular open-source LLMs like Llama 4 Scout, Llama 3.3 70B, Llama 3.1 8B, and Mistral Small 3.1. A version of DeepSeek R1 is in the works, and the company is expanding its offerings to include more open-source and reasoning models. However, closed-source proprietary models from companies like OpenAI are not supported.

Slim Models, Big Performance

Multiverse's compressed models, branded as "Slim," are available via Amazon Web Services or can be licensed for on-premises use. According to the company, they deliver 4x to 12x faster inference and cut inference costs by 50% to 80%.

For example, Llama 4 Scout Slim costs just $0.10 per million tokens on AWS, versus $0.14 for the uncompressed version. Beyond cloud deployment, the reduced footprint allows LLMs to run locally on constrained devices, a potential game-changer for edge AI.

Serious Science, Serious Backing

Multiverse isn’t just another AI startup riding the hype wave. Its technical foundation is robust, rooted in quantum physics and mathematics. Orús’ background in tensor networks and quantum computing provides a rare pedigree in AI compression research.

The company's CEO, Enrique Lizaso Olmos, is a former academic and seasoned banking executive, having served as deputy CEO at Unnim Bank. Together, they’ve built a team that bridges academia and industry.

Multiverse says it holds 160 patents and serves 100 customers globally, including Iberdrola, Bosch, and the Bank of Canada.

$215M Vote of Confidence

The Series B round was led by Bullhound Capital, an investor in tech giants like Spotify and Discord, and joined by HP Tech Ventures, SETT, Forgepoint Capital, CDP Venture Capital, Santander Climate VC, Toshiba, and Capital Riesgo de Euskadi.

With this round, Multiverse Computing’s total funding rises to about $250 million, placing it among the better-capitalized European AI startups — and reinforcing investor belief that smaller, faster, and cheaper is the future of AI.

As AI models get ever larger and more expensive to deploy, Multiverse’s bet is that smaller is smarter — and that the key to sustainable AI might just lie in quantum-inspired compression.
