
Inferact builds the world’s AI inference infrastructure. Founded by the creators and core maintainers of vLLM, the most widely used open-source LLM inference engine, the company’s mission is to make AI inference dramatically faster and cheaper: advancing vLLM and turning the complexity of serving cutting-edge models across diverse hardware into effortless, scalable infrastructure.
We first partnered for their Seed in 2026.