Inferact
Est. 2Y Upside: -73%
Inferact's mission is to grow vLLM as the world's AI inference engine and accelerate AI progress by making inference cheaper and faster. Founded by the creators and core maintainers of vLLM, we sit at the intersection of models and hardware—a position that took years to build.
- Rank: #359
- Sector: AI Infrastructure, Machine Learning
- Est. Liquidity: ~6Y
- Data Quality: High

Inferact operates in the massive and rapidly growing AI inference market (TAM $250B+, ~19% CAGR) with a strong competitive moat derived from its widely adopted open-source vLLM engine.
Last updated: February 25, 2026
Bull case: Inferact's vLLM becomes the undisputed "universal inference layer," driving massive adoption across all major cloud providers and enterprises. Its paid serverless offering captures significant market share, pushing annual recurring revenue (ARR) to $500M+ by 2028. This market leadership and strong revenue growth in a $250B+ TAM justify a $4.0B+ valuation, representing a 5x return.

Base case: Inferact successfully commercializes vLLM, securing a strong position in the rapidly growing AI inference market. It maintains its open-source advantage and achieves steady enterprise adoption, growing ARR to $150M-$200M by 2028. This leads to an IPO or acquisition at a $1.2B-$1.5B valuation, representing a 1.5x-1.9x return.

Bear case: Major cloud providers (AWS, Google Cloud) aggressively integrate their own optimized inference solutions, or a well-funded competitor like RadixArk gains significant traction, commoditizing vLLM's commercial offerings. Inferact struggles to differentiate, leading to stalled growth and a down round or acquisition at a significantly reduced valuation of $160M, wiping out most common stock value given the $150M liquidation preference.
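The return multiples quoted in these scenarios follow directly from dividing each exit valuation by the $800M Seed post-money valuation mentioned later on this page. A minimal sketch to sanity-check that arithmetic (note it ignores dilution and liquidation preferences, which would reduce what common stock actually receives):

```python
# Gross return multiple from entry valuation to exit valuation, in $M.
ENTRY_VALUATION_M = 800.0  # Seed post-money valuation from this page

def implied_multiple(exit_valuation_m: float) -> float:
    """Exit valuation divided by entry valuation (gross, pre-dilution)."""
    return exit_valuation_m / ENTRY_VALUATION_M

scenarios = {
    "bull ($4.0B)": 4000.0,
    "base low ($1.2B)": 1200.0,
    "base high ($1.5B)": 1500.0,
    "bear ($160M)": 160.0,
}

for name, exit_val in scenarios.items():
    print(f"{name}: {implied_multiple(exit_val):.2f}x")
# bull -> 5.00x, base -> 1.50x-1.88x (the page rounds to 1.9x), bear -> 0.20x
```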
- Preference Stack Risk: High. Investors hold $150M in liquidation preferences ahead of common stock.
- Funding Intensity: 19%
- Dilution Risk: High. As a Seed-stage company, Inferact will likely undergo several more funding rounds, leading to significant dilution for early common stock holders.
- Secondary Liquidity: None. As a very early-stage company, there is currently no active secondary market or tender offers for Inferact's equity.
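The preference-stack risk can be made concrete with a simple exit waterfall. This sketch assumes a single 1x non-participating preference and a hypothetical 40% common ownership; Inferact's actual terms and cap table are not public:

```python
def common_proceeds(exit_value_m: float, preference_m: float,
                    common_ownership: float) -> float:
    """Common shareholders' payout ($M) in an exit, assuming a single
    1x non-participating liquidation preference (simplifying assumption).
    Preferred holders take their preference off the top; common splits
    whatever is left, pro rata."""
    residual = max(exit_value_m - preference_m, 0.0)
    return residual * common_ownership

# Bear case from this page: a $160M sale against $150M of preferences.
# The 40% common ownership figure is hypothetical, for illustration only.
payout = common_proceeds(160.0, 150.0, 0.40)
print(f"Common receives ${payout:.1f}M")  # $4.0M, vs $64M if preferences did not exist
```

This is why the bear case describes most common stock value being wiped out: only the $10M above the $150M preference stack reaches common holders at all.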
Research & Engineering — 5 roles
- Member of Technical Staff, Cloud Orchestration · San Francisco
- Member of Technical Staff, Exceptional Generalist (Remote) · Remote
- Member of Technical Staff, Inference · San Francisco
- +2 more roles
Last updated: March 10, 2026
Questions to Ask at the Interview
Strategic questions based on Inferact's data — designed to show you've done your homework.
1. “Given the strong incumbent presence from major cloud providers like AWS and Google Cloud in the inference space, how does Inferact plan to maintain its competitive edge and ensure its "universal inference layer" strategy remains viable long-term?”
2. “Inferact is commercializing an open-source project. How will the company balance supporting the open-source vLLM community with developing proprietary features for its paid serverless offering, and what is the strategy for converting open-source users into paying customers?”
3. “With a recent $150M Seed round at an $800M valuation, what is the anticipated timeline for future funding rounds and a potential liquidity event (IPO or acquisition) for employees, and how does the company plan to manage potential dilution?”
Disclaimer: This analysis is AI-generated and does not constitute financial or career advice. Always conduct your own due diligence.