Turbopuffer, a search infrastructure startup founded by Simon Hørup Eskildsen, is tackling a critical pain point for AI applications: the cost and complexity of the retrieval systems behind semantic search and recommendation engines. The company grew out of Readwise's struggle to affordably ship article recommendations: vector search infrastructure alone would have cost $20,000 a month at a company spending only $5,000 total on databases. Eskildsen's answer reimagines search infrastructure around cloud primitives such as object storage and NVMe, sidestepping the consensus layers that complicate systems like Elasticsearch.

The approach reflects a core belief about AI systems: large language models can reason effectively, but they cannot compress the world's knowledge into their weights alone, so they must connect to external systems that hold information in full fidelity. Turbopuffer's architecture supports hybrid retrieval patterns that combine semantic, text, regex, and SQL-style queries, which Eskildsen argues matter more as agentic AI systems replace single retrieval calls with parallel multi-query patterns. The company has already notched significant wins, cutting Cursor's search costs by 95% while improving per-user economics, and is now lowering query pricing to accommodate the massive concurrent query volumes that agent-based workloads generate.
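The parallel multi-query pattern described above can be sketched in a few lines. This is a minimal illustration, not Turbopuffer's actual API: the `search` backend, the in-memory `DOCS` corpus, and the `mode` labels are all hypothetical stand-ins for network calls an agent would fan out concurrently.

```python
import asyncio

# Hypothetical in-memory corpus standing in for a real search backend.
DOCS = {
    "doc1": "error handling in async rust",
    "doc2": "retry logic with exponential backoff",
    "doc3": "vector search over object storage",
}

async def search(query: str, mode: str) -> list[str]:
    """Stand-in backend call: returns ids of docs containing the query string.
    `mode` ("semantic", "text", "regex", ...) is illustrative only; a real
    hybrid system would dispatch each mode to a different query path."""
    await asyncio.sleep(0)  # placeholder for network latency
    return [doc_id for doc_id, text in DOCS.items() if query in text]

async def agent_retrieve(queries: list[tuple[str, str]]) -> list[str]:
    """Issue all queries concurrently (the agentic multi-query pattern)
    and merge hits, dropping duplicates while preserving first-seen order."""
    results = await asyncio.gather(*(search(q, m) for q, m in queries))
    seen: set[str] = set()
    merged: list[str] = []
    for hits in results:
        for doc_id in hits:
            if doc_id not in seen:
                seen.add(doc_id)
                merged.append(doc_id)
    return merged

if __name__ == "__main__":
    hits = asyncio.run(agent_retrieve([
        ("async", "semantic"),
        ("backoff", "text"),
        ("search", "text"),
    ]))
    print(hits)  # each query's hits, deduplicated
```

The point of the sketch is the shape of the workload: where a classic RAG pipeline issues one retrieval call per user request, an agent fans out many concurrent queries and merges the results, which is why per-query pricing and concurrency limits dominate the economics.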