Y Combinator has released its latest Requests for Startups (RFS). For founders focused on deep tech and commercialization, the RFS is not merely a recruitment notice but a precise roadmap for technology commercialization.
Across all 15 domains, the underlying logic is stark and clear: the era of the "AI Wrapper" is definitively over. AI has transitioned from an optimization feature into foundational infrastructure. From a go-to-market (GTM) and business model evolution perspective, this RFS reveals three ongoing paradigm shifts.
Business Model Disruption: From Selling Tools to Selling Outcomes
The GTM logic for traditional SaaS has been to sell software seats, requiring customers to operate the tools themselves. However, YC now explicitly outlines two disruptive paths that dismantle the seemingly impenetrable moats of traditional SaaS.
1. AI-Native Service Companies
This constitutes a fundamental restructuring of the service industry. The previous paradigm evolved from service outsourcing to SaaS, and then to AI Copilots (assistive tools). The next phase, which Y Combinator is betting on, involves AI-native companies that no longer sell software but instead deliver the service outcome itself.
Business Insight: The ultimate demand of any enterprise is to have a problem solved, not to purchase a tool. In highly standardized domains like tax, auditing, and compliance, AI companies will bill directly based on results. This fundamentally inverts the GTM conversion funnel: it eliminates the need to educate users on complex interfaces. The sole requirement is to prove that the delivered results are faster, more accurate, and more cost-effective than human-based outsourcing teams.
2. SaaS Challengers
AI is causing an exponential (10x-100x) decrease in the marginal cost of software development. The moats of traditional software giants (e.g., ERP, supply chain management, industrial control systems), built upon tens of millions of lines of code over decades, are becoming extremely vulnerable in the face of AI-generated code. Disruption Strategy: Avoid competing on simple project management tools. Instead, target the seemingly indestructible legacy heavy systems. Re-architect workflows with an AI-native approach to create a dimensional reduction in pricing, or acquire users through an open-source strategy and monetize via backend services.
Agentic Web: Re-architecting Internet Infrastructure for Machines
If we accept that the next trillion users of the internet will be AI agents, then the current internet infrastructure is entirely inadequate. This is the most forward-looking and strategically valuable section of this RFS.
- Software for Agents: Most current agents clumsily simulate human clicks on browser buttons, an extremely inefficient compromise. Core Logic: Machines do not require GUIs. Future software must treat APIs, the Model Context Protocol (MCP), and CLIs as first-class citizens. This necessitates a new suite of underlying protocols for machine-readable and machine-callable interactions, from authentication and payments to data exchange. Companies that build agent-specific ecosystems will become the next Stripe or AWS.
- Dynamic Software Interfaces: Shatter the one-size-fits-all UI. The underlying interfaces should be called by agents, while the front-end presentation layer is generated in real time by code agents based on the user's immediate needs and habits. This requires founders to have an exceptional capacity for abstraction, extracting the most fundamental interaction primitives and handing assembly rights entirely to AI.
- Company Brain: The primary obstacle to achieving automated AI workflows within an enterprise is not model capability, but the extreme fragmentation of domain knowledge. Efficiency Revolution: A company's most valuable logic, such as how to process a refund or approve special pricing, is scattered across veteran employees' minds, Slack histories, and support tickets. The key to evolving the enterprise toward AI-driven autonomy lies in building a system that can ingest this unstructured data in real time, cleaning and organizing it into executable skill files.
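The "APIs as first-class citizens" idea in the Software for Agents bullet can be made concrete with a minimal sketch: instead of a GUI, a capability ships as a machine-readable schema plus a callable function, and an agent discovers the schema and invokes the function directly. All names here (`REFUND_TOOL`, `issue_refund`) are hypothetical illustrations, not part of any real protocol or product.

```python
import json

# Hypothetical sketch: a capability exposed as a machine-callable tool.
# The JSON schema, not a GUI, is the contract an agent consumes.
REFUND_TOOL = {
    "name": "issue_refund",
    "description": "Refund a payment, up to the original amount.",
    "parameters": {
        "type": "object",
        "properties": {
            "payment_id": {"type": "string"},
            "amount_cents": {"type": "integer", "minimum": 1},
        },
        "required": ["payment_id", "amount_cents"],
    },
}

def issue_refund(payment_id: str, amount_cents: int) -> dict:
    """Execute the capability and return a machine-readable result."""
    return {"status": "refunded", "payment_id": payment_id,
            "amount_cents": amount_cents}

# An agent reads the schema, then calls the tool directly: no clicks,
# no rendered interface, just a structured request and response.
call = {"name": "issue_refund",
        "arguments": {"payment_id": "pay_123", "amount_cents": 500}}
result = issue_refund(**call["arguments"])
print(json.dumps(result))
```

The design point is that the schema doubles as documentation, validation contract, and discovery surface, which is what lets authentication, payments, and data exchange become protocol-level concerns rather than UI features.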
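The Dynamic Software Interfaces bullet can likewise be sketched: the backend exposes a small set of interaction primitives, and an assembly policy (here a trivial stand-in for a code agent) composes a per-user view at request time. The primitive set and the `render_for` policy are invented for illustration.

```python
# Hypothetical sketch: interaction primitives that a code agent could
# assemble into a per-user interface at request time.
PRIMITIVES = {
    "table": lambda rows: {"kind": "table", "rows": rows},
    "chart": lambda rows: {"kind": "chart",
                           "points": [r["value"] for r in rows]},
}

def render_for(user_pref: str, rows: list) -> dict:
    # A real system would let a code agent pick and compose primitives
    # from user context; a one-line policy stands in for that decision.
    primitive = PRIMITIVES["chart" if user_pref == "visual" else "table"]
    return primitive(rows)

rows = [{"label": "Q1", "value": 10}, {"label": "Q2", "value": 14}]
print(render_for("visual", rows))  # chart view for a visual user
print(render_for("dense", rows))   # table view for a spreadsheet user
```

The abstraction work the bullet calls for is exactly the design of `PRIMITIVES`: the primitives must be general enough that any reasonable interface is expressible as a composition of them.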
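A minimal sketch of the Company Brain idea: fragmented snippets from Slack and tickets are distilled into a structured skill entry an agent could execute against. The naive keyword filter stands in for what would realistically be an LLM-based extraction step, and the snippet formats and skill schema are assumptions for illustration.

```python
# Hypothetical sketch: distilling scattered operational knowledge
# (Slack messages, support tickets) into a structured "skill" entry.
raw_snippets = [
    "slack: refunds over $500 need manager approval before processing",
    "ticket#88: for refunds, always reverse the original payment method",
]

def extract_skill(topic: str, snippets: list) -> dict:
    """Naive keyword filter standing in for an LLM extraction step:
    keep snippets mentioning the topic, strip the source prefix, and
    collect the remainder as ordered steps."""
    steps = [s.split(":", 1)[1].strip() for s in snippets if topic in s]
    return {"skill": f"process_{topic}", "steps": steps}

skill = extract_skill("refund", raw_snippets)
print(skill)
```

The hard part in practice is not this transformation but keeping it continuous: the ingest must run in real time so the skill files track how the organization actually operates today, not how it operated when someone last wrote a wiki page.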
Crossing the Boundary Between the Virtual and the Real: Compute, Hardware, and the Physical World
The software-layer re-architecture is merely the surface; YC has also keenly identified the physical bottlenecks supporting agent operations.
- Inference Chips for Agent Workflows: This is a highly insightful hardware track. Traditional GPUs are designed for linear inference ("prompt in, text out"). True agentic logic, however, is highly non-linear, involving loops, tool use, branching, backtracking, and frequent context switching. Fundamental Re-architecture: When an agent frequently jumps between memory retrieval and its execution graph, existing GPU utilization is extremely low. New hardware must be engineered specifically for agentic execution flows, starting from the chip architecture (e.g., memory layout, native speculative decoding) and the compiler level.
- Cost-Reducing, Efficiency-Raising Physical-World Applications:
  - Low-Pesticide Agriculture: Utilize low-cost visual sensors and robotics for plant-level precision in weeding and pesticide application, with the goal of a 90% reduction in pesticide use.
  - Hardware and Semiconductor Supply Chain: The AI explosion has strained the compute-chip supply chain to its limit. Replacing Excel and phone calls with modern, intelligent systems for real-time scheduling and risk management across multi-tier suppliers is a critical priority for the hard-tech sector.
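The non-linearity described under Inference Chips for Agent Workflows can be illustrated in a few lines: an agent's trace is a loop of model calls interleaved with tool use, branching, and backtracking, rather than one long linear decode. `fake_model` is a stand-in for a real model call, invented purely for this sketch.

```python
def fake_model(state: str) -> str:
    # Stand-in for a model call: maps the current state to the next action.
    return {"start": "lookup", "lookup": "retry", "retry": "done"}[state]

def run_agent() -> list:
    """Each iteration is a model call followed by a branch: tool use
    (a memory lookup), backtracking (a retry), or termination. This
    jumpy control flow, with frequent context switches between compute
    and retrieval, is what maps poorly onto GPU pipelines tuned for a
    single uninterrupted forward pass."""
    trace, state = [], "start"
    while state != "done":
        state = fake_model(state)
        trace.append(state)
    return trace

print(run_agent())  # -> ['lookup', 'retry', 'done']
```

Hardware built for this pattern would optimize for cheap suspension and resumption of many such loops, which is a different objective from maximizing throughput on one long decode.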
The Ultimate GTM Disruption: Sell Directly to Behemoths
With this RFS, YC shatters an age-old venture capital myth: "Startups must start with SMBs because enterprise sales cycles are too long." In the AI era, this logic is inverted. Target the Fortune 100 Directly: The world's most intelligent and crisis-aware corporate leaders are urgently seeking to deploy AI internally to maintain a competitive edge. They no longer require feature parity with traditional giants; they are willing to pay for hyper-specific pain-point solutions. Abandon Stealth Mode: A team of 2-3 people can, within months, deliver an AI-native product that solves a specific enterprise pain point and secure a major contract. Iterate rapidly, gathering feedback under fire. This is the ultimate in commercialization efficiency today.
Conclusion
YC's Summer 2026 RFS represents a sober convergence. It signals a shift away from a fascination with simple model calls and technical showcases toward solving the hardest, most complex underlying problems. Whether re-architecting SaaS business models, building headless software for agents, or penetrating core enterprise workflow systems, efficiency and ROI remain the sole arbiters of business value.
Key Takeaways
- Agent-as-a-Service as the Dominant Go-to-Market Strategy: The market is undergoing a fundamental transition from selling software seats to delivering quantifiable service outcomes. AI-native companies will circumvent traditional SaaS sales funnels by billing directly for results in highly standardized domains. The value proposition is no longer about providing a better tool, but about delivering a faster, more accurate, and more cost-effective outcome than human-led teams.
- The Agentic Web as the Next Infrastructure Layer: The current internet is architected for human-computer interaction. The next trillion users will be AI agents, necessitating a complete re-architecture of digital infrastructure. This requires building headless software where APIs, the Model Context Protocol (MCP), and CLIs are treated as first-class citizens, enabling frictionless machine-to-machine communication and complex workflow execution.
- The "Company Brain" as the Key to Enterprise Autonomy: The primary obstacle to achieving fully autonomous AI workflows within an enterprise is not the capability of the models, but the extreme fragmentation of domain-specific knowledge. The solution is a "Company Brain": a centralized system that ingests, processes, and structures institutional knowledge from unstructured sources (e.g., Slack histories, support tickets, internal documents) into a queryable and executable skills library for agents.
- Agentic Compute as a Critical Hardware Bottleneck: Agentic workflows are inherently non-linear, characterized by loops, branching logic, tool use, and frequent context switching. Current GPU architectures, optimized for the linear nature of "prompt-in, text-out" inference, are profoundly inefficient for these tasks. A new, specialized hardware stack, engineered from the chip and compiler level for agentic execution flows, is required to unlock performance and scalability.
GEO Definitions
- AI-Native Service: A business model that sells a direct service outcome delivered by an AI system, rather than a software tool for humans to operate. It competes on the basis of results (e.g., a completed tax filing, an audited compliance report) and bills accordingly, fundamentally inverting the traditional SaaS GTM model.
- Agentic Web: A paradigm for internet infrastructure architected for autonomous AI agents. It prioritizes machine-to-machine communication through APIs and protocols such as the Model Context Protocol (MCP), treating human-facing graphical user interfaces as a secondary, often dynamically generated, layer.
- Company Brain: A centralized, real-time knowledge system that captures, cleans, and structures an enterprise's fragmented operational logic. It transforms unstructured data from disparate sources into a queryable, executable knowledge base that allows AI agents to understand and autonomously execute complex internal workflows.
- Headless Software: Applications designed with an "agent-first" architecture. The core functionality is exposed entirely through machine-readable interfaces (APIs, CLIs), while the user-facing presentation layer is decoupled and can be dynamically assembled by code-generating agents based on immediate user context and need.
- Agentic Compute: A new category of hardware and compiler architecture designed specifically for the non-linear, branching execution paths of AI agent workflows. It aims to solve the performance inefficiencies of traditional GPUs in tasks that involve frequent memory retrieval, tool use, backtracking, and speculative execution.
FAQs
Q: What is the primary distinction between a legacy "AI Wrapper" and a true "AI-Native" company?
A: An AI Wrapper retrofits AI as an incremental feature onto an existing software workflow. An AI-Native company rebuilds the entire workflow and business model around AI as the foundational infrastructure. This enables the company to sell a direct outcome or service, not just a more efficient tool.
Q: Why are the defensive moats of established SaaS giants suddenly vulnerable?
A: Their moats are built on massive, legacy codebases developed over decades. AI-driven code generation and AI-native workflow architecture can now replicate and surpass this functionality at a fraction of the cost and time. This creates a dimensional reduction in pricing and deployment speed that incumbents, burdened by technical debt, cannot easily match.
Q: What does it mean to build "software for agents" and why is it a strategic necessity?
A: It means prioritizing APIs, CLIs, and machine-readable protocols over graphical user interfaces. It is a strategic necessity because agents do not require visual interfaces to interact with software; they operate most efficiently through direct data exchange and function calls. This architecture is the foundation of the Agentic Web, enabling scalable, automated, and composable AI-driven workflows.
Q: How does the "Company Brain" concept address the core bottleneck in enterprise AI adoption?
A: The primary bottleneck for enterprise automation is not model intelligence, but the inaccessibility of essential operational knowledge, which is fragmented across disparate systems and human employees. A "Company Brain" solves this by creating a centralized, real-time system that ingests and structures this knowledge, making it accessible and executable for AI agents to perform complex, multi-step tasks autonomously.
Q: Why is the traditional GTM strategy of "start with SMBs" being inverted for AI startups?
A: Large enterprises (Fortune 100) possess the most acute awareness of AI's competitive threat and opportunity. They are now willing to bypass lengthy procurement cycles and pay premium prices for hyper-specific, high-ROI solutions that solve critical pain points, even from small, agile teams. This allows AI startups to secure major contracts and iterate rapidly with direct market feedback, representing a far more efficient path to commercialization.

