April 21, 2023

The AI agent abstraction

Over the last six months, we’ve seen LLMs permeate the enterprise. It started with employees using ChatGPT to write emails, answer questions, generate copy, and summarize articles. Then, seemingly out of nowhere, customer support teams were triaging tickets with LLMs, and developers were hooked on GitHub Copilot. Early-stage startup founders got the memo: “LLMs are core to what we do,” “we make tools that make building LLM apps easier,” “we make tools that help those making tools that make building LLM apps easier,” and so on.

Over the next six months, I expect we’ll hear a lot more about “agents.” This is the next level-up for AI, perhaps the one we’ll look back on as having the most profound impact on our lives.

Matt Schlicht wrote a great piece that gets at the heart of what agents are: “Autonomous agents are programs, powered by AI, that when given an objective are able to create tasks for themselves, complete tasks, create new tasks, reprioritize their task list, complete the new top task, and loop until their objective is reached.”
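Schlicht’s definition is, at its core, a loop over a task queue. Here is a minimal sketch in Python; `plan` and `execute` are hypothetical stand-ins for LLM calls, and the reprioritization step is trivially simplified — this is the shape of the idea, not any particular framework’s implementation:

```python
from collections import deque

def plan(objective):
    # Hypothetical planner: a real agent would ask an LLM to break
    # the objective into concrete subtasks.
    return [f"research: {objective}", f"draft: {objective}", f"review: {objective}"]

def run_agent(objective, execute, max_steps=25):
    """Loop until the task list is empty or a step budget is exhausted."""
    tasks = deque(plan(objective))
    completed = []
    while tasks and len(completed) < max_steps:
        task = tasks.popleft()             # complete the current top task
        result, new_tasks = execute(task)  # executing a task may spawn new tasks
        completed.append((task, result))
        tasks.extend(new_tasks)
        # Reprioritize: a trivial sort here; a real agent would again consult the LLM.
        tasks = deque(sorted(tasks))
    return completed
```

The `max_steps` budget matters: an agent that keeps creating tasks for itself can otherwise loop forever without ever reaching its objective.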

Workflow automation isn’t new, but the recent Generative AI boom is breathing new life into the idea. At a recent fireside chat, Inflection AI’s co-founder, Mustafa Suleyman, remarked, “in the next few years, hundreds of millions of people will have intelligent companions that are with them 24/7. [They] will be your coach, therapist, negotiator, travel planner, teacher… We’ll get used to the idea of saying, ‘Hold on, let me ask my AI.’” Inflection AI has publicly announced $265M in funding, presumably to pursue this idea. Adept, a newly minted unicorn, recently closed a $350M financing round to build “AI teammates for everyone.” LangChain, a popular open-source framework for building with LLMs, has already built integrations to agentic projects like BabyAGI, CAMEL, Generative Agents, AgentGPT, and AutoGPT. By the time you’re reading this article, AutoGPT will have surpassed 100K stars on GitHub, making it the fastest-growing open source repository ever. On top of this, according to TechCrunch, DeepMind, now Google DeepMind, “has explored an approach for teaching AI to…[complete] ‘instruction-following’ computer tasks, such as booking a flight.” The agent race is on.

The winners of this race will put a fleet of agents in your pocket and on your desktop. Getting this right requires five prerequisites: privacy preservation, on-device inference, agent-to-agent interaction, persistent memory, and quality controls.

  • Privacy preservation is important because agents will likely have access to vast amounts of personal data.
  • On-device inference is essential for scaling agents cost-effectively and keeping latency low. Also important here is ensuring agents retain a low memory footprint.
  • Agent-to-agent interaction is crucial for creating a cohesive and efficient network of agents. An agent built for calendar actions, for example, should interact seamlessly with someone else’s calendar agent, even when that agent is run by another party.
  • Persistent memory is essential for agents to be useful: they should improve over time and take increasingly personalized actions, sometimes proactively.
  • Quality controls—like hallucination guardrails (see Guardrails AI)—ensure that agents don’t generate false or misleading information.
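As a toy illustration of what such a quality control might check — a naive lexical-overlap heuristic of my own invention, not how Guardrails AI or any production system actually works — one could require that every sentence an agent emits shares vocabulary with its source material:

```python
def content_words(text):
    """Lower-cased words of four or more letters, punctuation stripped."""
    return {w.lower().strip(".,!?") for w in text.split()
            if len(w.strip(".,!?")) >= 4}

def grounded(answer, sources):
    """Reject an answer if any of its sentences shares no content
    words with the provided sources — a crude hallucination check."""
    vocab = set()
    for source in sources:
        vocab |= content_words(source)
    for sentence in answer.split("."):
        words = content_words(sentence)
        if words and not words & vocab:
            return False
    return True
```

Real guardrails rely on much stronger signals — citation checks, entailment models, schema validation — but the shape is the same: validate the output before the agent acts on it.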

Unless the above prerequisites are met, claims of agents changing your life should be taken with a fistful of salt. 

We believe agents represent the next great paradigm in AI. Language and diffusion models made us rethink everything from advertising and medical research to software design, and even what it means to be creative. Agents will extend this impact, redefining how we interact with technology and each other. A recent tweet by programming legend Kent Beck captures the sentiment well: “The value of 90% of my skills just dropped to $0. The leverage for the remaining 10% went up 1000x.”

At RunwayML’s inaugural AI film festival, we heard a quote that stuck with us: “Every abundance creates a new scarcity.” The task for builders is to figure out which resources AI makes abundant, which are rendered scarce, and where an edge can be formed. So far, for those who have found an edge, the business results are incredible. We’ve seen multiple startups achieve top-decile growth while remaining profitable. Publicly available data shows that ChatGPT is the fastest-growing consumer product of all time. We’ve seen other non-public examples of similar growth rates in sectors like legaltech, customer support, and healthcare. Demand for this technology is unprecedented. Every type of organization, from SMBs to enterprises, is clamoring to inject AI into their products and workflows.

We’re approaching a “ChatGPT moment” for agents. Whether from one or a few companies, we expect personal digital assistants to reach millions of users in short order. If you’re building for this future, please reach out.