vmExit is a computing environment where an AI agent builds applications from natural language. No pre-installed software. You describe what you need, the agent writes it, and it appears on screen.
Currently in research phase.

In the 1970s, a radical idea emerged: the computer should be a live, malleable environment that you reshape while you use it. No compilation step. No restart. You talk to the machine, it responds, you change it, it adapts.
Alan Kay's team at Xerox PARC built Smalltalk, a system where everything — windows, menus, the compiler — was an object you could inspect and change at runtime. The “image” captured your entire world. You shaped the system by talking to it.
Bernie Greenberg's Multics Emacs brought Lisp into the editor — a live environment where users could rewrite keybindings, change how text was rendered, even build email clients, all without restarting. Later, GNU Emacs made this the standard. The system is never finished; it evolves with its user.
Symbolics and LMI commercialized the MIT Lisp Machine — computers where the REPL was the operating system. Every function, data structure, and pixel on screen could be inspected and rewritten while running. The boundary between using and programming didn't exist.
These systems were extraordinary, but they had a fatal constraint: you had to be a programmer to use them. Reshaping a Lisp Machine required understanding Lisp. Extending Emacs meant writing Elisp. Smalltalk demanded fluency in objects and messages. Interactive computing was powerful, but it was only for the few who could write code.
In the 1980s, Apple and Microsoft took a different path. They replaced the live, malleable environment with fixed applications built by professional developers and consumed by everyone else. Computing became accessible to billions — but the ability to shape your own tools was lost. You could use software, but you couldn't change it.
What if AI had existed in the 1970s? What if the barrier to interactive computing had never been programming skill, but simply describing what you want? The Lisp Machine REPL, but where you speak English instead of Lisp. Emacs, but where the system rewrites itself from a sentence instead of an expression.
That's what vmExit explores.
A live, malleable computing environment — but where the entity modifying it understands natural language. The computer ships empty. You describe what you need. The AI writes it. And it stays inside the application as a runtime capability, not just a build tool.
{
  name: "Calorie Tracker",
  data: { meals: [...], dailyTarget: 2500 },
  render: (data, lib) => `
    <div class="tracker">
      <h1>${data.meals.reduce((s, m) => s + m.cal, 0)} kcal</h1>
      ${data.meals.map(m => `<div>${m.desc}: ${m.cal}</div>`).join('')}
      <input data-field="mealInput" />
      <button data-action="addMeal">+</button>
    </div>
  `,
  handlers: {
    addMeal: (data, payload, lib) => ({
      ...data,
      meals: [...data.meals, { desc: data.mealInput, cal: 0 }],
      mealInput: ""
    })
  }
}
Every application is a single object containing its data, a render function that produces HTML, event handlers that update the data, and styles. The agent creates and modifies these objects. The runtime takes care of turning them into a working UI.
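A minimal sketch of what that runtime could look like (the names `createRuntime`, `dispatch`, and `getData` are illustrative, not vmExit's actual API): it holds the app object, produces HTML from its render function, and routes named actions through handlers that return the next data.

```javascript
// Illustrative runtime sketch: holds an app object, renders it,
// and funnels events through its handlers to produce new data.
function createRuntime(app) {
  let data = app.data;
  const lib = {}; // utilities the runtime would expose to apps
  return {
    render: () => app.render(data, lib),
    // Dispatch a named action (e.g. a data-action click) to its handler.
    dispatch: (action, payload) => {
      const handler = app.handlers[action];
      if (handler) data = handler(data, payload, lib);
      return data;
    },
    getData: () => data,
  };
}

// Usage with a trivial counter app in the same object shape:
const counter = {
  name: "Counter",
  data: { count: 0 },
  render: (data) => `<h1>${data.count}</h1>`,
  handlers: {
    increment: (data) => ({ ...data, count: data.count + 1 }),
  },
};

const rt = createRuntime(counter);
rt.dispatch("increment", {});
rt.dispatch("increment", {});
// rt.render() now returns "<h1>2</h1>"
```

Because handlers return new data rather than mutating it, the runtime always has a single serializable value describing the app's current state.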
The AI session starts when you open the page and remains alive as long as you're connected. There's no cold start between messages. The agent remembers what it built, what you asked for, and what went wrong. It's a continuous collaboration, not a series of isolated requests.
When something breaks — a bad render function, a handler that throws — the runtime captures the error and sends it back to the agent automatically. The agent reads the broken code, understands the problem, and patches just the part that failed. You see a brief error message, then the app comes back.
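One way such a capture loop could work (a sketch under assumptions; the document doesn't specify the real mechanism): wrap every call into app code, and on failure hand the agent the error message together with the source of the function that threw, so it can patch just that part.

```javascript
// Sketch: capture failures from app code and package them for the agent.
function safeCall(fn, args, onError) {
  try {
    return { ok: true, value: fn(...args) };
  } catch (err) {
    // Report what the agent would need to patch the broken part:
    // the error message and the failing function's own source.
    onError({ message: err.message, source: fn.toString() });
    return { ok: false };
  }
}

// Usage: a render function with a bug (reads a property of undefined).
const reports = [];
const brokenRender = (data) => `<h1>${data.user.name}</h1>`; // data.user is undefined
const result = safeCall(brokenRender, [{}], (r) => reports.push(r));
// result.ok is false; reports[0].source contains the broken function's code
```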
State saves automatically. Close the tab, come back tomorrow — every app is exactly where you left it. The entire state tree, including all functions and behavior, serializes to JSON. Nothing is lost between sessions.
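Functions don't survive plain `JSON.stringify`, so serializing "all functions and behavior" to JSON requires a scheme like the following (one possible approach, not necessarily vmExit's): store function source as tagged strings and revive them with the `Function` constructor.

```javascript
// Sketch: round-trip a state tree whose values may include functions.
const FN_TAG = "__fn__:";

function serialize(tree) {
  return JSON.stringify(tree, (key, value) =>
    typeof value === "function" ? FN_TAG + value.toString() : value
  );
}

function deserialize(json) {
  return JSON.parse(json, (key, value) =>
    typeof value === "string" && value.startsWith(FN_TAG)
      ? new Function("return (" + value.slice(FN_TAG.length) + ")")()
      : value
  );
}

// Usage: an app object survives the round trip, behavior included.
const app = {
  data: { count: 1 },
  handlers: { inc: (d) => ({ ...d, count: d.count + 1 }) },
};
const restored = deserialize(serialize(app));
// restored.handlers.inc({ count: 1 }) returns { count: 2 }
```

Reviving code with `new Function` is only safe here because the source being revived was written by the trusted agent in the first place.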
The agent doesn't just build the app and leave. It stays inside it. Any application can reason, understand language, and access general knowledge at runtime — because the same AI that wrote the code is available as a function call from within it. There are no external APIs, no keys to manage, no separate backend. Intelligence is a native capability of every app.
Type "two eggs and toast with butter" and the agent estimates calories, protein, carbs, fat, and fiber. No food database. The model is the database.
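The calorie example, written as a handler in the same app-object shape. `lib.ai` is a hypothetical name for the runtime-provided model call (prompt in, text out); the document doesn't give the actual signature.

```javascript
// Sketch: a handler that delegates calorie estimation to the agent.
const handlers = {
  addMeal: async (data, payload, lib) => {
    // lib.ai is assumed here: an async call into the same agent session.
    const reply = await lib.ai(
      `Estimate total calories for: "${data.mealInput}". Reply with a number only.`
    );
    const cal = parseInt(reply, 10) || 0;
    return {
      ...data,
      meals: [...data.meals, { desc: data.mealInput, cal }],
      mealInput: "",
    };
  },
};

// Usage with a stubbed lib for demonstration (a real runtime would
// route the call to the live agent session):
const stubLib = { ai: async () => "230" };
handlers
  .addMeal({ meals: [], mealInput: "two eggs" }, {}, stubLib)
  .then((next) => {
    // next.meals[0] is { desc: "two eggs", cal: 230 }
  });
```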
Paste a paragraph and the app sends it to the agent along with the target language. Because it's the same session, the agent remembers previous translations and maintains consistent terminology across a document.
A note-taking app where you highlight any passage and ask "explain this" or "find counterarguments". The agent operates on your data, in your context, with full conversation history.
Any app can delegate understanding to the agent. Form validation from field names. Smart defaults from partial input. Classification, extraction, summarization — all available as a function call.
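The classification case, sketched as a plain function. The `ai` parameter stands in for the runtime's model call (a hypothetical name, as above); everything else is ordinary app code.

```javascript
// Sketch: agent-backed classification as an ordinary function call.
async function classify(ai, text, labels) {
  const reply = await ai(
    `Classify the text into exactly one of: ${labels.join(", ")}.\n` +
      `Text: "${text}"\nReply with the label only.`
  );
  const label = reply.trim().toLowerCase();
  // Guard against the model replying outside the label set.
  return labels.includes(label) ? label : labels[0];
}

// Usage with a stub standing in for the live session:
const stubAi = async () => " spam ";
classify(stubAi, "WIN A FREE PHONE!!!", ["spam", "not spam"]).then((label) => {
  // label === "spam"
});
```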