Plug naina into
your AI coding tool.
One command in your terminal. naina mcp starts the local server. Your AI tool talks to it over stdio — no wizard, no hosted endpoint, no context switch. Your code, your machine, your call.
One prompt. A whole running app.
Describe the product. Your AI tool calls naina, naina returns a tarball, you ship it.
The MCP server exposes naina's full surface as tools: build a design system, generate a homepage, scaffold the app, render the deploy config. Each call is deterministic — same input, same artifact.
{
  "mcpServers": {
    "naina": {
      "command": "naina",
      "args": ["mcp"]
    }
  }
}
The tools naina exposes.
Scaffold the app — a project you can go run. → tarball
Render deploy.yml for your VPS — registry, hosts, healthcheck, secrets. → deploy.yml
Tools are deterministic. Same input, same output. The AI doesn't generate code — it picks which tool to call.
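Under the hood, a tool call is just a JSON-RPC 2.0 message your MCP client writes to naina's stdin. A minimal sketch of that message, assuming a hypothetical tool name and arguments (the real names come from the server's tools/list response):

```python
import json

# Sketch of the JSON-RPC 2.0 request an MCP client sends over stdio.
# The tool name "scaffold_app" and its arguments are hypothetical;
# discover the actual tool names via a tools/list call.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "scaffold_app",
        "arguments": {"description": "a todo app with auth"},
    },
}

# MCP's stdio transport is newline-delimited JSON: one message per line.
wire = json.dumps(request) + "\n"
print(wire, end="")
```

Determinism lives server-side: the same `params` payload always yields the same artifact, so the model's only real decision is which tool to name.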
Works with any MCP client.
Two lines of JSON
Drop the snippet into your client's mcp.json. Restart. The tools show up as autocomplete. That's the entire install.
stdio · no ports
naina speaks the MCP stdio transport. No background daemons, no listening sockets, no firewall rules. Just a process the editor spawns when it needs it.
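That spawn-and-pipe lifecycle can be sketched in a few lines. This uses `cat` as a stand-in echo process so the sketch runs anywhere; a real client would spawn `naina mcp` instead:

```python
import json
import subprocess

# Sketch of the stdio transport: the editor spawns the server as a
# child process and exchanges newline-delimited JSON over its pipes.
# `cat` stands in for the server here, since it mirrors stdin back;
# swap in ["naina", "mcp"] for the real thing.
proc = subprocess.Popen(
    ["cat"],  # stand-in for ["naina", "mcp"]
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

message = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
proc.stdin.write(message + "\n")
proc.stdin.flush()

echoed = proc.stdout.readline()  # no sockets, no ports: just pipes
proc.stdin.close()
proc.wait()
```

When the editor exits, the pipes close and the child goes with it — nothing left listening.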
Local-first by default
Everything runs in the same working directory your editor sits in. Tarballs land next to the rest of your code, not on someone else's S3.
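A sketch of what "lands next to your code" means in practice: unpacking a returned tarball into the working directory. The helper and the archive layout are illustrative, not naina's documented format:

```python
import io
import tarfile

# Illustrative helper: unpack the gzipped tarball a tool call returned
# into the current working directory (or any dest you pass). The
# archive layout shown in tests is hypothetical.
def unpack(tar_bytes: bytes, dest: str = ".") -> list[str]:
    with tarfile.open(fileobj=io.BytesIO(tar_bytes), mode="r:gz") as tar:
        names = [m.name for m in tar.getmembers()]
        tar.extractall(dest)  # files land next to your code, not in a bucket
    return names
```

Because extraction happens wherever your editor is sitting, the artifact is immediately visible to the same AI session that asked for it.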
Ready to ship?
Wire it into your editor.
Two lines of JSON. One command. A running app at the end of the same prompt.