@neuraldraft/mcp is a stdio MCP server published on npm. Every supported
client spawns it as a local process and talks JSON-RPC over stdio. You only
need three things:
- Node 18.18 or newer.
- A Neural Draft API key (`ndsk_live_...` or `ndsk_test_...`).
- Your editor of choice.
## Claude Code

The fastest path. The CLI handles everything, writing the server entry to
`~/.config/claude-code/mcp.json` (or your platform’s equivalent). To verify,
restart Claude Code and run `/mcp` — you should see `neural-draft` listed
under “Connected servers”.
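Assuming the standard `claude mcp add` syntax (exact flags may differ by CLI
version; the API key below is a placeholder), a one-liner looks like:

```shell
# Register the server with Claude Code; -e passes env vars to the spawned process.
claude mcp add neural-draft \
  -e NEURAL_DRAFT_API_KEY=ndsk_live_your_key_here \
  -- npx -y @neuraldraft/mcp
```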
If you’d rather edit the config by hand:
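A hand-written entry would follow the common MCP client config shape — this is
a sketch assuming the standard `mcpServers` layout, with a placeholder key:

```json
{
  "mcpServers": {
    "neural-draft": {
      "command": "npx",
      "args": ["-y", "@neuraldraft/mcp"],
      "env": {
        "NEURAL_DRAFT_API_KEY": "ndsk_live_your_key_here"
      }
    }
  }
}
```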
## Cursor

Cursor reads `.cursor/mcp.json` either at the workspace root (project-scoped)
or at `~/.cursor/mcp.json` (global).
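The entry uses the same shape as other MCP clients; a sketch assuming Cursor’s
standard `mcpServers` config (the key is a placeholder):

```json
{
  "mcpServers": {
    "neural-draft": {
      "command": "npx",
      "args": ["-y", "@neuraldraft/mcp"],
      "env": {
        "NEURAL_DRAFT_API_KEY": "ndsk_live_your_key_here"
      }
    }
  }
}
```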
After a reload, `neural-draft` appears under “MCP”. When you ask Cursor to
generate a section, it’ll auto-call `register_component`.
Putting an API key in a workspace file checked into git is a footgun. Use the
global `~/.cursor/mcp.json` for personal keys, or a `.cursor/mcp.json` with an
env var indirection (e.g. `"NEURAL_DRAFT_API_KEY": "${env:NEURAL_DRAFT_API_KEY}"`)
and `.gitignore` the file.

## Continue
Add the server to `~/.continue/config.yaml`:
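A sketch assuming Continue declares MCP servers under an `mcpServers` list in
its YAML config (the key is a placeholder; check your Continue version’s schema):

```yaml
mcpServers:
  - name: neural-draft
    command: npx
    args:
      - "-y"
      - "@neuraldraft/mcp"
    env:
      NEURAL_DRAFT_API_KEY: ndsk_live_your_key_here
```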
## Lovable, v0, Bolt — the workaround
These hosted tools don’t yet support MCP. For now, paste the system prompt at
frameworks/lovable into your build prompt — it teaches the model the same
conventions the MCP server enforces. When Lovable / v0 / Bolt add MCP support,
this server will work as-is.

## Multiple projects on one machine
The cleanest pattern is one server entry per project, named for the project.
The AI client UI shows them separately, so you can flip between projects
without changing env vars.
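For example, two project-scoped entries might look like this (the entry names,
keys, and `NEURAL_DRAFT_PROJECT_ID` values are all placeholders):

```json
{
  "mcpServers": {
    "neural-draft-marketing": {
      "command": "npx",
      "args": ["-y", "@neuraldraft/mcp"],
      "env": {
        "NEURAL_DRAFT_API_KEY": "ndsk_live_marketing_key",
        "NEURAL_DRAFT_PROJECT_ID": "your-marketing-project-id"
      }
    },
    "neural-draft-docs": {
      "command": "npx",
      "args": ["-y", "@neuraldraft/mcp"],
      "env": {
        "NEURAL_DRAFT_API_KEY": "ndsk_live_docs_key",
        "NEURAL_DRAFT_PROJECT_ID": "your-docs-project-id"
      }
    }
  }
}
```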
## Environment variables

The server reads three env vars:

| Variable | Required | Default |
|---|---|---|
| `NEURAL_DRAFT_API_KEY` | Yes | — |
| `NEURAL_DRAFT_API_URL` | No | `https://api.neuraldraft.io` |
| `NEURAL_DRAFT_PROJECT_ID` | No | (resolved from the API key) |
Set `NEURAL_DRAFT_API_URL` only when pointing at a self-hosted API or our
staging environment (`https://api.staging.neuraldraft.io`).
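To target staging, add the override alongside the key in whichever client
config you use — a fragment sketch, with a placeholder key:

```json
"env": {
  "NEURAL_DRAFT_API_KEY": "ndsk_test_your_key_here",
  "NEURAL_DRAFT_API_URL": "https://api.staging.neuraldraft.io"
}
```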
## Verifying it works

After install, ask the AI: “Read the Neural Draft brand context for this
project.” You should see the AI invoke the `brand://current` resource. If it
doesn’t, the server didn’t connect. Common causes:
| Symptom | Fix |
|---|---|
| `npx: command not found` | Install Node 18.18+. Check with `node --version`. |
| Server starts then crashes immediately | Check the API key prefix; it must be `ndsk_live_…` or `ndsk_test_…`. The server validates on boot. |
| AI doesn’t see the tools | Most clients require a workspace reload after editing the MCP config. Restart the editor. |
| Cursor shows “neural-draft” but no tools | Stale Node cache. Run `npx -y @neuraldraft/mcp@latest` to force a re-download. |
| `429` on every tool call | You’re inside a tight loop. Throttle your AI’s session, or upgrade to a higher rate limit. |
For deeper debugging, run the server under the MCP Inspector, open
http://localhost:6274, and try `brand://current` directly.
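Port 6274 is the MCP Inspector’s default UI port; assuming the standard
inspector invocation (the key is a placeholder), you’d run:

```shell
# Wrap the Neural Draft server in the MCP Inspector, then open the UI it prints.
NEURAL_DRAFT_API_KEY=ndsk_test_your_key_here \
  npx @modelcontextprotocol/inspector npx -y @neuraldraft/mcp
```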
## What you get next

- **Tools and resources**: the full reference — every tool, every resource,
  every parameter.
- **Snippets**: drop-in framework integrations once you’ve got the AI
  generating Neural Draft-native code.