A protocol for bootstrapping artificial minds
Sense: Voice — speak through any OpenAI-compatible interface.
Responses exposes your agent as an HTTP server compatible with the OpenAI Responses API. Any client that speaks the OpenAI SDK — web UIs, mobile apps, scripts, other agents — can talk to your agent over HTTP.
Once the server is running, any OpenAI SDK client can connect:
```ts
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://127.0.0.1:15210/v1',
  apiKey: 'unused',
});

const response = await client.responses.create({
  model: 'copilot',
  input: 'What PRs need review?',
});

console.log(response.output_text);
```
Or with curl:
```bash
curl -s -X POST http://127.0.0.1:15210/v1/responses \
  -H "Content-Type: application/json" \
  -d '{"model":"copilot","input":"Hello!"}'
```
| Method | Path | Description |
|---|---|---|
| `POST` | `/v1/responses` | Send a message (supports streaming) |
| `GET` | `/history?limit=N` | Retrieve conversation history |
| `GET` | `/health` | Liveness check |
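The `/history` and `/health` endpoints can be exercised with plain `fetch` calls. A minimal sketch, assuming the default port from the examples above; the helper names and the shape of the history response are illustrative, not part of the API:

```ts
// Base URL assumed from the examples above; adjust if the server
// was restarted on a different port.
const BASE = 'http://127.0.0.1:15210';

// Build the history URL; `limit` caps how many entries come back.
function historyUrl(base: string, limit: number): string {
  return `${base}/history?limit=${limit}`;
}

// Hypothetical helpers, not invoked at import time so the module
// loads even when no server is running.
async function fetchHistory(limit = 10): Promise<unknown> {
  const res = await fetch(historyUrl(BASE, limit));
  if (!res.ok) throw new Error(`history request failed: ${res.status}`);
  return res.json();
}

async function isAlive(): Promise<boolean> {
  try {
    const res = await fetch(`${BASE}/health`);
    return res.ok;
  } catch {
    return false; // server not running or mid-restart
  }
}
```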
```bash
curl -N -X POST http://127.0.0.1:15210/v1/responses \
  -H "Content-Type: application/json" \
  -d '{"model":"copilot","input":"Explain this codebase","stream":true}'
```
Stream events follow the OpenAI SSE format. Use `response.output_text.delta` events for incremental text.
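Consuming that stream from TypeScript might look like the sketch below. The SSE parsing is deliberately simplified (it assumes each chunk contains whole `data:` lines); only the `response.output_text.delta` event type is taken from the document above, and `streamReply` is a hypothetical helper:

```ts
// Parse OpenAI-style SSE payload lines ("data: {...}") and join the
// incremental text carried by response.output_text.delta events.
function textFromSse(body: string): string {
  let out = '';
  for (const line of body.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    const payload = line.slice('data: '.length).trim();
    if (payload === '[DONE]') break; // defensive: some streams end this way
    const event = JSON.parse(payload) as { type?: string; delta?: string };
    if (event.type === 'response.output_text.delta' && event.delta) {
      out += event.delta;
    }
  }
  return out;
}

// Hypothetical driver, not invoked at import time: stream a reply
// with fetch and print deltas as chunks arrive.
async function streamReply(input: string): Promise<void> {
  const res = await fetch('http://127.0.0.1:15210/v1/responses', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'copilot', input, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(textFromSse(decoder.decode(value)));
  }
}
```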
For inter-agent communication, or any scenario where the caller shouldn’t block while the agent thinks, set `async: true`:
```bash
curl -s -X POST http://127.0.0.1:15210/v1/responses \
  -H "Content-Type: application/json" \
  -d '{"model":"copilot","input":"Build the auth module","async":true}'
```
Returns `202 Accepted` immediately with a correlation ID:

```json
{ "id": "resp_abc123...", "status": "accepted", "created_at": 1710523200 }
```
The agent processes the request in the background. Without `async: true`, behavior is unchanged: the request blocks until the agent responds.
Result delivery back to the caller is planned for a future release. See issue #38.
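The same fire-and-forget call from TypeScript; a sketch assuming only the fields shown above. Since result delivery is not yet implemented, the correlation ID is all the caller gets back, and `submit` is a hypothetical helper name:

```ts
// Request body for async mode; `async: true` is the only addition
// over a normal blocking request.
function asyncBody(model: string, input: string): string {
  return JSON.stringify({ model, input, async: true });
}

// Hypothetical helper, not invoked at import time: submit work and
// return the correlation ID from the accepted response.
async function submit(input: string): Promise<string> {
  const res = await fetch('http://127.0.0.1:15210/v1/responses', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: asyncBody('copilot', input),
  });
  const { id } = (await res.json()) as { id: string };
  return id; // e.g. "resp_abc123..."; keep it to correlate later
}
```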
| Tool | Description |
|---|---|
| `responses_status` | Show server status, port, and endpoints |
| `responses_restart` | Restart the server (optionally on a different port) |
The server binds to `127.0.0.1` only and is not exposed to the network. To make it reachable externally, pair it with a Dev Tunnel or reverse proxy.
The server starts with the session and stops when the session ends. On `/clear`, the CLI restarts the extension process; external clients should implement retry logic for brief connection-refused windows during transitions.
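One way to implement that retry logic; a sketch with made-up attempt counts and delays, not an official client:

```ts
// Exponential backoff delay: 100ms, 200ms, 400ms, ... capped at 2s.
function backoffMs(attempt: number): number {
  return Math.min(100 * 2 ** attempt, 2000);
}

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));

// Hypothetical wrapper: retry a request across a server restart.
// Connection-refused surfaces as an error thrown by fetch.
async function fetchWithRetry(url: string, attempts = 5): Promise<Response> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fetch(url);
    } catch (err) {
      lastErr = err; // likely ECONNREFUSED during a restart
      await sleep(backoffMs(attempt));
    }
  }
  throw lastErr;
}
```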
Full details: extension README