# API Reference
The Gambi hub exposes an OpenAI-compatible HTTP API. Any tool or library that works with the OpenAI API can work with Gambi by changing the base URL.
The project was previously published as `gambiarra`. The current public metadata key in `/v1/models` is `gambi`.
## Base URL

All LLM endpoints are scoped to a room:

```
http://<hub-host>:<port>/rooms/<ROOM_CODE>/v1/
```

Example: `http://192.168.1.100:3000/rooms/ABC123/v1/`
## Authentication

No authentication is required. Gambi is designed for trusted local networks. CORS is enabled for all origins.
## Model Routing

The `model` field in requests controls which participant handles it:

| Value | Behavior | Example |
|---|---|---|
| `*` or `any` | Random online participant | `"model": "*"` |
| `model:<name>` | First online participant with that model | `"model": "model:llama3"` |
| `<participant-id>` | Specific participant by ID | `"model": "abc123"` |
| `<model-name>` | Tries participant ID first, then model match | `"model": "llama3"` |
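The lookup order in the table can be sketched as a small resolver. This is an illustrative sketch, not the hub's implementation; the participant shape (`id`, `model`, `online`) is an assumption mirroring the participants listing.

```python
import random

def resolve_participant(model_value, participants):
    """Pick a participant for a request, following the routing table above.

    `participants` is a list of dicts with `id`, `model`, and `online` keys
    (a hypothetical shape mirroring the participant listing).
    """
    online = [p for p in participants if p["online"]]
    if not online:
        return None
    if model_value in ("*", "any"):
        return random.choice(online)  # random online participant
    if model_value.startswith("model:"):
        name = model_value[len("model:"):]
        return next((p for p in online if p["model"] == name), None)
    # Bare value: try participant ID first, then fall back to model name.
    by_id = next((p for p in online if p["id"] == model_value), None)
    return by_id or next((p for p in online if p["model"] == model_value), None)
```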
## LLM Endpoints

### POST /rooms/:code/v1/chat/completions

OpenAI-compatible chat completions proxy. Supports streaming.

Request:

```json
{
  "model": "*",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant" },
    { "role": "user", "content": "Hello!" }
  ],
  "stream": false,
  "temperature": 0.7,
  "max_tokens": 500
}
```

curl (non-streaming):

```sh
curl -X POST http://localhost:3000/rooms/ABC123/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "*",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

curl (streaming):

```sh
curl -X POST http://localhost:3000/rooms/ABC123/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "*",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```

Streaming responses use Server-Sent Events (`data: {...}` chunks, terminated by `data: [DONE]`).
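A client consumes that stream by decoding each `data:` payload as JSON and stopping at `[DONE]`. A minimal sketch, assuming the standard OpenAI chunk shape (`choices[0].delta.content`); exact fields depend on the upstream model server:

```python
import json

def collect_stream(lines):
    """Assemble the assistant message from chat-completions SSE lines.

    `lines` is an iterable of decoded text lines, such as the body of a
    streaming /chat/completions response.
    """
    parts = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank separator lines between events
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if delta.get("content") is not None:
            parts.append(delta["content"])
    return "".join(parts)
```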
### POST /rooms/:code/v1/responses

OpenAI Responses API proxy (primary protocol). Supports streaming.

Request:

```json
{
  "model": "*",
  "input": "Hello!",
  "stream": false
}
```

With instructions:

```json
{
  "model": "*",
  "input": "Translate to Portuguese",
  "instructions": "You are a translator",
  "stream": true
}
```

curl:

```sh
curl -X POST http://localhost:3000/rooms/ABC123/v1/responses \
  -H "Content-Type: application/json" \
  -d '{"model": "*", "input": "Hello!"}'
```
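Since `instructions` is optional, a client typically builds the body conditionally. A small sketch (the `responses_payload` helper is our own, not part of Gambi):

```python
import json

def responses_payload(model, user_input, instructions=None, stream=False):
    """Build a /v1/responses request body (hypothetical helper).

    Only includes `instructions` when given, matching the optional
    field shown above.
    """
    body = {"model": model, "input": user_input, "stream": stream}
    if instructions is not None:
        body["instructions"] = instructions
    return json.dumps(body)
```

The returned string can go straight into a POST body with `Content-Type: application/json`.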
### GET /rooms/:code/v1/models

List available models and participants in the room. Returns OpenAI-compatible format.

curl:

```sh
curl http://localhost:3000/rooms/ABC123/v1/models
```

Response:

```json
{
  "object": "list",
  "data": [
    {
      "id": "participant-id",
      "object": "model",
      "created": 1234567890,
      "owned_by": "alice",
      "gambi": {
        "nickname": "alice",
        "model": "llama3",
        "endpoint": "http://192.168.1.10:11434",
        "capabilities": {
          "openResponses": "supported",
          "chatCompletions": "supported"
        }
      }
    }
  ]
}
```
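The `gambi` metadata makes it easy to see which participants serve a given model. A sketch against the sample response above (the helper name is our own):

```python
def models_by_name(listing):
    """Index a /v1/models response by served model name.

    Uses the `gambi` metadata block shown in the sample response; returns
    a dict mapping model name -> list of participant IDs.
    """
    index = {}
    for entry in listing.get("data", []):
        meta = entry.get("gambi", {})
        index.setdefault(meta.get("model"), []).append(entry["id"])
    return index
```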
## Room Management

### POST /rooms

Create a new room.

```sh
curl -X POST http://localhost:3000/rooms \
  -H "Content-Type: application/json" \
  -d '{"name": "My Room"}'
```

Optional: `"password": "secret"` for password-protected rooms.
Response:

```json
{
  "room": {
    "id": "uuid",
    "code": "ABC123",
    "name": "My Room",
    "hostId": "uuid",
    "createdAt": 1234567890
  },
  "hostId": "uuid"
}
```

### GET /rooms

List all rooms.

```sh
curl http://localhost:3000/rooms
```

### POST /rooms/:code/join

Register a participant in a room.

```sh
curl -X POST http://localhost:3000/rooms/ABC123/join \
  -H "Content-Type: application/json" \
  -d '{
    "id": "my-unique-id",
    "nickname": "alice",
    "model": "llama3",
    "endpoint": "https://openrouter.ai/api",
    "authHeaders": { "Authorization": "Bearer sk-..." }
  }'
```

Optional fields: `"password"`, `"specs"`, `"config"`, `"capabilities"`, `"authHeaders"`.

`authHeaders` are kept only in hub memory and are never returned by participant listing endpoints.
### DELETE /rooms/:code/leave/:participantId

Remove a participant from a room.

```sh
curl -X DELETE http://localhost:3000/rooms/ABC123/leave/my-unique-id
```

### GET /rooms/:code/participants

List all participants in a room with their status.

```sh
curl http://localhost:3000/rooms/ABC123/participants
```

### POST /rooms/:code/health
Send a health check heartbeat. Participants must send this every 10 seconds to stay online. After 30 seconds without a heartbeat, the participant is marked offline.

```sh
curl -X POST http://localhost:3000/rooms/ABC123/health \
  -H "Content-Type: application/json" \
  -d '{"id": "my-participant-id"}'
```
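The 10-second heartbeat and 30-second offline threshold can be sketched client-side as follows. The timer loop is illustrative, not a prescribed client; only the two timing constants come from this endpoint's contract:

```python
import json
import threading
import urllib.request

HEARTBEAT_INTERVAL = 10  # send a heartbeat this often (seconds)
OFFLINE_AFTER = 30       # hub marks a participant offline past this

def is_online(last_heartbeat, now):
    """Mirror the hub's liveness rule: offline after 30s without a heartbeat."""
    return (now - last_heartbeat) <= OFFLINE_AFTER

def start_heartbeat(hub, code, participant_id):
    """POST /rooms/:code/health every 10 seconds from a background timer."""
    def beat():
        body = json.dumps({"id": participant_id}).encode()
        req = urllib.request.Request(
            f"{hub}/rooms/{code}/health", data=body,
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req).close()
        threading.Timer(HEARTBEAT_INTERVAL, beat).start()
    beat()
```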
### GET /health

Check if the hub is running.

```sh
curl http://localhost:3000/health
```

## SSE Events

### GET /rooms/:code/events

Server-Sent Events stream for real-time monitoring.

```sh
curl -N http://localhost:3000/rooms/ABC123/events
```

Events emitted:

- `room:created` — new room created
- `participant:joined` — participant joined the room
- `participant:left` — participant left the room
- `participant:offline` — participant stopped sending health checks
- `llm:request` — request was routed to a participant
- `llm:error` — request to a participant failed
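To react to these events programmatically, split the stream into (event, data) pairs. This sketch assumes named SSE events (an `event:` line followed by `data:` lines); adjust it if the hub encodes the event name inside the JSON payload instead:

```python
import json

def parse_sse(lines):
    """Split an SSE stream into (event_name, payload) pairs.

    Events are separated by blank lines; multiple `data:` lines within
    one event are joined, per the SSE format.
    """
    events, name, data = [], "message", []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and data:
            events.append((name, json.loads("\n".join(data))))
            name, data = "message", []
    return events
```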
## Using with AI Tools

Point any tool that accepts an OpenAI-compatible base URL to:

```
http://<hub>:<port>/rooms/<CODE>/v1
```

### Lovable / Cursor / any OpenAI-compatible client

- Base URL: `http://192.168.1.100:3000/rooms/ABC123/v1`
- API Key: any string (not validated)
- Model: `*` (any available) or a specific model name
### Python (openai library)

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://192.168.1.100:3000/rooms/ABC123/v1",
    api_key="not-needed",
)

response = client.chat.completions.create(
    model="*",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

### JavaScript (fetch)
```js
const response = await fetch(
  "http://localhost:3000/rooms/ABC123/v1/chat/completions",
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "*",
      messages: [{ role: "user", content: "Hello!" }],
    }),
  },
);
const data = await response.json();
console.log(data.choices[0].message.content);
```