Chat with your agents using an OpenAI-compatible API. Supports both streaming and non-streaming responses.

Examples

import requests

url = "https://labs.chonkie.ai/api/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}

data = {
    "model": "documentation-assistant",
    "messages": [
        {"role": "user", "content": "How do I configure authentication?"}
    ]
}

# Send the request and parse the JSON body
response = requests.post(url, headers=headers, json=data)
response.raise_for_status()
result = response.json()

# The assistant's reply lives in the first choice, OpenAI-style
assistant_message = result["choices"][0]["message"]["content"]
print(f"Assistant: {assistant_message}")

OpenAI SDK Compatibility

Chonkie agents are fully compatible with the OpenAI SDK. Simply configure the base URL and use your agent slug as the model name.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://labs.chonkie.ai/api/v1"
)

response = client.chat.completions.create(
    model="documentation-assistant",  # Use your agent slug
    messages=[
        {"role": "user", "content": "How do I configure authentication?"}
    ]
)

print(response.choices[0].message.content)

Request

Parameters

model (string, required): The agent slug to use (replaces OpenAI’s model parameter).

messages (array, required): Array of message objects in OpenAI format.

stream (boolean, default: false): Enable streaming responses.

max_tokens (integer, optional): Maximum number of tokens in the response.
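Putting the parameters above together, a request body looks like the following (the values are illustrative, not a live call):

```python
# A request body exercising every documented parameter
payload = {
    "model": "documentation-assistant",  # required: your agent slug
    "messages": [                        # required: OpenAI-format messages
        {"role": "user", "content": "How do I configure authentication?"}
    ],
    "stream": False,                     # optional, defaults to false
    "max_tokens": 512,                   # optional cap on response length
}
print(sorted(payload))
```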

Response

Returns (Non-streaming)

OpenAI-compatible chat completion response.

Returns (Streaming)

Server-Sent Events (SSE) with delta chunks.
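Since the endpoint is OpenAI-compatible, each SSE event should be a `data: {...}` line carrying a delta chunk, terminated by a `data: [DONE]` sentinel. Below is a minimal sketch of parsing such a stream; the transcript is a hardcoded example, and the exact wire format should be verified against real responses.

```python
import json

def extract_delta(sse_line: str):
    """Return the text delta from one SSE data line, or None."""
    if not sse_line.startswith("data: "):
        return None  # ignore blank keep-alive lines and comments
    payload = sse_line[len("data: "):].strip()
    if payload == "[DONE]":
        return None  # end-of-stream sentinel
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

# Illustrative transcript of three SSE events:
events = [
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": " world"}}]}',
    "data: [DONE]",
]
text = "".join(d for d in (extract_delta(e) for e in events) if d)
print(text)  # -> Hello world
```

To consume a real stream, set `"stream": True` in the request body, pass `stream=True` to `requests.post`, and feed each line from `response.iter_lines(decode_unicode=True)` through `extract_delta`. The official OpenAI SDK handles this parsing for you when you pass `stream=True` to `client.chat.completions.create`.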