cURL
```bash
curl --request POST \
  --url https://api.agentserp.com/chat/completions \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "<string>",
    "messages": [
      {
        "role": "system",
        "content": "<string>"
      }
    ],
    "search": {
      "domain_filter": ["<string>"],
      "recency": "day"
    },
    "stream": true,
    "user_context": {}
  }'
```
{ "request_id": "<string>", "data": { "choices": [ { "index": 123, "message": { "role": "system", "content": "<string>" } } ] }, "meta": { "api_version": "<string>", "processing_ms": 123, "tokens_used": 123, "cost": { "total": 1, "breakdown": {} } } }
```bash
curl -X POST https://api.agentserp.com/chat/completions \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello, who are you?"}]}'
```
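The stream field (described below) switches the response to server-sent events. A minimal streaming sketch, assuming the stream is delivered over the same route and that printing events as they arrive is enough (curl's -N disables output buffering):

```bash
# Sketch: the quick-start call with streaming enabled.
# -N / --no-buffer makes curl print each streamed chunk as soon as it arrives.
curl -N -X POST https://api.agentserp.com/chat/completions \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "stream": true, "messages": [{"role": "user", "content": "Hello, who are you?"}]}'
```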
Input for POST /chat/completions

model (string): LLM engine to use.
messages (object[], minimum length 1): conversation turns; each item carries a role and a content string.
search (object, optional): search-control block for chat; supports domain_filter (string[]) and recency (e.g. "day"). See the sketch after this list.
stream (boolean): if true, the response is delivered as an SSE stream; otherwise a single aggregated JSON object is returned.
user_context (object): opaque dict preserved across turns.
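A sketch of a non-streaming request that exercises the optional fields; the domain, system prompt, and user_context keys are illustrative values, not part of the documented schema:

```bash
# Sketch: restrict search to one domain, prefer results from the last day,
# disable streaming, and pass an opaque user_context preserved across turns.
curl -X POST https://api.agentserp.com/chat/completions \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {"role": "system", "content": "Answer with cited sources."},
      {"role": "user", "content": "What changed in the latest release?"}
    ],
    "search": {
      "domain_filter": ["example.com"],
      "recency": "day"
    },
    "stream": false,
    "user_context": {"session": "demo-1"}
  }'
```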
Chat response

The 200 body is a universal response wrapper:

request_id (string): echo of the client-supplied ID, or a server-assigned UUID.
data (object): endpoint-specific payload; here, a choices array of index/message pairs (extracted in the sketch below).
meta (object): metadata block attached to every response: api_version, processing_ms, tokens_used, and cost.
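As a usage sketch, assuming a non-streaming call and that jq is installed, the wrapper fields can be read back like this; the paths mirror the example response above:

```bash
# Sketch: non-streaming call, then pull fields out of the response wrapper with jq.
response=$(curl -s -X POST https://api.agentserp.com/chat/completions \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "stream": false, "messages": [{"role": "user", "content": "Hello, who are you?"}]}')

echo "$response" | jq -r '.data.choices[0].message.content'   # assistant reply
echo "$response" | jq -r '.meta.tokens_used'                   # tokens consumed
echo "$response" | jq -r '.request_id'                         # correlation ID
```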