OpenAI-compatible chat completions endpoint with streaming support. Applications send messages to the Pulse AI Research Agent and receive streaming responses in the OpenAI-compatible format. The endpoint routes each request to the appropriate agent based on the requested model and maintains conversation history between requests.
End markers: the final data chunk is {"choices":[{"delta":{},"finish_reason":"stop","index":0}]}, followed by the [DONE] sentinel.
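A minimal sketch of parsing the streamed chunks on the client side, assuming the endpoint emits standard server-sent-events `data:` lines as OpenAI-compatible APIs do; the function name here is illustrative, not part of this API:

```python
import json

def parse_sse_chunk(line: str):
    """Parse one server-sent-events line from the stream.

    Returns the delta text for content chunks, and None for
    keep-alive lines, the final stop chunk, and [DONE].
    """
    if not line.startswith("data: "):
        return None          # comment / blank keep-alive line
    payload = line[len("data: "):].strip()
    if payload == "[DONE]":
        return None          # end-of-stream sentinel
    chunk = json.loads(payload)
    choice = chunk["choices"][0]
    if choice.get("finish_reason") == "stop":
        return None          # final empty-delta chunk
    return choice["delta"].get("content", "")

# The two documented end markers both yield no content:
stop_line = 'data: {"choices":[{"delta":{},"finish_reason":"stop","index":0}]}'
print(parse_sse_chunk(stop_line))       # None
print(parse_sse_chunk("data: [DONE]"))  # None
# An ordinary content chunk yields its delta text:
print(parse_sse_chunk('data: {"choices":[{"delta":{"content":"Hi"},"index":0}]}'))  # Hi
```

A client loops over the response lines, appending each non-None return value to the assembled reply, and stops when it sees the [DONE] line.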
Request
Header Params
x-user-email
string
optional
Example:
corey@lineupiq.io
x-app-id
string<password>
required
Default:
{{x-app-id}}
x-api-key
string<password>
required
Default:
{{x-api-key}}
Content-Type
string
optional
Default:
application/json
Body Params application/json
model
string
required
ID of the model to use (e.g., 'control-4v1', 'discovery-pulse-4v1')
Example:
control-4v1
messages
array [object {2}]
required
A list of messages comprising the conversation so far
role
enum<string>
required
The role of the message author
Allowed values:
system, user, assistant
Example:
user
content
string
required
The content of the message
Example:
Tell me about LAIRA Core
stream
boolean
optional
Whether to stream the response (recommended for this endpoint)
Default:
true
Example:
true
Example
{"model":"discovery-pulse-4v1","messages":[{"role":"user","content":"I'm looking for venues in Edmonton that can host a rock concert with about 2000 attendees in August."}]}
Request samples
Shell
curl --location --request POST '/v1/pulse/chat/completions' \
--header 'x-user-email: corey@lineupiq.io' \
--header 'x-app-id: {{x-app-id}}' \
--header 'x-api-key: {{x-api-key}}' \
--header 'Content-Type: application/json' \
--data-raw '{
    "model": "discovery-pulse-4v1",
    "messages": [
        {
            "role": "user",
            "content": "I'\''m looking for venues in Edmonton that can host a rock concert with about 2000 attendees in August."
        }
    ]
}'
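The same request can be sketched in Python using only the standard library; the base host is an assumption (this doc only gives the path), and the {{x-app-id}} / {{x-api-key}} placeholders must be replaced with real credentials:

```python
import json
import urllib.request

BASE_URL = "https://api.example.com"  # assumption: base host is not given in this doc

headers = {
    "x-user-email": "corey@lineupiq.io",
    "x-app-id": "{{x-app-id}}",    # placeholder: substitute your app ID
    "x-api-key": "{{x-api-key}}",  # placeholder: substitute your API key
    "Content-Type": "application/json",
}
payload = {
    "model": "discovery-pulse-4v1",
    "messages": [
        {"role": "user",
         "content": "I'm looking for venues in Edmonton that can host "
                    "a rock concert with about 2000 attendees in August."}
    ],
    "stream": True,  # recommended for this endpoint
}

req = urllib.request.Request(
    BASE_URL + "/v1/pulse/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers=headers,
    method="POST",
)
# urllib.request.urlopen(req) then returns the response; with stream=True,
# iterate its lines and parse each SSE chunk until the [DONE] sentinel.
```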