POST /chat/send-chat-message

message
string
required
The user message to send to the Agent.
llm_override
object
Override the default LLM settings for this request. Fields left as None keep the default behavior, so you can, for example, update only the temperature value. Note that an invalid combination, such as specifying claude-4.5 when the default model_provider is OpenAI, will cause the request to fail.
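As a sketch, an llm_override might adjust only the temperature and leave everything else unset. The sub-field names below (model_provider, model_version, temperature) are assumptions based on the description above; check your deployment's schema for the exact keys.

```python
# Hypothetical sketch: override only the temperature for a single request.
# Sub-field names ("model_provider", "model_version", "temperature") are
# assumptions, not confirmed schema.
payload = {
    "message": "Summarize our onboarding docs.",
    "llm_override": {
        "model_provider": None,  # None -> keep the default provider
        "model_version": None,   # None -> keep the default model
        "temperature": 0.2,      # only this setting is changed
    },
}
```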
allowed_tool_ids
list[int]
The set of tools/actions made available to the Agent. Retrieve the tools/actions and their IDs by sending a GET to https://cloud.onyx.app/api/tool (or the equivalent path on your own Onyx deployment). Pass an empty list to disallow all tools/actions, or pass null to allow every tool/action available to the specific Agent of the chat session.
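The three cases described above can be sketched as payloads (the numeric tool IDs here are hypothetical; real IDs come from the GET /api/tool endpoint mentioned above):

```python
# allowed_tool_ids semantics, per the description above.
# IDs 1 and 2 are hypothetical placeholders for IDs from GET /api/tool.
no_tools = {"message": "Answer from memory only.", "allowed_tool_ids": []}
some_tools = {"message": "Search the docs.", "allowed_tool_ids": [1, 2]}
all_tools = {"message": "Use whatever is needed.", "allowed_tool_ids": None}
```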
forced_tool_id
int
A specific tool/action which the Agent must run. The Agent may run other tools/actions before returning its final response to the user, but it is guaranteed to use this one. Leave empty to let the Agent choose freely.
file_descriptors
list[int]
A list of file IDs to include along with the message. These IDs are returned by the file upload API.
search_filters
object
Filters to narrow down the internal search results used by the Agent. All filters are optional and can be combined.
deep_research
boolean
Turn on to use the Deep Research flow. Note that this mode is much more token-intensive, so be careful if accessing it programmatically.
parent_message_id
integer
The ID of the parent message in the chat history, i.e. the primary key (unique identifier) of the message that this new message follows in the tree. If not passed in, the new message is placed on top of the latest message with no branching/editing. If set to null explicitly, the new message is treated as an edit of the root message instead.
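The branching semantics above can be sketched with three payloads (the message ID is a hypothetical value, which in practice is returned by a previous call):

```python
# Append on top of the latest message (no branching): omit parent_message_id.
follow_up = {"message": "And what about pricing?"}

# Branch from a specific earlier message (hypothetical ID 42),
# e.g. to explore an alternative follow-up in the tree.
branch = {"message": "Try a different angle.", "parent_message_id": 42}

# Explicit null: treat the new message as an edit of the root message.
edit_root = {"message": "Rewritten first question.", "parent_message_id": None}
```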
chat_session_id
integer
The ID of the chat session where the message is sent. Specify it to send follow-ups within an existing conversation. If left blank, a new chat session is created for the message according to chat_session_info (see below).
chat_session_info
object
Specify details about the chat session which will be used for all messages in the session. The field values can be left blank to use the default chat settings.
stream
boolean
If true, the endpoint responds with an SSE stream of individual packets (the same set used by the Onyx UI). Fields like the answer, reasoning tokens, and iterative tool calls need to be pieced together from the streamed tokens. If false, a single response is returned at the end with the fields described in the section below. Recommended for most developers first trying out Onyx.
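A sketch of piecing an answer together from the stream, assuming packets arrive as SSE `data:` lines carrying JSON. The exact packet schema (e.g. an "answer_piece" key) is an assumption here, so this only shows the plumbing; verify the framing against your deployment.

```python
import json


def parse_sse_line(line: str):
    """Parse a single SSE line into a JSON packet, or None for non-data lines.

    Assumption: packets arrive as `data: {...}` lines; the key names inside
    each packet ("answer_piece" below) are hypothetical.
    """
    if not line.startswith("data: "):
        return None
    return json.loads(line[len("data: "):])


# Piece an answer together from hypothetical streamed packets.
raw_lines = [
    'data: {"answer_piece": "Onyx is "}',
    'data: {"answer_piece": "an open-source AI platform."}',
    ": keep-alive comment",  # SSE comment lines are skipped
]
packets = [p for p in (parse_sse_line(l) for l in raw_lines) if p]
answer = "".join(p.get("answer_piece", "") for p in packets)
```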

Response Format

The response will come back with all of the following:
  • Answer to the query
  • Answer with citations removed
  • Any intermediate reasoning
  • Tool call details (reasoning, call arguments, tool responses)
  • Any referenced documents and citations
  • Chat Session ID to continue the conversation
  • Message ID for use cases where conversation branching logic is needed
  • Any errors that occurred during the call

Sample Request

import requests

API_BASE_URL = "https://cloud.onyx.app/api"  # or your own domain
API_KEY = "YOUR_KEY_HERE"

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json"
}

response = requests.post(
    f"{API_BASE_URL}/chat/send-chat-message",
    headers=headers,
    json={
        "message": "What is Onyx?",
    }
)

data = response.json()
print("Answer:", data["answer"])
print("Message ID:", data["message_id"])
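To continue the conversation, a follow-up request can reuse the IDs from the first response. A sketch, with hypothetical placeholder values standing in for a real response:

```python
# Values here are hypothetical stand-ins for fields of a real first response.
first_response = {"chat_session_id": 7, "message_id": 101}

# Reuse the session ID to stay in the same conversation, and the message ID
# as the parent to append (or branch) from that message.
follow_up = {
    "message": "Can you expand on that?",
    "chat_session_id": first_response["chat_session_id"],
    "parent_message_id": first_response["message_id"],
}
```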

Next Steps