Chat API

  • The Chat API currently does not support the following parameters (see the request sketch after the Body parameters below):

    • model, temperature, max_tokens

    • stream will be set to false (xNomad Agents do not currently support streaming)

POST /v1/chat/completions

Generates a model response for the given chat conversation.

Authorizations
Body
model · string · Optional

ID of the model to use.

temperature · number · max: 2 · nullable · Optional

What sampling temperature to use, between 0 and 2.

Default: 1 · Example: 0

max_tokens · integer · min: 1 · nullable · Optional

The maximum number of tokens to generate in the chat completion.

Default: 1 · Example: 0

stream · boolean · nullable · Optional

If set, partial message deltas will be sent.

Default: false · Example: false
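
For illustration, here is a minimal sketch of calling this endpoint from Python with the requests library. It is a sketch under stated assumptions: the base URL, the bearer-token Authorization header, and the messages field (the common OpenAI-style chat format) are assumptions that are not documented on this page. Per the note at the top, model, temperature, and max_tokens are omitted and stream is left at its default of false.

```python
import requests

# Assumptions for this sketch (not documented on this page):
BASE_URL = "https://api.example.com"   # hypothetical host
API_KEY = "YOUR_API_KEY"               # hypothetical credential

payload = {
    # `messages` is assumed to follow the common OpenAI-style chat format;
    # it is not documented in the reference above.
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    # model, temperature, and max_tokens are currently not supported,
    # and stream is always treated as false, so none of them are sent.
}

response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",  # assumed bearer-token auth
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=30,
)
response.raise_for_status()
completion = response.json()
print(completion["id"])
```
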
Responses
200

OK

application/json
id · string · Required

A unique identifier for the chat completion.

Example: chatcmpl-prLvANKtOJErUM8VivHyUvhJYm2jH
object · string · enum · Required

The object type, which is always 'chat.completion'.

Example: chat.completion · Possible values: chat.completion

created · integer · Required

The Unix timestamp (in seconds) of when the chat completion was created.

Example: 1745995327
model · string · Optional

The model used for the chat completion.
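
For reference, the sketch below assembles an illustrative response body from the field descriptions and examples above and reads the documented fields back. The model value is hypothetical, and any response fields not covered in this reference are omitted.

```python
import datetime

# Illustrative response shape built only from the documented fields and examples;
# the model value is hypothetical and undocumented fields are omitted.
completion = {
    "id": "chatcmpl-prLvANKtOJErUM8VivHyUvhJYm2jH",
    "object": "chat.completion",   # always 'chat.completion'
    "created": 1745995327,         # Unix timestamp in seconds
    "model": "example-model",      # optional in the response; hypothetical value
}

# Basic checks against the documented schema.
assert completion["object"] == "chat.completion"
created_at = datetime.datetime.fromtimestamp(
    completion["created"], tz=datetime.timezone.utc
)
print(f"{completion['id']} created at {created_at.isoformat()}")
```
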
