Chat API
The Chat API currently does not support the following parameters:
model, temperature, max_tokens
stream will be set to false (xNomad Agents currently do not support streaming).
POST /v1/chat/completions

Generates a model response for the given chat conversation.
Authorizations
Body
model (string, Optional)
ID of the model to use.

temperature (number | nullable, Optional, Default: 1, Example: 0)
What sampling temperature to use, between 0 and 2.

max_tokens (integer | nullable, Optional, Default: 1, Example: 0)
The maximum number of tokens to generate in the chat completion.

stream (boolean | nullable, Optional, Default: false, Example: false)
If set, partial message deltas will be sent.
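A minimal request sketch in Python. The base URL, the Bearer-token Authorization header, and the OpenAI-style messages array are assumptions not confirmed by this page; only the POST /v1/chat/completions path and the body parameters above come from the documentation. Per the note at the top, model, temperature, and max_tokens are currently ignored and stream is always treated as false.

```python
import requests

# Assumptions (not confirmed by this page): base URL, Bearer auth header,
# and an OpenAI-style "messages" array carrying the chat conversation.
BASE_URL = "https://api.example.com"  # placeholder; substitute the real host
API_KEY = "YOUR_API_KEY"

payload = {
    # model, temperature, and max_tokens are currently ignored by the API,
    # and stream is always treated as false, so the conversation is what matters.
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}

resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json=payload,
)
```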
Responses

200: OK
application/json
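A short sketch of handling the 200 response from the request above. The exact fields of the JSON body are not documented on this page, so the sketch relies only on the documented application/json content type and returns the parsed body as-is rather than assuming a schema.

```python
import requests

def handle_chat_response(resp: requests.Response) -> dict:
    """Return the parsed JSON body of a successful chat completion."""
    # The API documents a 200 OK response with an application/json body.
    resp.raise_for_status()
    # Field names are not specified on this page, so return the raw dict.
    return resp.json()
```

This function would be called on the resp object produced by the request sketch above, e.g. data = handle_chat_response(resp).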