Chat API

  • The Chat API currently does not support the following parameters:

    • model, temperature, max_tokens

    • stream is always set to false (xNomad Agents currently do not support streaming)

POST /v1/chat/completions

Generates a model response for the given chat conversation.

Authorizations
Body
model — string · optional

ID of the model to use.

temperature — number | nullable · optional

What sampling temperature to use, between 0 and 2.

Default: 1 · Example: 0

max_tokens — integer | nullable · optional

The maximum number of tokens to generate in the chat completion.

Default: 1 · Example: 0

stream — boolean | nullable · optional

If set, partial message deltas will be sent.

Default: false · Example: false
Responses
200

OK

application/json
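A minimal request sketch, using only the Python standard library. The base URL, API key, and the OpenAI-style `messages` array are assumptions (this excerpt does not document them); per the notes above, `model`, `temperature`, and `max_tokens` are not yet supported, and `stream` is always treated as false.

```python
import json

BASE_URL = "https://api.example.com"  # placeholder, not the real host
API_KEY = "YOUR_API_KEY"              # placeholder credential

payload = {
    # `messages` is assumed here in the common OpenAI-compatible shape;
    # it is not shown in this excerpt of the documentation.
    "messages": [
        {"role": "user", "content": "Hello!"}
    ],
    # Per the notes above: model, temperature, and max_tokens are not
    # yet supported, and stream is always set to false by the server.
    "stream": False,
}

body = json.dumps(payload)

# To actually send the request (requires network access):
# import urllib.request
# req = urllib.request.Request(
#     BASE_URL + "/v1/chat/completions",
#     data=body.encode("utf-8"),
#     headers={
#         "Content-Type": "application/json",
#         "Authorization": f"Bearer {API_KEY}",
#     },
# )
# with urllib.request.urlopen(req) as resp:
#     completion = json.loads(resp.read())  # 200 OK, application/json
```

Because streaming is unsupported, the full completion arrives in a single JSON response rather than as partial message deltas.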