| Parameter | Type | Description |
|---|---|---|
| `model` | string | Provider and model name in the format `provider/model-name` (e.g., `openai/gpt-4o`, `anthropic/claude-3-5-sonnet-20241022`). |
| `messages` | array | Array of messages comprising the conversation. |
| `temperature` | number | Sampling temperature between 0 and 2. |
| `max_tokens` | integer | Maximum number of tokens to generate. |
| `stream` | boolean | Whether to stream the response as server-sent events (SSE). |
| `top_p` | number | Nucleus sampling probability. |
| `n` | integer | Number of completions to generate. |