"MistralAI" (Service Connection)
This service connection requires an external account »
Use the Mistral AI API with the Wolfram Language.
Connecting & Authenticating
ServiceConnect["MistralAI"] creates a connection to the Mistral AI API. If a previously saved connection can be found, it will be used; otherwise, a new authentication request will be launched.
Requests
ServiceExecute["MistralAI","request",params] sends a request to the Mistral AI API using parameters params. The following gives possible requests.
"TestConnection" — returns Success for working connection, Failure otherwise
Text
"Chat" — create a response for the given chat conversation
"Messages" | (required) | a list of messages in the conversation | |
"MaxTokens" | Automatic | maximum number of tokens to generate | |
"Model" | Automatic | name of the model to use | |
"Stream" | False | return the result as server-sent events | |
"Temperature" | Automatic | sampling temperature (between 0 and 1) | |
"ToolChoice" | Automatic | which (if any) tool is called by the model | |
"Tools" | Automatic | one or more LLMTool objects available to the model | |
"TopProbabilities" | Automatic | sample only among the k highest-probability classes | |
"TotalProbabilityCutoff" | None | sample among the most probable classes with an accumulated probability of at least p (nucleus sampling) |
"Embedding" — create an embedding vector representing the input text
"Input" | (required) | one or a list of texts to get embeddings for | |
"Model" | Automatic | name of the model to use |
Model Lists
"ChatModelList" — list models available for the "Chat" request
"EmbeddingModelList" — list models available for the "Embedding" request