"Anthropic" (Service Connection)
This service connection requires LLM access »
Connecting & Authenticating
ServiceConnect["Anthropic"] creates a connection to the Anthropic API. If a previously saved connection can be found, it will be used; otherwise, a new authentication request will be launched.
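A minimal connection sketch (on first use this launches an interactive authentication dialog asking for an API key):

```wolfram
(* create or reuse a saved connection to the Anthropic API *)
conn = ServiceConnect["Anthropic"]
```

The resulting ServiceObject can be passed to ServiceExecute in place of the service name.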
Requests
ServiceExecute["Anthropic","request",params] sends a request to the Anthropic API using parameters params. The following lists the possible requests.
"Chat" — create a response for the given chat conversation
| "Messages" | (required) | a list of messages in the conversation, each given as an association with "Role" and "Content" keys |
| "MaxTokens" | Automatic | maximum number of tokens to generate |
| "Metadata" | Automatic | metadata about the request |
| "Model" | Automatic | name of the model to use |
| "StopTokens" | None | up to four strings at which the API will stop generating further tokens |
| "Stream" | False | whether to return the result as server-sent events |
| "Temperature" | Automatic | sampling temperature (between 0 and 1) |
| "TopProbabilities" | Automatic | sample only among the k highest-probability tokens |
| "TotalProbabilityCutoff" | None | sample among the most probable tokens whose accumulated probability is at least p (nucleus sampling) |
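Putting the parameters together, a minimal "Chat" request might look like the following (a sketch; the model name is an assumption and the generated text will vary):

```wolfram
ServiceExecute["Anthropic", "Chat",
 {"Messages" -> {<|"Role" -> "user", "Content" -> "Hello!"|>},
  "Model" -> "claude-3-5-sonnet-latest",  (* assumed model name *)
  "MaxTokens" -> 100}]
```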
Examples
Scope (1)
Chat (1)
Respond to a chat containing multiple messages:
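A sketch of such a request, alternating "user" and "assistant" roles (the conversation content is illustrative and the model's reply will vary):

```wolfram
ServiceExecute["Anthropic", "Chat",
 {"Messages" -> {
    <|"Role" -> "user", "Content" -> "What is the capital of France?"|>,
    <|"Role" -> "assistant", "Content" -> "The capital of France is Paris."|>,
    <|"Role" -> "user", "Content" -> "And of Spain?"|>}}]
```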
Change the sampling temperature:
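For example, a higher "Temperature" makes sampling more random (values between 0 and 1; the prompt here is illustrative):

```wolfram
ServiceExecute["Anthropic", "Chat",
 {"Messages" -> {<|"Role" -> "user",
     "Content" -> "Invent a name for a coffee shop."|>},
  "Temperature" -> 0.9}]
```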
Increase the maximum number of tokens returned:
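This is controlled with the "MaxTokens" parameter (a sketch with an illustrative prompt):

```wolfram
ServiceExecute["Anthropic", "Chat",
 {"Messages" -> {<|"Role" -> "user",
     "Content" -> "Describe the ocean."|>},
  "MaxTokens" -> 500}]
```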
Allow the model to use an LLMTool:
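One way to make a tool available is through LLMSynthesize with an LLMConfiguration targeting this service connection (a sketch; the tool name and prompt are illustrative):

```wolfram
tool = LLMTool["random_number", {}, RandomReal[] &];
LLMSynthesize["Pick a random number between 0 and 1.",
 LLMEvaluator -> LLMConfiguration[
   <|"Model" -> <|"Service" -> "Anthropic"|>, "Tools" -> {tool}|>]]
```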
See Also
ServiceExecute ▪ ServiceConnect ▪ LLMFunction ▪ LLMSynthesize ▪ ChatEvaluate ▪ LLMConfiguration
Service Connections: AlephAlpha ▪ Cohere ▪ DeepSeek ▪ GoogleGemini ▪ Groq ▪ MistralAI ▪ OpenAI ▪ TogetherAI