LLMResourceFunction
LLMResourceFunction["name"]
retrieves an LLMFunction with the specified name.
LLMResourceFunction[loc]
imports an LLMFunction from the specified location.
LLMResourceFunction[…][params]
applies the specified LLMFunction to the parameters params.
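As an illustrative sketch, a named prompt function can be retrieved and applied in one step; "Emojify" is a prompt from the Wolfram Prompt Repository, and the output will vary with the underlying model:

```wolfram
(* retrieve a prompt function by name from the Wolfram Prompt Repository *)
emojify = LLMResourceFunction["Emojify"];

(* apply it to a string; requires LLM service authentication and connectivity *)
emojify["I love burritos"]
```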
Details and Options
- LLMResourceFunction can retrieve resources stored locally, in the cloud or in the Wolfram Prompt Repository.
- LLMResourceFunction requires external service authentication, billing and internet connectivity.
- In LLMResourceFunction["name"], the name must be published in a public repository, registered using ResourceRegister or previously deployed from a definition notebook.
- LLMResourceFunction[loc] accepts locations of previous prompt resource deployments including LocalObject and CloudObject.
- The LLMFunction returned by LLMResourceFunction can contain a prompt as well as interpretation and configuration settings. Use LLMPrompt to access the prompt directly.
- LLMResourceFunction supports the following options:

    Authentication   Automatic       explicit user ID and API key
    LLMEvaluator     $LLMEvaluator   LLM configuration to use

- LLMEvaluator can be set to an LLMConfiguration object or an association with any of the following keys:
    "Model"                   base model
    "Temperature"             sampling temperature
    "TotalProbabilityCutoff"  sampling probability cutoff (nucleus sampling)
    "Prompts"                 prompts
    "PromptDelimiter"         delimiter to use between prompts
    "StopTokens"              tokens on which to stop generation
    "Tools"                   list of LLMTool objects to use
    "ToolPrompt"              prompt for specifying tool format
    "ToolRequestParser"       function for parsing tool requests
    "ToolResponseString"      function for serializing tool responses

- Valid forms of "Model" include:
    name                                               named model
    {service, name}                                    named model from service
    <|"Service"→service, "Name"→name, "Task"→task|>    fully specified model

- The generated text is sampled from a distribution. Details of the sampling can be specified using the following properties of the LLMEvaluator:
    "Temperature"→t               Automatic   sample using a positive temperature t
    "TotalProbabilityCutoff"→p    Automatic   sample among the most probable choices with an accumulated probability of at least p (nucleus sampling)

- The setting "Temperature"→Automatic resolves to zero temperature within LLMFunction.
- Multiple prompts are separated by the "PromptDelimiter" property of the LLMEvaluator.
- Possible values for Authentication are:
    Automatic           choose the authentication scheme automatically
    Environment         check for a key in the environment variables
    SystemCredential    check for a key in the system keychain
    ServiceObject[…]    inherit the authentication from a service object
    assoc               provide explicit key and user ID

- With Authentication→Automatic, the function checks for the key "OPENAI_API_KEY" in Environment and SystemCredential; otherwise, it uses ServiceConnect["OpenAI"].
- When using Authentication→assoc, assoc can contain the following keys:
    "ID"        user identity
    "APIKey"    API key used to authenticate

- LLMResourceFunction uses machine learning. Its methods, training sets and biases included therein may change and yield varied results in different versions of the Wolfram Language.
Examples
Basic Examples (2)
Scope (1)
Retrieve a prompt from a ResourceObject:
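A sketch of such a retrieval, building the function from a ResourceObject and then inspecting its prompt with LLMPrompt (the resource name "Emojify" is illustrative):

```wolfram
(* get the resource object for a prompt, then build the function from it *)
ro = ResourceObject["Emojify"];
f = LLMResourceFunction[ro];

(* the underlying prompt text can be accessed directly *)
LLMPrompt["Emojify"]
```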
Options (1)
LLMEvaluator (1)
With the default evaluator, LLMResourceFunction uses a zero temperature and usually gives the same result for the same input:
Use the LLMEvaluator option to set a nonzero temperature in order to create more random results:
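The two behaviors might be sketched as follows; the prompt name is illustrative and the generated text will vary:

```wolfram
(* default evaluator: zero temperature, so repeated calls typically agree *)
f = LLMResourceFunction["Emojify"];
{f["good morning"], f["good morning"]}

(* a nonzero temperature introduces sampling randomness across calls *)
g = LLMResourceFunction["Emojify", LLMEvaluator -> <|"Temperature" -> 1.5|>];
{g["good morning"], g["good morning"]}
```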
Text
Wolfram Research (2023), LLMResourceFunction, Wolfram Language function, https://reference.wolfram.com/language/ref/LLMResourceFunction.html.
CMS
Wolfram Language. 2023. "LLMResourceFunction." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/LLMResourceFunction.html.
APA
Wolfram Language. (2023). LLMResourceFunction. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/LLMResourceFunction.html