LLMResourceFunction

LLMResourceFunction["name"]

retrieves an LLMFunction with the specified name.

LLMResourceFunction[loc]

imports an LLMFunction from the specified location.

LLMResourceFunction[…][params]

applies the specified LLMFunction to the parameters params.

Details and Options

  • LLMResourceFunction can retrieve resources stored locally, in the cloud, or in the Wolfram Prompt Repository.
  • LLMResourceFunction requires external service authentication, billing and internet connectivity.
  • In LLMResourceFunction["name"], the name must be published in a public repository, registered using ResourceRegister or previously deployed from a definition notebook.
  • LLMResourceFunction[loc] accepts locations of previous prompt resource deployments, including LocalObject and CloudObject.
  • The LLMFunction returned by LLMResourceFunction can contain a prompt as well as interpretation and configuration settings. Use LLMPrompt to access the prompt directly.
  • LLMResourceFunction supports the following options:
    Authentication   Automatic       explicit user ID and API key
    LLMEvaluator     $LLMEvaluator   LLM configuration to use
  • LLMEvaluator can be set to an LLMConfiguration object or an association with any of the following keys:
    "Model"                   base model
    "Temperature"             sampling temperature
    "TotalProbabilityCutoff"  sampling probability cutoff (nucleus sampling)
    "Prompts"                 prompts
    "PromptDelimiter"         delimiter to use between prompts
    "StopTokens"              tokens on which to stop generation
    "Tools"                   list of LLMTool objects to use
    "ToolPrompt"              prompt for specifying tool format
    "ToolRequestParser"       function for parsing tool requests
    "ToolResponseString"      function for serializing tool responses
  • Valid forms of "Model" include:
    name                                                named model
    {service, name}                                     named model from service
    <|"Service"->service, "Name"->name, "Task"->task|>  fully specified model
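As an illustration of these model forms, the {service, name} specification can be passed through the LLMEvaluator option; the prompt name "Emojify" and the service/model names below are illustrative assumptions, not prescribed values:

```wolfram
(* sketch: choose a named model from a specific service;
   "Emojify" and {"OpenAI", "gpt-4"} are example values *)
f = LLMResourceFunction["Emojify",
  LLMEvaluator -> <|"Model" -> {"OpenAI", "gpt-4"}|>]
```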
  • The generated text is sampled from a distribution. Details of the sampling can be specified using the following properties of the LLMEvaluator:
    "Temperature"->t             Automatic   sample using a positive temperature t
    "TotalProbabilityCutoff"->p  Automatic   sample among the most probable choices with an accumulated probability of at least p (nucleus sampling)
  • The setting "Temperature"->Automatic resolves to zero temperature within LLMFunction.
  • Multiple prompts are separated by the "PromptDelimiter" property of the LLMEvaluator.
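The prompt-combination behavior above can be sketched as follows; the prompt texts and delimiter are illustrative assumptions:

```wolfram
(* sketch: multiple prompts joined using a custom "PromptDelimiter";
   the prompt strings here are example values *)
config = LLMConfiguration[<|
   "Prompts" -> {"You are terse.", "Answer in French."},
   "PromptDelimiter" -> "\n---\n"|>];
f = LLMResourceFunction["Emojify", LLMEvaluator -> config]
```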
  • Possible values for Authentication are:
    Automatic           choose the authentication scheme automatically
    Environment         check for a key in the environment variables
    SystemCredential    check for a key in the system keychain
    ServiceObject[…]    inherit the authentication from a service object
    assoc               provide explicit key and user ID
  • With Authentication->Automatic, the function checks the variable "OPENAI_API_KEY" in Environment and SystemCredential; otherwise, it uses ServiceConnect["OpenAI"].
  • When using Authentication->assoc, assoc can contain the following keys:
    "ID"      user identity
    "APIKey"  API key used to authenticate
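The explicit-authentication form above can be sketched as follows; the identity and key values are placeholders, not real credentials:

```wolfram
(* sketch: Authentication -> assoc with placeholder values;
   never hard-code real API keys in notebooks *)
f = LLMResourceFunction["Emojify",
  Authentication -> <|"ID" -> "user@example.com", "APIKey" -> "sk-..."|>]
```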
  • LLMResourceFunction uses machine learning. Its methods, training sets and biases included therein may change and yield varied results in different versions of the Wolfram Language.

Examples


Basic Examples  (2)

Retrieve an LLMFunction from the Wolfram Prompt Repository:
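The retrieved cell was stripped in this export; a minimal sketch, assuming "Emojify" as an illustrative Wolfram Prompt Repository name:

```wolfram
(* retrieve the repository prompt as an LLMFunction *)
f = LLMResourceFunction["Emojify"]
```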

Immediately use a prompt:
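A sketch of applying the prompt in a single expression, again assuming the illustrative name "Emojify":

```wolfram
(* retrieve and apply the prompt in one step *)
LLMResourceFunction["Emojify"]["I am very happy to see you!"]
```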

Scope  (1)

Retrieve a prompt from a ResourceObject:
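A sketch of the ResourceObject form, using the same illustrative prompt name:

```wolfram
(* obtain the ResourceObject first, then convert it to an LLMFunction *)
ro = ResourceObject["Emojify"];
f = LLMResourceFunction[ro]
```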

Options  (1)

LLMEvaluator  (1)

With the default evaluator, LLMResourceFunction uses a zero temperature and usually gives the same result for the same input:
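A sketch of the default-evaluator behavior described above, with an illustrative prompt name and input:

```wolfram
(* at the default zero temperature, repeated calls typically agree *)
f = LLMResourceFunction["Emojify"];
{f["good morning"], f["good morning"]}
```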

Use the LLMEvaluator option to set a nonzero temperature in order to create more random results:
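A sketch of raising the temperature through LLMEvaluator; the prompt name and temperature value are illustrative:

```wolfram
(* a nonzero temperature makes repeated calls more likely to differ *)
g = LLMResourceFunction["Emojify",
   LLMEvaluator -> <|"Temperature" -> 1.5|>];
{g["good morning"], g["good morning"]}
```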

Text

Wolfram Research (2023), LLMResourceFunction, Wolfram Language function, https://reference.wolfram.com/language/ref/LLMResourceFunction.html.

CMS

Wolfram Language. 2023. "LLMResourceFunction." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/LLMResourceFunction.html.

APA

Wolfram Language. (2023). LLMResourceFunction. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/LLMResourceFunction.html

BibTeX

@misc{reference.wolfram_2023_llmresourcefunction, author="Wolfram Research", title="{LLMResourceFunction}", year="2023", howpublished="\url{https://reference.wolfram.com/language/ref/LLMResourceFunction.html}", note="[Accessed: 27-February-2024]"}

BibLaTeX

@online{reference.wolfram_2023_llmresourcefunction, organization={Wolfram Research}, title={LLMResourceFunction}, year={2023}, url={https://reference.wolfram.com/language/ref/LLMResourceFunction.html}, note={[Accessed: 27-February-2024]}}