Question 16 - Certified AI Specialist discussion


An AI Specialist built a Field Generation prompt template that worked for many records, but users are reporting random failures with token limit errors.

What is the cause of the random nature of this error?

A. The number of tokens generated by the dynamic nature of the prompt template will vary by record.

B. The template type needs to be switched to Flex to accommodate the variable amount of tokens generated by the prompt grounding.

C. The number of tokens that can be processed by the LLM varies with total user demand.
Suggested answer: A

Explanation:

The token limit errors stem from the dynamic nature of the prompt template used in Field Generation. In Salesforce's generative AI features, each prompt and its corresponding output are subject to a token limit that encompasses both the input and the output of the large language model (LLM). Because the prompt template is filled in with the specific data of each record, the number of tokens varies per record. Records with longer field values produce longer prompts (and often longer outputs), pushing the combined token count beyond the model's allowable limit and resulting in token limit errors.

This behavior explains why users experience seemingly random failures: the outcome depends on the specific data in each record. For some records, the combined input and output fall within the token limit, while for others they exceed it. This variation is intrinsic to how dynamic templates interact with large language models.
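The effect can be sketched in a few lines of Python. The template, the token limit, and the 4-characters-per-token heuristic below are illustrative assumptions, not Salesforce APIs; real tokenizers and platform limits differ. The point is simply that the same template yields very different token counts depending on the record it is grounded with:

```python
# Hypothetical token limit for the combined prompt + completion.
TOKEN_LIMIT = 50

def estimate_tokens(text: str) -> int:
    """Rough heuristic: ~4 characters per token (real tokenizers vary)."""
    return max(1, len(text) // 4)

def render_prompt(record: dict) -> str:
    """Fill an illustrative field-generation template with record data."""
    return (
        f"Summarize the case titled '{record['subject']}'. "
        f"Description: {record['description']}"
    )

records = [
    {"subject": "Login issue", "description": "User cannot log in."},
    {"subject": "Data migration", "description": "Legacy CRM export " * 20},
]

for rec in records:
    tokens = estimate_tokens(render_prompt(rec))
    status = "OK" if tokens <= TOKEN_LIMIT else "token limit error"
    print(f"{rec['subject']}: ~{tokens} tokens -> {status}")
```

The short record stays well under the limit, while the record with a long description blows past it, which is exactly why the same template works for many records but fails "randomly" for others.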

Salesforce's documentation advises that prompt template design should take token limits into account and recommends testing with varied records to avoid such intermittent errors. It does not present switching to the Flex template type as a remedy for token limit errors, nor does it suggest that token limits fluctuate with user demand. Token limits are a constant defined by the model itself, independent of external user load.

Salesforce Developer Documentation on Token Limits for Generative AI Models

Salesforce AI Best Practices on Prompt Design (Trailhead or Salesforce blog resources)

asked 26/09/2024
Sankalp Wadiwa