Class GenericOaiQueryConfig
java.lang.Object
de.xima.fc.prompt.ms.impl.service.generic_oai.service.model.GenericOaiQueryConfig
- All Implemented Interfaces:
IPromptQueryConfig_CumulativeProbabilityThreshold, IPromptQueryConfig_EffortLevel, IPromptQueryConfig_Files, IPromptQueryConfig_FrequencyPenalty, IPromptQueryConfig_GeneratedTokenLimit, IPromptQueryConfig_JsonSchemaOutput, IPromptQueryConfig_PresencePenalty, IPromptQueryConfig_SystemPrompt, IPromptQueryConfig_Temperature, IPromptQueryConfig_TextParameters, IPromptQueryConfig_UserPrompt, Serializable
public class GenericOaiQueryConfig
extends Object
implements Serializable, IPromptQueryConfig_CumulativeProbabilityThreshold, IPromptQueryConfig_EffortLevel, IPromptQueryConfig_GeneratedTokenLimit, IPromptQueryConfig_Files, IPromptQueryConfig_FrequencyPenalty, IPromptQueryConfig_PresencePenalty, IPromptQueryConfig_TextParameters, IPromptQueryConfig_JsonSchemaOutput, IPromptQueryConfig_SystemPrompt, IPromptQueryConfig_Temperature, IPromptQueryConfig_UserPrompt
Model class for the configuration of a generic OAI prompt query.
- Since:
- 8.5.0
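To make the shape of such a configuration concrete, the sketch below collects typical values for the bean properties this class exposes. It deliberately uses a plain map rather than the class itself, and every value is an illustrative assumption, not a default taken from the library:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class QueryConfigSketch {
    // Hypothetical values for the properties GenericOaiQueryConfig exposes
    // via its getters and setters; none of these are library defaults.
    static Map<String, Object> typicalConfig() {
        Map<String, Object> config = new LinkedHashMap<>();
        config.put("systemPrompt", "You are a helpful assistant for form data.");
        config.put("userPrompt", "Summarize the submitted form.");
        config.put("temperature", 0.2);                    // low = more deterministic
        config.put("cumulativeProbabilityThreshold", 0.9); // top-p sampling cutoff
        config.put("frequencyPenalty", 0.0);
        config.put("presencePenalty", 0.0);
        config.put("generatedTokenLimit", 512L);
        config.put("effortLevel", "medium");
        return config;
    }

    public static void main(String[] args) {
        System.out.println(typicalConfig().keySet());
    }
}
```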
Field Summary
Fields inherited from interface IPromptQueryConfig_CumulativeProbabilityThreshold
    ATTR_CUMULATIVE_PROBABILITY_THRESHOLD
Fields inherited from interface IPromptQueryConfig_EffortLevel
    ATTR_EFFORT_LEVEL
Fields inherited from interface IPromptQueryConfig_Files
    ATTR_FILES
Fields inherited from interface IPromptQueryConfig_FrequencyPenalty
    ATTR_FREQUENCY_PENALTY
Fields inherited from interface IPromptQueryConfig_GeneratedTokenLimit
    ATTR_GENERATED_TOKEN_LIMIT
Fields inherited from interface IPromptQueryConfig_JsonSchemaOutput
    ATTR_JSON_SCHEMA_OUTPUT
Fields inherited from interface IPromptQueryConfig_PresencePenalty
    ATTR_PRESENCE_PENALTY
Fields inherited from interface IPromptQueryConfig_SystemPrompt
    ATTR_SYSTEM_PROMPT
Fields inherited from interface IPromptQueryConfig_Temperature
    ATTR_TEMPERATURE
Fields inherited from interface IPromptQueryConfig_TextParameters
    ATTR_TEXT_PARAMETERS
Fields inherited from interface IPromptQueryConfig_UserPrompt
    ATTR_USER_PROMPT
Constructor Summary
Constructors
    GenericOaiQueryConfig()
Method Summary
getCumulativeProbabilityThreshold()
    Gets the cumulative probability threshold for the prompt query.
getEffortLevel()
    Gets the effort level the model should put into answering the prompt, e.g. "low", "medium", or "high".
getFiles()
    Gets the configuration for the files needed by the prompt query.
getFrequencyPenalty()
    Gets the frequency penalty for the prompt query.
getGeneratedTokenLimit()
    Gets the maximum number of tokens the model should generate in its response to the prompt.
getJsonSchemaOutput()
    Gets the JSON schema output for the prompt query.
getPresencePenalty()
    Gets the presence penalty for the prompt query.
getSystemPrompt()
    Gets the system prompt for the prompt query.
getTask()
    Gets the task to be performed by the prompt service.
getTemperature()
    Gets the temperature for the prompt query.
getTextParameters()
    Gets the list of text parameters accepted by the prompt query.
getUserPrompt()
    Gets the user prompt for the prompt query.
void setCumulativeProbabilityThreshold(Double cumulativeProbabilityThreshold)
    Sets the cumulative probability threshold for the prompt query.
void setEffortLevel(String effortLevel)
    Sets the effort level the model should put into answering the prompt, e.g. "low", "medium", or "high".
void setFiles(PromptFiles files)
    Sets the configuration for the files needed by the prompt query.
void setFrequencyPenalty(Double frequencyPenalty)
    Sets the frequency penalty for the prompt query.
void setGeneratedTokenLimit(Long generatedTokenLimit)
    Sets the maximum number of tokens the model should generate in its response to the prompt.
void setJsonSchemaOutput(JsonSchemaOutput jsonSchemaOutput)
    Sets the JSON schema output for the prompt query.
void setPresencePenalty(Double presencePenalty)
    Sets the presence penalty for the prompt query.
void setSystemPrompt(SystemPrompt systemPrompt)
    Sets the system prompt for the prompt query.
void setTask(…)
    Sets the task to be performed by the prompt service.
void setTemperature(Double temperature)
    Sets the temperature for the prompt query.
void setTextParameters(List<PromptTextParameter> textParameters)
    Sets the list of text parameters accepted by the prompt query.
void setUserPrompt(String userPrompt)
    Sets the user prompt for the prompt query.
-
Constructor Details
-
GenericOaiQueryConfig
public GenericOaiQueryConfig()
-
-
Method Details
-
getCumulativeProbabilityThreshold
Description copied from interface: IPromptQueryConfig_CumulativeProbabilityThreshold
Gets the cumulative probability threshold for the prompt query. This is a value that controls the diversity of the output by limiting the selection of tokens to a subset whose cumulative probability exceeds the top-p value.
Specified by:
    getCumulativeProbabilityThreshold in interface IPromptQueryConfig_CumulativeProbabilityThreshold
Returns:
    The top probability.
-
setCumulativeProbabilityThreshold
Description copied from interface: IPromptQueryConfig_CumulativeProbabilityThreshold
Sets the cumulative probability threshold for the prompt query. This is a value that controls the diversity of the output by limiting the selection of tokens to a subset whose cumulative probability exceeds the top-p value.
Specified by:
    setCumulativeProbabilityThreshold in interface IPromptQueryConfig_CumulativeProbabilityThreshold
Parameters:
    cumulativeProbabilityThreshold - The top probability to set.
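As a rough illustration of what the threshold does (a sketch of top-p sampling in general, not of this library's internals), the snippet below counts how many of the highest-probability tokens survive a given cutoff:

```java
import java.util.Arrays;

public class TopPDemo {
    // Counts how many tokens, taken in descending probability order,
    // are needed before their cumulative probability reaches the top-p cutoff.
    static int tokensKept(double[] probs, double topP) {
        double[] sorted = probs.clone();
        Arrays.sort(sorted); // ascending
        double cumulative = 0.0;
        int kept = 0;
        for (int i = sorted.length - 1; i >= 0; i--) { // walk descending
            cumulative += sorted[i];
            kept++;
            if (cumulative >= topP) {
                break;
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        // Probabilities chosen to be exact in binary floating point.
        double[] probs = {0.5, 0.25, 0.125, 0.0625, 0.0625};
        System.out.println(tokensKept(probs, 0.8)); // 0.5 + 0.25 + 0.125 = 0.875 >= 0.8 -> 3
    }
}
```

A lower threshold restricts sampling to fewer, more likely tokens; a threshold of 1.0 leaves the full distribution available.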
-
getEffortLevel
Description copied from interface: IPromptQueryConfig_EffortLevel
Gets the effort level the model should put into answering the prompt, e.g. "low", "medium", or "high".
Specified by:
    getEffortLevel in interface IPromptQueryConfig_EffortLevel
Returns:
    The effort level.
-
setEffortLevel
Description copied from interface: IPromptQueryConfig_EffortLevel
Sets the effort level the model should put into answering the prompt, e.g. "low", "medium", or "high".
Specified by:
    setEffortLevel in interface IPromptQueryConfig_EffortLevel
Parameters:
    effortLevel - The effort level.
-
getFiles
Description copied from interface: IPromptQueryConfig_Files
Gets the configuration for the files needed by the prompt query.
Specified by:
    getFiles in interface IPromptQueryConfig_Files
Returns:
    The configuration for the files.
-
setFiles
Description copied from interface: IPromptQueryConfig_Files
Sets the configuration for the files needed by the prompt query.
Specified by:
    setFiles in interface IPromptQueryConfig_Files
Parameters:
    files - The configuration for the files.
-
getFrequencyPenalty
Description copied from interface: IPromptQueryConfig_FrequencyPenalty
Gets the frequency penalty for the prompt query. The frequency penalty is a value that penalizes new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
Specified by:
    getFrequencyPenalty in interface IPromptQueryConfig_FrequencyPenalty
Returns:
    The frequency penalty.
-
setFrequencyPenalty
Description copied from interface: IPromptQueryConfig_FrequencyPenalty
Sets the frequency penalty for the prompt query. The frequency penalty is a value that penalizes new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
Specified by:
    setFrequencyPenalty in interface IPromptQueryConfig_FrequencyPenalty
Parameters:
    frequencyPenalty - The frequency penalty to set.
-
getGeneratedTokenLimit
Description copied from interface: IPromptQueryConfig_GeneratedTokenLimit
Gets the maximum number of tokens the model should generate in its response to the prompt.
Specified by:
    getGeneratedTokenLimit in interface IPromptQueryConfig_GeneratedTokenLimit
Returns:
    The maximum number of generated tokens.
-
setGeneratedTokenLimit
Description copied from interface: IPromptQueryConfig_GeneratedTokenLimit
Sets the maximum number of tokens the model should generate in its response to the prompt.
Specified by:
    setGeneratedTokenLimit in interface IPromptQueryConfig_GeneratedTokenLimit
Parameters:
    generatedTokenLimit - The maximum number of generated tokens to set.
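The effect is a hard cap on output length: generation stops once the limit is reached, even mid-answer. A minimal sketch of that truncation behavior (not this library's implementation):

```java
import java.util.List;

public class TokenLimitDemo {
    // Truncates a hypothetical token stream at the configured limit.
    static List<String> truncate(List<String> tokens, long generatedTokenLimit) {
        int cut = (int) Math.min(generatedTokenLimit, tokens.size());
        return tokens.subList(0, cut);
    }

    public static void main(String[] args) {
        List<String> tokens = List.of("The", "form", "was", "submitted", "successfully");
        System.out.println(truncate(tokens, 3)); // [The, form, was]
    }
}
```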
-
getJsonSchemaOutput
Description copied from interface: IPromptQueryConfig_JsonSchemaOutput
Gets the JSON schema output for the prompt query. The prompt provider will ensure that the response conforms to the configured JSON schema.
Specified by:
    getJsonSchemaOutput in interface IPromptQueryConfig_JsonSchemaOutput
Returns:
    The JSON schema output.
-
setJsonSchemaOutput
Description copied from interface: IPromptQueryConfig_JsonSchemaOutput
Sets the JSON schema output for the prompt query. The prompt provider will ensure that the response conforms to the configured JSON schema.
Specified by:
    setJsonSchemaOutput in interface IPromptQueryConfig_JsonSchemaOutput
Parameters:
    jsonSchemaOutput - The JSON schema output to set.
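The schema itself follows the standard JSON Schema vocabulary. As a purely illustrative example (the field names here are hypothetical, not part of this API), a query could force the model to answer with an object containing a summary and a sentiment:

```java
public class SchemaExample {
    // A minimal JSON Schema (hypothetical fields) that a structured-output
    // response would have to conform to.
    static final String SCHEMA = """
            {
              "type": "object",
              "properties": {
                "summary":   { "type": "string" },
                "sentiment": { "type": "string", "enum": ["positive", "neutral", "negative"] }
              },
              "required": ["summary", "sentiment"]
            }
            """;

    public static void main(String[] args) {
        System.out.println(SCHEMA);
    }
}
```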
-
getPresencePenalty
Description copied from interface: IPromptQueryConfig_PresencePenalty
Gets the presence penalty for the prompt query. The presence penalty is a value that penalizes new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
Specified by:
    getPresencePenalty in interface IPromptQueryConfig_PresencePenalty
Returns:
    The presence penalty.
-
setPresencePenalty
Description copied from interface: IPromptQueryConfig_PresencePenalty
Sets the presence penalty for the prompt query. The presence penalty is a value that penalizes new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
Specified by:
    setPresencePenalty in interface IPromptQueryConfig_PresencePenalty
Parameters:
    presencePenalty - The presence penalty to set.
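How the frequency and presence penalties enter the sampling step is provider-specific; a common scheme (the one documented for OpenAI-style APIs, assumed here purely for illustration) subtracts a per-occurrence frequency term and a one-time presence term from each token's logit:

```java
public class PenaltyDemo {
    // Adjusts a token's logit given how often it has already appeared.
    // frequencyPenalty scales with the count; presencePenalty applies once
    // as soon as the token has appeared at all.
    static double penalizedLogit(double logit, int occurrences,
                                 double frequencyPenalty, double presencePenalty) {
        return logit
                - occurrences * frequencyPenalty
                - (occurrences > 0 ? presencePenalty : 0.0);
    }

    public static void main(String[] args) {
        // A token already seen 3 times is pushed down; an unseen token is untouched.
        System.out.println(penalizedLogit(2.0, 3, 0.5, 1.0)); // 2.0 - 1.5 - 1.0 = -0.5
        System.out.println(penalizedLogit(2.0, 0, 0.5, 1.0)); // 2.0
    }
}
```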
-
getSystemPrompt
Description copied from interface: IPromptQueryConfig_SystemPrompt
Gets the system prompt for the prompt query. The system prompt is a message that provides context or instructions to the query service, and it is given more weight than a user prompt.
Specified by:
    getSystemPrompt in interface IPromptQueryConfig_SystemPrompt
Returns:
    The system prompt.
-
setSystemPrompt
Description copied from interface: IPromptQueryConfig_SystemPrompt
Sets the system prompt for the prompt query. The system prompt is a message that provides context or instructions to the query service, and it is given more weight than a user prompt.
Specified by:
    setSystemPrompt in interface IPromptQueryConfig_SystemPrompt
Parameters:
    systemPrompt - The system prompt to set.
-
getTask
Gets the task to be performed by the prompt service. Currently supported tasks are:
Returns:
    The task.
-
setTask
Sets the task to be performed by the prompt service. Currently supported tasks are:
Parameters:
    task - The task.
-
getTemperature
Description copied from interface: IPromptQueryConfig_Temperature
Gets the temperature for the prompt query. The temperature is a value that controls the randomness of the output. Usually higher values will make the output more random, while lower values will make it more focused and deterministic.
Specified by:
    getTemperature in interface IPromptQueryConfig_Temperature
Returns:
    The temperature.
-
setTemperature
Description copied from interface: IPromptQueryConfig_Temperature
Sets the temperature for the prompt query. The temperature is a value that controls the randomness of the output. Usually higher values will make the output more random, while lower values will make it more focused and deterministic.
Specified by:
    setTemperature in interface IPromptQueryConfig_Temperature
Parameters:
    temperature - The temperature to set.
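The mechanism behind this is temperature-scaled softmax: logits are divided by the temperature before being normalized, so low temperatures sharpen the distribution toward the most likely token and high temperatures flatten it. A self-contained sketch (generic sampling math, not this library's code):

```java
public class TemperatureDemo {
    // Softmax over logits scaled by temperature. Subtracting the max logit
    // first keeps the exponentials numerically stable.
    static double[] softmax(double[] logits, double temperature) {
        double max = Double.NEGATIVE_INFINITY;
        for (double l : logits) {
            max = Math.max(max, l);
        }
        double[] out = new double[logits.length];
        double sum = 0.0;
        for (int i = 0; i < logits.length; i++) {
            out[i] = Math.exp((logits[i] - max) / temperature);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) {
            out[i] /= sum;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.0};
        // Low temperature concentrates mass on the top token...
        System.out.printf("T=0.5: %.3f%n", softmax(logits, 0.5)[0]);
        // ...while a high temperature spreads it out.
        System.out.printf("T=2.0: %.3f%n", softmax(logits, 2.0)[0]);
    }
}
```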
-
getTextParameters
Description copied from interface: IPromptQueryConfig_TextParameters
Gets the list of text parameters accepted by the prompt query.
Specified by:
    getTextParameters in interface IPromptQueryConfig_TextParameters
Returns:
    The list of text parameters.
-
setTextParameters
Description copied from interface: IPromptQueryConfig_TextParameters
Sets the list of text parameters accepted by the prompt query.
Specified by:
    setTextParameters in interface IPromptQueryConfig_TextParameters
Parameters:
    textParameters - The list of text parameters to set.
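Text parameters typically act as named placeholders that are substituted into the prompt before the query is sent. The placeholder syntax below (`{name}`) is an assumption for illustration only; the library's actual substitution rules are not shown on this page:

```java
import java.util.Map;

public class TextParameterDemo {
    // Replaces {name}-style placeholders with the supplied parameter values.
    // The brace syntax is a hypothetical choice for this sketch.
    static String apply(String template, Map<String, String> parameters) {
        String result = template;
        for (Map.Entry<String, String> e : parameters.entrySet()) {
            result = result.replace("{" + e.getKey() + "}", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        String prompt = apply("Summarize the form {formName} for user {user}.",
                Map.of("formName", "Contact", "user", "Ada"));
        System.out.println(prompt); // Summarize the form Contact for user Ada.
    }
}
```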
-
getUserPrompt
Description copied from interface: IPromptQueryConfig_UserPrompt
Gets the user prompt for the prompt query. The user prompt is the message to which the query service should respond.
Specified by:
    getUserPrompt in interface IPromptQueryConfig_UserPrompt
Returns:
    The user prompt.
-
setUserPrompt
Description copied from interface: IPromptQueryConfig_UserPrompt
Sets the user prompt for the prompt query. The user prompt is the message to which the query service should respond.
Specified by:
    setUserPrompt in interface IPromptQueryConfig_UserPrompt
Parameters:
    userPrompt - The user prompt to set.
-