Enum LlmModelSettings.Parameters.InputTokenLimit (0.89.0)

public enum LlmModelSettings.Parameters.InputTokenLimit extends Enum<LlmModelSettings.Parameters.InputTokenLimit> implements ProtocolMessageEnum

The input token limit for a single LLM call. For the limit of each model, see https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models.

Protobuf enum google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit

Implements

ProtocolMessageEnum
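
Because the enum implements ProtocolMessageEnum, each constant carries the numeric value declared in the proto definition. A minimal sketch of reading a constant, assuming the generated Java package com.google.cloud.dialogflow.cx.v3beta1:

import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings;

public class InputTokenLimitExample {
  public static void main(String[] args) {
    // Pick one of the generated constants; its proto number is available via getNumber().
    LlmModelSettings.Parameters.InputTokenLimit limit =
        LlmModelSettings.Parameters.InputTokenLimit.INPUT_TOKEN_LIMIT_MEDIUM;

    System.out.println(limit.name());      // INPUT_TOKEN_LIMIT_MEDIUM
    System.out.println(limit.getNumber()); // 2
  }
}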

Static Fields

Name | Description
INPUT_TOKEN_LIMIT_LONG | Input token limit up to 100k. INPUT_TOKEN_LIMIT_LONG = 3;
INPUT_TOKEN_LIMIT_LONG_VALUE | Input token limit up to 100k. INPUT_TOKEN_LIMIT_LONG = 3;
INPUT_TOKEN_LIMIT_MEDIUM | Input token limit up to 32k. INPUT_TOKEN_LIMIT_MEDIUM = 2;
INPUT_TOKEN_LIMIT_MEDIUM_VALUE | Input token limit up to 32k. INPUT_TOKEN_LIMIT_MEDIUM = 2;
INPUT_TOKEN_LIMIT_SHORT | Input token limit up to 8k. INPUT_TOKEN_LIMIT_SHORT = 1;
INPUT_TOKEN_LIMIT_SHORT_VALUE | Input token limit up to 8k. INPUT_TOKEN_LIMIT_SHORT = 1;
INPUT_TOKEN_LIMIT_UNSPECIFIED | Limit not specified. Treated as 'INPUT_TOKEN_LIMIT_SHORT'. INPUT_TOKEN_LIMIT_UNSPECIFIED = 0;
INPUT_TOKEN_LIMIT_UNSPECIFIED_VALUE | Limit not specified. Treated as 'INPUT_TOKEN_LIMIT_SHORT'. INPUT_TOKEN_LIMIT_UNSPECIFIED = 0;
UNRECOGNIZED |
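
The *_VALUE fields are plain int constants holding the same numbers as the corresponding enum constants. A short sketch, again assuming the generated Java package com.google.cloud.dialogflow.cx.v3beta1, showing how a numeric value round-trips back to a constant:

import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit;

public class InputTokenLimitValues {
  public static void main(String[] args) {
    // INPUT_TOKEN_LIMIT_SHORT_VALUE is the int 1, matching INPUT_TOKEN_LIMIT_SHORT = 1;
    int number = InputTokenLimit.INPUT_TOKEN_LIMIT_SHORT_VALUE;

    // forNumber(int) maps the number back to the enum constant.
    InputTokenLimit limit = InputTokenLimit.forNumber(number);
    System.out.println(limit); // INPUT_TOKEN_LIMIT_SHORT
  }
}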

Static Methods

Name | Description
forNumber(int value) |
getDescriptor() |
internalGetValueMap() |
valueOf(Descriptors.EnumValueDescriptor desc) |
valueOf(int value) | Deprecated. Use #forNumber(int) instead.
valueOf(String name) |
values() |
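
A brief sketch of the lookup helpers, assuming the generated Java package com.google.cloud.dialogflow.cx.v3beta1; in generated protobuf enums, forNumber(int) returns null for numbers with no matching constant, while valueOf(String) throws IllegalArgumentException for unknown names:

import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit;

public class InputTokenLimitLookup {
  public static void main(String[] args) {
    // Look up by proto number (preferred over the deprecated valueOf(int)).
    InputTokenLimit byNumber = InputTokenLimit.forNumber(3);
    System.out.println(byNumber); // INPUT_TOKEN_LIMIT_LONG

    // Look up by constant name.
    InputTokenLimit byName = InputTokenLimit.valueOf("INPUT_TOKEN_LIMIT_MEDIUM");
    System.out.println(byName.getNumber()); // 2

    // Enumerate all constants, including UNRECOGNIZED.
    for (InputTokenLimit value : InputTokenLimit.values()) {
      System.out.println(value.name());
    }
  }
}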

Methods

Name | Description
getDescriptorForType() |
getNumber() |
getValueDescriptor() |
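
A short sketch of the instance methods, assuming the same generated package; getValueDescriptor() and getDescriptorForType() expose the proto descriptor metadata for the value and for the enum type (as with getNumber(), these calls generally cannot be used on UNRECOGNIZED):

import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit;

public class InputTokenLimitDescriptors {
  public static void main(String[] args) {
    InputTokenLimit limit = InputTokenLimit.INPUT_TOKEN_LIMIT_LONG;

    // Numeric value declared in the proto.
    System.out.println(limit.getNumber()); // 3

    // Descriptor of this particular value.
    System.out.println(limit.getValueDescriptor().getName()); // INPUT_TOKEN_LIMIT_LONG

    // Descriptor of the enum type itself.
    System.out.println(limit.getDescriptorForType().getFullName());
    // google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit
  }
}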