Class LlmModelSettings.Parameters.Builder (0.89.0)

public static final class LlmModelSettings.Parameters.Builder extends GeneratedMessageV3.Builder<LlmModelSettings.Parameters.Builder> implements LlmModelSettings.ParametersOrBuilder

Generative model parameters to control the model behavior.

Protobuf type google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters
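As a sketch of typical usage (assuming the google-cloud-dialogflow-cx dependency is on the classpath), a Parameters message is assembled through this builder and frozen with build():

```java
import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings;

public class ParametersBuilderExample {
    public static void main(String[] args) {
        // Assemble the generative model parameters via the builder.
        LlmModelSettings.Parameters parameters =
            LlmModelSettings.Parameters.newBuilder()
                .setTemperature(0.2f) // valid range: [0.0, 1.0]
                .build();

        // The optional field now reports presence and its value.
        System.out.println(parameters.hasTemperature()); // true
        System.out.println(parameters.getTemperature()); // 0.2
    }
}
```

The resulting Parameters message would then be attached to an LlmModelSettings message; that wiring is outside the scope of this class.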

Static Methods

getDescriptor()

public static final Descriptors.Descriptor getDescriptor()
Returns
Type Description
Descriptor

Methods

addRepeatedField(Descriptors.FieldDescriptor field, Object value)

public LlmModelSettings.Parameters.Builder addRepeatedField(Descriptors.FieldDescriptor field, Object value)
Parameters
Name Description
field FieldDescriptor
value Object
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

build()

public LlmModelSettings.Parameters build()
Returns
Type Description
LlmModelSettings.Parameters

buildPartial()

public LlmModelSettings.Parameters buildPartial()
Returns
Type Description
LlmModelSettings.Parameters

clear()

public LlmModelSettings.Parameters.Builder clear()
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

clearField(Descriptors.FieldDescriptor field)

public LlmModelSettings.Parameters.Builder clearField(Descriptors.FieldDescriptor field)
Parameter
Name Description
field FieldDescriptor
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

clearInputTokenLimit()

public LlmModelSettings.Parameters.Builder clearInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.

clearOneof(Descriptors.OneofDescriptor oneof)

public LlmModelSettings.Parameters.Builder clearOneof(Descriptors.OneofDescriptor oneof)
Parameter
Name Description
oneof OneofDescriptor
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

clearOutputTokenLimit()

public LlmModelSettings.Parameters.Builder clearOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.

clearTemperature()

public LlmModelSettings.Parameters.Builder clearTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more random output. Valid range: [0.0, 1.0]

optional float temperature = 1;

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.
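For illustration, clearing the field also resets its presence bit, so hasTemperature() reports false again and the getter falls back to the proto3 default (a sketch assuming the library is on the classpath):

```java
import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings;

public class ClearTemperatureExample {
    public static void main(String[] args) {
        LlmModelSettings.Parameters.Builder builder =
            LlmModelSettings.Parameters.newBuilder().setTemperature(0.7f);
        System.out.println(builder.hasTemperature()); // true

        // clearTemperature() removes the value and clears the presence bit.
        builder.clearTemperature();
        System.out.println(builder.hasTemperature()); // false
        System.out.println(builder.getTemperature()); // 0.0 (proto3 default)
    }
}
```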

clone()

public LlmModelSettings.Parameters.Builder clone()
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

getDefaultInstanceForType()

public LlmModelSettings.Parameters getDefaultInstanceForType()
Returns
Type Description
LlmModelSettings.Parameters

getDescriptorForType()

public Descriptors.Descriptor getDescriptorForType()
Returns
Type Description
Descriptor
Overrides

getInputTokenLimit()

public LlmModelSettings.Parameters.InputTokenLimit getInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type Description
LlmModelSettings.Parameters.InputTokenLimit

The inputTokenLimit.

getInputTokenLimitValue()

public int getInputTokenLimitValue()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type Description
int

The enum numeric value on the wire for inputTokenLimit.

getOutputTokenLimit()

public LlmModelSettings.Parameters.OutputTokenLimit getOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type Description
LlmModelSettings.Parameters.OutputTokenLimit

The outputTokenLimit.

getOutputTokenLimitValue()

public int getOutputTokenLimitValue()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type Description
int

The enum numeric value on the wire for outputTokenLimit.

getTemperature()

public float getTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more random output. Valid range: [0.0, 1.0]

optional float temperature = 1;

Returns
Type Description
float

The temperature.

hasInputTokenLimit()

public boolean hasInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type Description
boolean

Whether the inputTokenLimit field is set.

hasOutputTokenLimit()

public boolean hasOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type Description
boolean

Whether the outputTokenLimit field is set.

hasTemperature()

public boolean hasTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more random output. Valid range: [0.0, 1.0]

optional float temperature = 1;

Returns
Type Description
boolean

Whether the temperature field is set.

internalGetFieldAccessorTable()

protected GeneratedMessageV3.FieldAccessorTable internalGetFieldAccessorTable()
Returns
Type Description
FieldAccessorTable
Overrides

isInitialized()

public final boolean isInitialized()
Returns
Type Description
boolean
Overrides

mergeFrom(LlmModelSettings.Parameters other)

public LlmModelSettings.Parameters.Builder mergeFrom(LlmModelSettings.Parameters other)
Parameter
Name Description
other LlmModelSettings.Parameters
Returns
Type Description
LlmModelSettings.Parameters.Builder
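mergeFrom follows standard protobuf merge semantics: fields set on the other message overwrite the builder's values, while optional fields left unset on the other message are not touched. A sketch under that assumption:

```java
import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings;

public class MergeFromExample {
    public static void main(String[] args) {
        LlmModelSettings.Parameters base =
            LlmModelSettings.Parameters.newBuilder().setTemperature(0.3f).build();

        // `other` sets temperature, so merging overwrites the base value.
        LlmModelSettings.Parameters other =
            LlmModelSettings.Parameters.newBuilder().setTemperature(0.9f).build();
        LlmModelSettings.Parameters merged =
            base.toBuilder().mergeFrom(other).build();
        System.out.println(merged.getTemperature()); // 0.9

        // A message that leaves the optional field unset does not clear it.
        LlmModelSettings.Parameters empty =
            LlmModelSettings.Parameters.getDefaultInstance();
        merged = base.toBuilder().mergeFrom(empty).build();
        System.out.println(merged.getTemperature()); // 0.3
    }
}
```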

mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)

public LlmModelSettings.Parameters.Builder mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
Parameters
Name Description
input CodedInputStream
extensionRegistry ExtensionRegistryLite
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides
Exceptions
Type Description
IOException

mergeFrom(Message other)

public LlmModelSettings.Parameters.Builder mergeFrom(Message other)
Parameter
Name Description
other Message
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

mergeUnknownFields(UnknownFieldSet unknownFields)

public final LlmModelSettings.Parameters.Builder mergeUnknownFields(UnknownFieldSet unknownFields)
Parameter
Name Description
unknownFields UnknownFieldSet
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

setField(Descriptors.FieldDescriptor field, Object value)

public LlmModelSettings.Parameters.Builder setField(Descriptors.FieldDescriptor field, Object value)
Parameters
Name Description
field FieldDescriptor
value Object
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

setInputTokenLimit(LlmModelSettings.Parameters.InputTokenLimit value)

public LlmModelSettings.Parameters.Builder setInputTokenLimit(LlmModelSettings.Parameters.InputTokenLimit value)

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Parameter
Name Description
value LlmModelSettings.Parameters.InputTokenLimit

The inputTokenLimit to set.

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.

setInputTokenLimitValue(int value)

public LlmModelSettings.Parameters.Builder setInputTokenLimitValue(int value)

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Parameter
Name Description
value int

The enum numeric value on the wire for inputTokenLimit to set.

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.
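setInputTokenLimitValue accepts the raw wire integer instead of the enum constant; in proto3, 0 always maps to the enum's unspecified value. A minimal sketch (no enum constant names beyond the zero value are assumed here):

```java
import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings;

public class InputTokenLimitValueExample {
    public static void main(String[] args) {
        LlmModelSettings.Parameters.Builder builder =
            LlmModelSettings.Parameters.newBuilder()
                .setInputTokenLimitValue(0); // 0 = *_UNSPECIFIED in proto3

        // The raw integer round-trips through the builder...
        System.out.println(builder.getInputTokenLimitValue()); // 0

        // ...and getInputTokenLimit() resolves it to the enum constant.
        System.out.println(builder.getInputTokenLimit().getNumber()); // 0
    }
}
```

The Value variants are mainly useful when the numeric value comes from outside the program (for example, a config file) or is not yet known to the generated enum.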

setOutputTokenLimit(LlmModelSettings.Parameters.OutputTokenLimit value)

public LlmModelSettings.Parameters.Builder setOutputTokenLimit(LlmModelSettings.Parameters.OutputTokenLimit value)

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Parameter
Name Description
value LlmModelSettings.Parameters.OutputTokenLimit

The outputTokenLimit to set.

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.

setOutputTokenLimitValue(int value)

public LlmModelSettings.Parameters.Builder setOutputTokenLimitValue(int value)

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Parameter
Name Description
value int

The enum numeric value on the wire for outputTokenLimit to set.

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.

setRepeatedField(Descriptors.FieldDescriptor field, int index, Object value)

public LlmModelSettings.Parameters.Builder setRepeatedField(Descriptors.FieldDescriptor field, int index, Object value)
Parameters
Name Description
field FieldDescriptor
index int
value Object
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides

setTemperature(float value)

public LlmModelSettings.Parameters.Builder setTemperature(float value)

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more random output. Valid range: [0.0, 1.0]

optional float temperature = 1;

Parameter
Name Description
value float

The temperature to set.

Returns
Type Description
LlmModelSettings.Parameters.Builder

This builder for chaining.

setUnknownFields(UnknownFieldSet unknownFields)

public final LlmModelSettings.Parameters.Builder setUnknownFields(UnknownFieldSet unknownFields)
Parameter
Name Description
unknownFields UnknownFieldSet
Returns
Type Description
LlmModelSettings.Parameters.Builder
Overrides