Package types (0.2.0)

API documentation for the dialogflowcx_v3beta1.types package.

Classes

Agent

Agents are best described as Natural Language Understanding (NLU) modules that transform user requests into actionable data. You can include agents in your app, product, or service to determine user intent and respond to the user in a natural way.

After you create an agent, you can add Intents, [Entity Types][google.cloud.dialogflow.cx.v3beta1.EntityType], Flows, Fulfillments, Webhooks, and so on to manage the conversation flows.
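
As a minimal sketch, an agent could be created with the Python client like this; the project, location, and field values below are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    client = df.AgentsClient()

    agent = df.Agent(
        display_name="travel-agent",          # placeholder values
        default_language_code="en",
        time_zone="America/Los_Angeles",
    )

    # The parent is a location resource name; substitute your own project and location.
    response = client.create_agent(
        parent="projects/my-project/locations/global",
        agent=agent,
    )
    print(response.name)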

AudioInput

Represents the natural speech audio to be processed.

CreateAgentRequest

The request message for Agents.CreateAgent.

CreateEntityTypeRequest

The request message for EntityTypes.CreateEntityType.

CreateEnvironmentRequest

The request message for Environments.CreateEnvironment.

CreateFlowRequest

The request message for Flows.CreateFlow.

CreateIntentRequest

The request message for Intents.CreateIntent.

CreatePageRequest

The request message for Pages.CreatePage.

CreateSessionEntityTypeRequest

The request message for SessionEntityTypes.CreateSessionEntityType.

CreateTransitionRouteGroupRequest

The request message for TransitionRouteGroups.CreateTransitionRouteGroup.

CreateVersionOperationMetadata

Metadata associated with the long running operation for Versions.CreateVersion.

CreateVersionRequest

The request message for Versions.CreateVersion.

CreateWebhookRequest

The request message for Webhooks.CreateWebhook.

DeleteAgentRequest

The request message for Agents.DeleteAgent.

DeleteEntityTypeRequest

The request message for EntityTypes.DeleteEntityType.

DeleteEnvironmentRequest

The request message for Environments.DeleteEnvironment.

DeleteFlowRequest

The request message for Flows.DeleteFlow.

DeleteIntentRequest

The request message for Intents.DeleteIntent.

DeletePageRequest

The request message for Pages.DeletePage.

DeleteSessionEntityTypeRequest

The request message for SessionEntityTypes.DeleteSessionEntityType.

DeleteTransitionRouteGroupRequest

The request message for TransitionRouteGroups.DeleteTransitionRouteGroup.

DeleteVersionRequest

The request message for Versions.DeleteVersion.

DeleteWebhookRequest

The request message for Webhooks.DeleteWebhook.

DetectIntentRequest

The request to detect the user's intent.

DetectIntentResponse

The message returned from the DetectIntent method.
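
A minimal sketch of a detect-intent round trip with the Sessions client; the session path and query text are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    client = df.SessionsClient()

    # Sessions are identified by a caller-chosen ID within the agent.
    session = (
        "projects/my-project/locations/global/agents/my-agent-id/sessions/my-session-id"
    )

    request = df.DetectIntentRequest(
        session=session,
        query_input=df.QueryInput(
            text=df.TextInput(text="I want to book a flight"),
            language_code="en",
        ),
    )

    response = client.detect_intent(request=request)
    for message in response.query_result.response_messages:
        print(message)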

DtmfInput

Represents the input for a DTMF event.

EntityType

Entities are extracted from user input and represent parameters that are meaningful to your application. For example, a date range, a proper name such as a geographic location or landmark, and so on. Entities represent actionable data for your application.

When you define an entity, you can also include synonyms that all map to that entity. For example, "soft drink", "soda", "pop", and so on.

There are three types of entities:

  • System - entities that are defined by the Dialogflow API for common data types such as date, time, currency, and so on. A system entity is represented by the EntityType type.

  • Custom - entities that are defined by you that represent actionable data that is meaningful to your application. For example, you could define a pizza.sauce entity for red or white pizza sauce, a pizza.cheese entity for the different types of cheese on a pizza, a pizza.topping entity for different toppings, and so on. A custom entity is represented by the EntityType type.

  • User - entities that are built for an individual user such as favorites, preferences, playlists, and so on. A user entity is represented by the SessionEntityType type.

For more information about entity types, see the Dialogflow documentation: https://cloud.google.com/dialogflow/docs/entities-overview.
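
As an illustration, a custom entity type with synonyms might be created like this (resource names and values are placeholders):

    from google.cloud import dialogflowcx_v3beta1 as df

    client = df.EntityTypesClient()

    entity_type = df.EntityType(
        display_name="pizza.sauce",
        kind=df.EntityType.Kind.KIND_MAP,
        entities=[
            df.EntityType.Entity(value="red", synonyms=["red", "marinara", "tomato"]),
            df.EntityType.Entity(value="white", synonyms=["white", "alfredo"]),
        ],
    )

    response = client.create_entity_type(
        parent="projects/my-project/locations/global/agents/my-agent-id",
        entity_type=entity_type,
    )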

Environment

Represents an environment for an agent. You can create multiple versions of your agent and publish them to separate environments. When you edit an agent, you are editing the draft agent. At any point, you can save the draft agent as an agent version, which is an immutable snapshot of your agent. When you save the draft agent, it is published to the default environment. When you create agent versions, you can publish them to custom environments. You can create a variety of custom environments for testing, development, production, etc.

EventHandler

An event handler specifies an event that can be handled during a session. When the specified event happens, the following actions are taken in order:

  • If there is a [trigger_fulfillment][google.cloud.dialogflow.cx.v3beta1.EventHandler.trigger_fulfillment] associated with the event, it will be called.
  • If there is a [target_page][google.cloud.dialogflow.cx.v3beta1.EventHandler.target_page] associated with the event, the session will transition into the specified page.
  • If there is a [target_flow][google.cloud.dialogflow.cx.v3beta1.EventHandler.target_flow] associated with the event, the session will transition into the specified flow.
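
A sketch of an event handler that replies to the built-in no-match event and transitions to another page; resource names and the reply text are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    event_handler = df.EventHandler(
        event="sys.no-match-default",
        trigger_fulfillment=df.Fulfillment(
            messages=[
                df.ResponseMessage(
                    text=df.ResponseMessage.Text(text=["Sorry, I didn't catch that."])
                )
            ]
        ),
        # Full page resource name; shown here as a placeholder.
        target_page="projects/my-project/locations/global/agents/my-agent-id"
                    "/flows/my-flow-id/pages/my-page-id",
    )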

EventInput

Represents the event to trigger.

ExportAgentRequest

The request message for Agents.ExportAgent.

ExportAgentResponse

The response message for Agents.ExportAgent.

Flow

Flows represent the conversation flows when you build your chatbot agent. A flow consists of many pages connected by transition routes. Conversations always start with the built-in Start Flow (with an all-0 ID). Transition routes can direct the conversation session from the current flow (parent flow) to another flow (sub flow). When the sub flow is finished, Dialogflow brings the session back to the parent flow, in the page where the sub flow was started.

Usually, when a transition route is followed by a matched intent, the intent will be "consumed". This means the intent won't activate more transition routes. However, when the followed transition route moves the conversation session into a different flow, the matched intent can be carried over and consumed in the target flow.

Form

A form is a data model that groups related parameters that can be collected from the user. The process in which the agent prompts the user and collects parameter values from the user is called form filling. A form can be added to a page. When form filling is done, the filled parameters will be written to the session.

FulfillIntentRequest

The request message for Sessions.FulfillIntent.

FulfillIntentResponse

The response message for Sessions.FulfillIntent.

Fulfillment

A fulfillment can do one or more of the following actions at the same time:

  • Generate rich message responses.
  • Set parameter values.
  • Call the webhook.

Fulfillments can be called at various stages in the Page or Form lifecycle. For example, when a DetectIntentRequest drives a session to enter a new page, the page's entry fulfillment can add a static response to the QueryResult in the returning DetectIntentResponse, call the webhook (for example, to load user data from a database), or both.
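
For instance, a fulfillment that returns a static text response, sets a session parameter, and calls a webhook could be sketched as follows (the webhook name and parameter values are placeholders):

    from google.cloud import dialogflowcx_v3beta1 as df

    fulfillment = df.Fulfillment(
        messages=[
            df.ResponseMessage(
                text=df.ResponseMessage.Text(text=["Your order is confirmed."])
            )
        ],
        set_parameter_actions=[
            df.Fulfillment.SetParameterAction(
                parameter="order_status",
                value="confirmed",  # assumed conversion of native values to protobuf Value
            )
        ],
        webhook="projects/my-project/locations/global/agents/my-agent-id/webhooks/my-webhook-id",
        tag="confirm-order",
    )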

GetAgentRequest

The request message for Agents.GetAgent.

GetEntityTypeRequest

The request message for EntityTypes.GetEntityType.

GetEnvironmentRequest

The request message for Environments.GetEnvironment.

GetFlowRequest

The request message for Flows.GetFlow.

GetIntentRequest

The request message for Intents.GetIntent.

GetPageRequest

The request message for Pages.GetPage.

GetSessionEntityTypeRequest

The request message for SessionEntityTypes.GetSessionEntityType.

GetTransitionRouteGroupRequest

The request message for TransitionRouteGroups.GetTransitionRouteGroup.

GetVersionRequest

The request message for Versions.GetVersion.

GetWebhookRequest

The request message for Webhooks.GetWebhook.

InputAudioConfig

Instructs the speech recognizer on how to process the audio content.

Intent

An intent represents a user's intent to interact with a conversational agent. You can provide information for the Dialogflow API to use to match user input to an intent by adding training phrases (i.e., examples of user input) to your intent.
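
A minimal sketch of defining an intent with two training phrases and creating it under an agent; resource names and phrases are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    intent = df.Intent(
        display_name="book.flight",
        training_phrases=[
            df.Intent.TrainingPhrase(
                parts=[df.Intent.TrainingPhrase.Part(text="I want to book a flight")],
                repeat_count=1,
            ),
            df.Intent.TrainingPhrase(
                parts=[df.Intent.TrainingPhrase.Part(text="Get me a plane ticket")],
                repeat_count=1,
            ),
        ],
    )

    client = df.IntentsClient()
    response = client.create_intent(
        parent="projects/my-project/locations/global/agents/my-agent-id",
        intent=intent,
    )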

IntentInput

Represents the intent to trigger programmatically rather than as a result of natural language processing.

ListAgentsRequest

The request message for Agents.ListAgents.

ListAgentsResponse

The response message for Agents.ListAgents.

ListEntityTypesRequest

The request message for EntityTypes.ListEntityTypes.

ListEntityTypesResponse

The response message for EntityTypes.ListEntityTypes.

ListEnvironmentsRequest

The request message for Environments.ListEnvironments.

ListEnvironmentsResponse

The response message for Environments.ListEnvironments.

ListFlowsRequest

The request message for Flows.ListFlows.

ListFlowsResponse

The response message for Flows.ListFlows.

ListIntentsRequest

The request message for Intents.ListIntents.

ListIntentsResponse

The response message for Intents.ListIntents.

ListPagesRequest

The request message for Pages.ListPages.

ListPagesResponse

The response message for Pages.ListPages.

ListSessionEntityTypesRequest

The request message for SessionEntityTypes.ListSessionEntityTypes.

ListSessionEntityTypesResponse

The response message for SessionEntityTypes.ListSessionEntityTypes.

ListTransitionRouteGroupsRequest

The request message for TransitionRouteGroups.ListTransitionRouteGroups.

ListTransitionRouteGroupsResponse

The response message for TransitionRouteGroups.ListTransitionRouteGroups.

ListVersionsRequest

The request message for Versions.ListVersions.

ListVersionsResponse

The response message for Versions.ListVersions.

ListWebhooksRequest

The request message for Webhooks.ListWebhooks.

ListWebhooksResponse

The response message for Webhooks.ListWebhooks.

LoadVersionRequest

The request message for Versions.LoadVersion.

LookupEnvironmentHistoryRequest

The request message for Environments.LookupEnvironmentHistory.

LookupEnvironmentHistoryResponse

The response message for Environments.LookupEnvironmentHistory.

Match

Represents one match result of Sessions.MatchIntent.

MatchIntentRequest

The request message for Sessions.MatchIntent.

MatchIntentResponse

The response message for Sessions.MatchIntent.

NluSettings

Settings related to NLU.

OutputAudioConfig

Instructs the speech synthesizer on how to generate the output audio content.

Page

A Dialogflow CX conversation (session) can be described and visualized as a state machine. The states of a CX session are represented by pages.

For each flow, you define many pages, where your combined pages can handle a complete conversation on the topics the flow is designed for. At any given moment, exactly one page is the current page, the current page is considered active, and the flow associated with that page is considered active. Every flow has a special start page. When a flow initially becomes active, the start page becomes the current page. For each conversational turn, the current page will either stay the same or transition to another page.

You configure each page to collect information from the end-user that is relevant for the conversational state represented by the page.

For more information, see the Page guide: https://cloud.google.com/dialogflow/cx/docs/concept/page.
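
As a sketch, a page with an entry fulfillment and a form that collects one required parameter might be built like this; display names, prompts, and the entity type reference are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    page = df.Page(
        display_name="Collect Trip Details",
        entry_fulfillment=df.Fulfillment(
            messages=[
                df.ResponseMessage(
                    text=df.ResponseMessage.Text(text=["Let's plan your trip."])
                )
            ]
        ),
        form=df.Form(
            parameters=[
                df.Form.Parameter(
                    display_name="destination",
                    # Entity types are referenced by resource name.
                    entity_type="projects/-/locations/-/agents/-/entityTypes/sys.geo-city",
                    required=True,
                    fill_behavior=df.Form.Parameter.FillBehavior(
                        initial_prompt_fulfillment=df.Fulfillment(
                            messages=[
                                df.ResponseMessage(
                                    text=df.ResponseMessage.Text(
                                        text=["Which city are you traveling to?"]
                                    )
                                )
                            ]
                        )
                    ),
                )
            ]
        ),
    )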

PageInfo

Represents page information communicated to and from the webhook.

QueryInput

Represents the query input. It can contain one of:

  1. A conversational query in the form of text.

  2. An intent query that specifies which intent to trigger.

  3. Natural language speech audio to be processed.

  4. An event to be triggered.
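
For example, the oneof can be populated in any of these ways (only one per request; all values below are placeholders):

    from google.cloud import dialogflowcx_v3beta1 as df

    # 1. Conversational text.
    text_query = df.QueryInput(
        text=df.TextInput(text="What are your opening hours?"),
        language_code="en",
    )

    # 2. An intent to trigger directly (full intent resource name).
    intent_query = df.QueryInput(
        intent=df.IntentInput(
            intent="projects/my-project/locations/global/agents/my-agent-id/intents/my-intent-id"
        ),
        language_code="en",
    )

    # 3. Natural language speech audio.
    audio_query = df.QueryInput(
        audio=df.AudioInput(
            config=df.InputAudioConfig(
                audio_encoding=df.AudioEncoding.AUDIO_ENCODING_LINEAR_16,
                sample_rate_hertz=16000,
            ),
            audio=b"",  # raw audio bytes would go here
        ),
        language_code="en",
    )

    # 4. An event to trigger.
    event_query = df.QueryInput(
        event=df.EventInput(event="welcome-event"),
        language_code="en",
    )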

QueryParameters

Represents the parameters of a conversational query.

QueryResult

Represents the result of a conversational query.

ResponseMessage

Represents a response message that can be returned by a conversational agent.

Response messages are also used for output audio synthesis. The approach is as follows:

  • If at least one OutputAudioText response is present, then all OutputAudioText responses are linearly concatenated, and the result is used for output audio synthesis.
  • If the OutputAudioText responses are a mixture of text and SSML, then the concatenated result is treated as SSML; otherwise, the result is treated as either text or SSML as appropriate. The agent designer should ideally use either text or SSML consistently throughout the bot design.
  • Otherwise, all Text responses are linearly concatenated, and the result is used for output audio synthesis.

This approach allows for more sophisticated user experience scenarios, where the text displayed to the user may differ from what is heard.

RestoreAgentRequest

The request message for Agents.RestoreAgent.

SentimentAnalysisResult

The result of sentiment analysis. Sentiment analysis inspects user input and identifies the prevailing subjective opinion, especially to determine a user's attitude as positive, negative, or neutral.

SessionEntityType

Session entity types are referred to as User entity types and are entities that are built for an individual user such as favorites, preferences, playlists, and so on.

You can redefine a session entity type at the session level to extend or replace a [custom entity type][google.cloud.dialogflow.cx.v3beta1.EntityType] at the user session level (we refer to the entity types defined at the agent level as "custom entity types").

Note: session entity types apply to all queries, regardless of the language.

For more information about entity types, see the Dialogflow documentation: https://cloud.google.com/dialogflow/docs/entities-overview.
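
A sketch of overriding a custom entity type for a single session; the session path and entity values are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    client = df.SessionEntityTypesClient()

    session = (
        "projects/my-project/locations/global/agents/my-agent-id/sessions/my-session-id"
    )

    session_entity_type = df.SessionEntityType(
        # The name embeds the session path and the entity type's display name.
        name=f"{session}/entityTypes/pizza.sauce",
        entity_override_mode=(
            df.SessionEntityType.EntityOverrideMode.ENTITY_OVERRIDE_MODE_OVERRIDE
        ),
        entities=[
            df.EntityType.Entity(value="bbq", synonyms=["bbq", "barbecue"]),
        ],
    )

    response = client.create_session_entity_type(
        parent=session,
        session_entity_type=session_entity_type,
    )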

SessionInfo

Represents session information communicated to and from the webhook.

SpeechToTextSettings

Settings related to speech recognition.

SpeechWordInfo

Information for a word recognized by the speech recognizer.

StreamingDetectIntentRequest

The top-level message sent by the client to the Sessions.StreamingDetectIntent method.

Multiple request messages should be sent in order:

  1. The first message must contain session, query_input plus optionally query_params. If the client wants to receive an audio response, it should also contain output_audio_config.

  2. If query_input was set to query_input.audio.config, all subsequent messages must contain query_input.audio.audio to continue with Speech recognition. If you decide instead to detect an intent from text input after you have already started Speech recognition, send a message with query_input.text.

    However, note that:

    • Dialogflow will bill you for the audio duration so far.
    • Dialogflow discards all Speech recognition results in favor of the input text.
    • Dialogflow will use the language code from the first message.

After you have sent all input, you must half-close or abort the request stream.
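
A sketch of that message sequence as a request generator passed to Sessions.StreamingDetectIntent; the session path, audio encoding, and audio frames are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    def request_generator(session, audio_chunks, language_code="en"):
        # First message: session and query_input carrying only the audio config.
        yield df.StreamingDetectIntentRequest(
            session=session,
            query_input=df.QueryInput(
                audio=df.AudioInput(
                    config=df.InputAudioConfig(
                        audio_encoding=df.AudioEncoding.AUDIO_ENCODING_LINEAR_16,
                        sample_rate_hertz=16000,
                    )
                ),
                language_code=language_code,
            ),
        )
        # Subsequent messages: audio content only.
        for chunk in audio_chunks:
            yield df.StreamingDetectIntentRequest(
                query_input=df.QueryInput(
                    audio=df.AudioInput(audio=chunk),
                    language_code=language_code,
                )
            )

    client = df.SessionsClient()
    session = (
        "projects/my-project/locations/global/agents/my-agent-id/sessions/my-session-id"
    )
    audio_chunks = [b"", b""]  # stand-in for real audio frames

    responses = client.streaming_detect_intent(
        requests=request_generator(session, audio_chunks)
    )
    for response in responses:
        if response.recognition_result.transcript:
            print("partial transcript:", response.recognition_result.transcript)
    # The last response carries the detect_intent_response.
    print(response.detect_intent_response.query_result)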

StreamingDetectIntentResponse

The top-level message returned from the StreamingDetectIntent method.

Multiple response messages can be returned in order:

  1. If the input was set to streaming audio, the first one or more messages contain recognition_result. Each recognition_result represents a more complete transcript of what the user said. The last recognition_result has is_final set to true.

  2. The last message contains detect_intent_response.

StreamingRecognitionResult

Contains a speech recognition result corresponding to a portion of the audio that is currently being processed or an indication that this is the end of the single requested utterance.

Example:

  1. transcript: "tube"

  2. transcript: "to be a"

  3. transcript: "to be"

  4. transcript: "to be or not to be" is_final: true

  5. transcript: " that's"

  6. transcript: " that is"

  7. message_type: END_OF_SINGLE_UTTERANCE

  8. transcript: " that is the question" is_final: true

Only two of the responses contain final results (#4 and #8, indicated by is_final: true). Concatenating these generates the full transcript: "to be or not to be that is the question".

In each response we populate:

  • for TRANSCRIPT: transcript and possibly is_final.

  • for END_OF_SINGLE_UTTERANCE: only message_type.

SynthesizeSpeechConfig

Configuration of how speech should be synthesized.

TextInput

Represents the natural language text to be processed.

TrainFlowRequest

The request message for Flows.TrainFlow.

TransitionRoute

A transition route specifies an intent that can be matched and/or a data condition that can be evaluated during a session. When a specified transition is matched, the following actions are taken in order:

  • If there is a [trigger_fulfillment][google.cloud.dialogflow.cx.v3beta1.TransitionRoute.trigger_fulfillment] associated with the transition, it will be called.
  • If there is a [target_page][google.cloud.dialogflow.cx.v3beta1.TransitionRoute.target_page] associated with the transition, the session will transition into the specified page.
  • If there is a [target_flow][google.cloud.dialogflow.cx.v3beta1.TransitionRoute.target_flow] associated with the transition, the session will transition into the specified flow.
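
A sketch of a route that matches an intent under a condition and then transitions to another page; the resource names and condition are placeholders:

    from google.cloud import dialogflowcx_v3beta1 as df

    route = df.TransitionRoute(
        intent="projects/my-project/locations/global/agents/my-agent-id/intents/my-intent-id",
        condition='$session.params.order_status = "confirmed"',
        trigger_fulfillment=df.Fulfillment(
            messages=[
                df.ResponseMessage(
                    text=df.ResponseMessage.Text(text=["Great, moving on."])
                )
            ]
        ),
        target_page="projects/my-project/locations/global/agents/my-agent-id"
                    "/flows/my-flow-id/pages/my-page-id",
    )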

TransitionRouteGroup

A TransitionRouteGroup represents a group of [TransitionRoutes][google.cloud.dialogflow.cx.v3beta1.TransitionRoute] to be used by a Page.

UpdateAgentRequest

The request message for Agents.UpdateAgent.

UpdateEntityTypeRequest

The request message for EntityTypes.UpdateEntityType.

UpdateEnvironmentRequest

The request message for Environments.UpdateEnvironment.

UpdateFlowRequest

The request message for Flows.UpdateFlow.

UpdateIntentRequest

The request message for Intents.UpdateIntent.

UpdatePageRequest

The request message for Pages.UpdatePage.

UpdateSessionEntityTypeRequest

The request message for SessionEntityTypes.UpdateSessionEntityType.

UpdateTransitionRouteGroupRequest

The request message for TransitionRouteGroups.UpdateTransitionRouteGroup.

UpdateVersionRequest

The request message for Versions.UpdateVersion.

UpdateWebhookRequest

The request message for Webhooks.UpdateWebhook.

Version

Represents a version of a flow.

VoiceSelectionParams

Description of which voice to use for speech synthesis.

Webhook

Webhooks host the developer's business logic. During a session, webhooks allow the developer to use the data extracted by Dialogflow's natural language processing to generate dynamic responses, validate collected data, or trigger actions on the backend.

WebhookRequest

The request message for a webhook call.

WebhookResponse

The response message for a webhook call.
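
A sketch of a response a webhook service might build to return a dynamic message and update a session parameter; the field values are placeholders, and the parameter map is assumed to accept plain Python values:

    from google.cloud import dialogflowcx_v3beta1 as df

    webhook_response = df.WebhookResponse(
        fulfillment_response=df.WebhookResponse.FulfillmentResponse(
            messages=[
                df.ResponseMessage(
                    text=df.ResponseMessage.Text(text=["Here is your dynamic answer."])
                )
            ],
            merge_behavior=df.WebhookResponse.FulfillmentResponse.MergeBehavior.REPLACE,
        ),
        session_info=df.SessionInfo(
            parameters={"order_status": "confirmed"},  # assumed native-value conversion
        ),
    )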