# Use a LlamaIndex Query Pipeline agent

| **Preview**
|
| This feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section
| of the [Service Specific Terms](/terms/service-terms#1).
|
| Pre-GA features are available "as is" and might have limited support.
|
| For more information, see the
| [launch stage descriptions](/products#product-launch-stages).

In addition to the general instructions for [using an agent](/vertex-ai/generative-ai/docs/agent-engine/use),
this page describes features that are specific to `LlamaIndexQueryPipelineAgent`.

Before you begin
----------------

This tutorial assumes that you have read and followed the instructions in:

- [Develop a LlamaIndexQueryPipeline agent](/vertex-ai/generative-ai/docs/agent-engine/develop/llama-index/query-pipeline): develop `agent` as an instance of `LlamaIndexQueryPipelineAgent`.
- [User authentication](/vertex-ai/generative-ai/docs/agent-engine/set-up#authentication): authenticate as a user for querying the agent.

Supported operations
--------------------

The following operations are supported for `LlamaIndexQueryPipelineAgent`:

- [`query`](/vertex-ai/generative-ai/docs/agent-engine/use#query-agent): gets a response to a query synchronously.

The `query` method supports the following type of argument:

- [`input`](#input-messages): the messages to be sent to the agent.

Query the agent
---------------

The command:

    agent.query(input="What is Paul Graham's life in college?")

is equivalent to the following (in full form):

    agent.query(input={"input": "What is Paul Graham's life in college?"})

To customize the input dictionary, see
[Customize the prompt template](/vertex-ai/generative-ai/docs/agent-engine/develop/llama-index/query-pipeline#prompt-template).

You can also customize the agent's behavior beyond `input` by passing additional keyword arguments to `query()`:

    response = agent.query(
        input={
            "input": [
                "What is Paul Graham's life in college?",
                "How did Paul Graham's college experience shape his career?",
                "How did Paul Graham's college experience shape his entrepreneurial mindset?",
            ],
        },
        batch=True,  # run the pipeline in batch mode and pass a list of inputs
    )
    print(response)

See the [`QueryPipeline.run` code](https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/query_pipeline/query.py#L392) for a complete list of available parameters.

What's next
-----------

- [Use an agent](/vertex-ai/generative-ai/docs/agent-engine/use)
- [Evaluate an agent](/vertex-ai/generative-ai/docs/agent-engine/evaluate)
- [Manage deployed agents](/vertex-ai/generative-ai/docs/agent-engine/manage)
- [Get support](/vertex-ai/generative-ai/docs/agent-engine/support)

Last updated 2025-08-29 UTC.
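The equivalence between the short form and the full dictionary form of `query()` can be pictured with a small, hypothetical normalization helper. Note that `normalize_query_input` is illustrative only and is not part of the Agent Engine or LlamaIndex API; it is a sketch of the assumed wrapping behavior:

```python
def normalize_query_input(value):
    """Hypothetical helper (not the real Agent Engine API): wrap a bare
    string, or a list of strings for batch mode, into the full
    dictionary form shown in the examples above."""
    if isinstance(value, (str, list)):
        return {"input": value}
    return value  # already in full dictionary form


# The short form and the full form normalize to the same payload.
short = normalize_query_input("What is Paul Graham's life in college?")
full = normalize_query_input({"input": "What is Paul Graham's life in college?"})
print(short == full)  # True
```

The same wrapping applies in batch mode, where the value of `"input"` is a list of queries rather than a single string.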