API documentation for the prompts module.
Classes
Prompt
Prompt(
    prompt_data: typing.Optional[
        typing.Union[
            str,
            vertexai.generative_models._generative_models.Image,
            vertexai.generative_models._generative_models.Part,
            typing.List[
                typing.Union[
                    str,
                    vertexai.generative_models._generative_models.Image,
                    vertexai.generative_models._generative_models.Part,
                ]
            ],
        ]
    ] = None,
    *,
    variables: typing.Optional[
        typing.List[
            typing.Dict[
                str,
                typing.Union[
                    str,
                    vertexai.generative_models._generative_models.Image,
                    vertexai.generative_models._generative_models.Part,
                    typing.List[
                        typing.Union[
                            str,
                            vertexai.generative_models._generative_models.Image,
                            vertexai.generative_models._generative_models.Part,
                        ]
                    ],
                ],
            ]
        ]
    ] = None,
    prompt_name: typing.Optional[str] = None,
    generation_config: typing.Optional[
        vertexai.generative_models._generative_models.GenerationConfig
    ] = None,
    model_name: typing.Optional[str] = None,
    safety_settings: typing.Optional[
        vertexai.generative_models._generative_models.SafetySetting
    ] = None,
    system_instruction: typing.Optional[
        typing.Union[
            str,
            vertexai.generative_models._generative_models.Image,
            vertexai.generative_models._generative_models.Part,
            typing.List[
                typing.Union[
                    str,
                    vertexai.generative_models._generative_models.Image,
                    vertexai.generative_models._generative_models.Part,
                ]
            ],
        ]
    ] = None,
    tools: typing.Optional[
        typing.List[vertexai.generative_models._generative_models.Tool]
    ] = None,
    tool_config: typing.Optional[
        vertexai.generative_models._generative_models.ToolConfig
    ] = None
)
A prompt, which may be a template with variables.
The Prompt class allows users to define a template string with variables represented in curly braces, e.g. {variable}. The variable name must be a valid Python variable name (no spaces, must start with a letter). These placeholders can be replaced with specific values using the assemble_contents method, providing flexibility in generating dynamic prompts.
Usage: Generate content from a single set of variables:
```
prompt = Prompt(
    prompt_data="Hello, {name}! Today is {day}. How are you?",
    variables=[{"name": "Alice", "day": "Monday"}],
    generation_config=GenerationConfig(
        temperature=0.1,
        top_p=0.95,
        top_k=20,
        candidate_count=1,
        max_output_tokens=100,
    ),
    model_name="gemini-1.0-pro-002",
    safety_settings=[SafetySetting(
        category=SafetySetting.HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT,
        threshold=SafetySetting.HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
        method=SafetySetting.HarmBlockMethod.SEVERITY,
    )],
    system_instruction="Please answer in a short sentence.",
)

# Generate content using the assembled prompt.
prompt.generate_content(
    contents=prompt.assemble_contents(**prompt.variables[0])
)
```
Generate content with multiple sets of variables:
```
prompt = Prompt(
    prompt_data="Hello, {name}! Today is {day}. How are you?",
    variables=[
        {"name": "Alice", "day": "Monday"},
        {"name": "Bob", "day": "Tuesday"},
    ],
    generation_config=GenerationConfig(
        temperature=0.1,
        top_p=0.95,
        top_k=20,
        candidate_count=1,
        max_output_tokens=100,
    ),
    model_name="gemini-1.0-pro-002",
    safety_settings=[SafetySetting(
        category=SafetySetting.HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT,
        threshold=SafetySetting.HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
        method=SafetySetting.HarmBlockMethod.SEVERITY,
    )],
    system_instruction="Please answer in a short sentence.",
)

# Generate content using the assembled prompt for each variable set.
for i in range(len(prompt.variables)):
    prompt.generate_content(
        contents=prompt.assemble_contents(**prompt.variables[i])
    )
```
Functions
create_version
create_version(
prompt: vertexai.prompts._prompts.Prompt,
prompt_id: typing.Optional[str] = None,
version_name: typing.Optional[str] = None,
) -> vertexai.prompts._prompts.Prompt
Creates a Prompt or Prompt version in the online prompt store.
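For illustration, a minimal sketch of saving a prompt and adding a version, assuming the module is exposed as `from vertexai.preview import prompts` and `Prompt` as `vertexai.preview.prompts.Prompt` (the exact import paths may differ by SDK version); the project settings, prompt ID, and version name are placeholders.
```
import vertexai
from vertexai.preview import prompts
from vertexai.preview.prompts import Prompt

# Placeholder project settings.
vertexai.init(project="my-project", location="us-central1")

local_prompt = Prompt(
    prompt_data="Hello, {name}! How are you?",
    variables=[{"name": "Alice"}],
    model_name="gemini-1.0-pro-002",
    prompt_name="greeting-prompt",
)

# Save the prompt to the online prompt store; a new prompt resource is
# created and returned as a Prompt object.
saved_prompt = prompts.create_version(prompt=local_prompt)

# Saving again with an explicit prompt_id adds a new version to that
# resource instead of creating a new one (the ID here is a placeholder).
updated_prompt = prompts.create_version(
    prompt=saved_prompt, prompt_id="123456789", version_name="v2"
)
```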
delete
delete(prompt_id: str) -> None
Deletes the online prompt resource associated with the prompt ID.
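A brief sketch, with the same assumed `from vertexai.preview import prompts` import as above and a placeholder ID:
```
from vertexai.preview import prompts

# Remove the stored prompt resource; use an ID returned by
# create_version() or found via list().
prompts.delete(prompt_id="123456789")
```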
get
get(
prompt_id: str, version_id: typing.Optional[str] = None
) -> vertexai.prompts._prompts.Prompt
Creates a Prompt object from an online resource.
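A brief sketch of loading a stored prompt, under the same assumed import; the IDs are placeholders:
```
from vertexai.preview import prompts

# Load the latest version of a stored prompt.
prompt = prompts.get(prompt_id="123456789")

# Or load a specific version of the same resource.
prompt_v1 = prompts.get(prompt_id="123456789", version_id="1")
```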
list
list() -> list[vertexai.prompts._prompt_management.PromptMetadata]
Lists all prompt resources in the online prompt store associated with the project.
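A brief sketch of iterating over stored prompts, under the same assumed import; the PromptMetadata field names printed here are assumptions, so check the metadata object in your SDK version:
```
from vertexai.preview import prompts

for prompt_metadata in prompts.list():
    # Field names below are assumed for illustration.
    print(prompt_metadata.prompt_id, prompt_metadata.display_name)
```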
list_versions
list_versions(
prompt_id: str,
) -> list[vertexai.prompts._prompt_management.PromptVersionMetadata]
Returns a list of PromptVersionMetadata objects for the prompt resource.
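A brief sketch of listing the versions of one prompt resource, under the same assumed import; the prompt ID and the PromptVersionMetadata field names are assumptions:
```
from vertexai.preview import prompts

for version_metadata in prompts.list_versions(prompt_id="123456789"):
    # Field names below are assumed for illustration.
    print(version_metadata.version_id, version_metadata.display_name)
```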
restore_version
restore_version(
prompt_id: str, version_id: str
) -> vertexai.prompts._prompt_management.PromptVersionMetadata
Restores a previous version of the prompt resource and loads that version into the current Prompt object.
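A brief sketch of rolling a prompt resource back to an earlier version and reloading it, under the same assumed import; the IDs are placeholders:
```
from vertexai.preview import prompts

# Roll the prompt resource back to version "1".
prompts.restore_version(prompt_id="123456789", version_id="1")

# Re-fetch the prompt to work with the restored version.
prompt = prompts.get(prompt_id="123456789")
```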