Class SubjectReferenceImage (1.95.1)
SubjectReferenceImage(
    reference_id,
    image: typing.Optional[
        typing.Union[bytes, vertexai.vision_models.Image, str]
    ] = None,
    subject_description: typing.Optional[str] = None,
    subject_type: typing.Optional[
        typing.Literal["default", "person", "animal", "product"]
    ] = None,
)
Subject reference image.
This encapsulates the subject reference image type.
Methods
SubjectReferenceImage
SubjectReferenceImage(
    reference_id,
    image: typing.Optional[
        typing.Union[bytes, vertexai.vision_models.Image, str]
    ] = None,
    subject_description: typing.Optional[str] = None,
    subject_type: typing.Optional[
        typing.Literal["default", "person", "animal", "product"]
    ] = None,
)
Creates a SubjectReferenceImage object.
Parameters

image
    typing.Union[bytes, vertexai.vision_models.Image, str, NoneType]
    Either an Image object or image file bytes. The image can be in PNG or JPEG format.

subject_description
    typing.Optional[str]
    Subject description for the image.

subject_type
    typing.Optional[typing.Literal['default', 'person', 'animal', 'product']]
    Subject type for the image. Can take the following values:
    * default: Default subject type
    * person: Person subject type
    * animal: Animal subject type
    * product: Product subject type
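The sketch below shows one way to construct a SubjectReferenceImage for an animal subject. Only the constructor parameters come from the signature above; the file path, reference_id value, and subject description are illustrative assumptions, and Image.load_from_file is assumed to be available as in vertexai.vision_models.

    # Minimal sketch, not an official sample. Names marked below are assumptions.
    from vertexai.preview.vision_models import Image, SubjectReferenceImage

    # Load the reference image from a local PNG or JPEG file
    # ("subject.png" is a hypothetical path).
    subject_image = Image.load_from_file("subject.png")

    # reference_id identifies this reference image so it can be referred to
    # elsewhere (for example, from a prompt); the value 1 is illustrative.
    reference = SubjectReferenceImage(
        reference_id=1,
        image=subject_image,
        subject_description="a brown tabby cat",  # illustrative description
        subject_type="animal",  # one of: default, person, animal, product
    )

Passing raw bytes or a file path string for image should also work, per the typing.Union in the signature.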