```python
PaLM2TextEmbeddingGenerator(
    model_name: typing.Literal[
        "textembedding-gecko", "textembedding-gecko-multilingual"
    ] = "textembedding-gecko",
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None,
)
```
PaLM2 text embedding generator LLM model.
Parameters

| Name | Description |
| --- | --- |
| model_name | str, default "textembedding-gecko". The model for text embedding. "textembedding-gecko" returns model embeddings for text inputs. "textembedding-gecko-multilingual" returns model embeddings for text inputs and supports over 100 languages. |
| session | bigframes.Session or None. BQ session to create the model. If None, the global default session is used. |
| connection_name | str or None. Connection to connect with the remote service, a string of the format `<PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>`. |
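A minimal construction sketch, assuming an active BigQuery DataFrames session with a default project; the connection name below is a hypothetical placeholder.

```python
from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

# Assumes a configured default session; "my-project.us.my-connection" is a
# hypothetical BigQuery connection in <PROJECT>.<LOCATION>.<CONNECTION_ID> format.
model = PaLM2TextEmbeddingGenerator(
    model_name="textembedding-gecko",
    connection_name="my-project.us.my-connection",
)
```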
Methods
__repr__
```python
__repr__()
```
Print the estimator's constructor with all non-default parameter values
get_params
```python
get_params(deep: bool = True) -> typing.Dict[str, typing.Any]
```
Get parameters for this estimator.
Parameter

| Name | Description |
| --- | --- |
| deep | bool, default True. If True, will return the parameters for this estimator and contained subobjects that are estimators. |

Returns

| Type | Description |
| --- | --- |
| Dictionary | A dictionary of parameter names mapped to their values. |
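A short sketch of get_params, assuming a default session and connection are available; it simply echoes the constructor arguments back as a dictionary.

```python
from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

# Assumes the global default session and connection can be used.
model = PaLM2TextEmbeddingGenerator(model_name="textembedding-gecko-multilingual")

# Returns the constructor arguments, e.g.
# {'connection_name': None, 'model_name': 'textembedding-gecko-multilingual', ...}
print(model.get_params())
```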
predict
```python
predict(
    X: typing.Union[bigframes.dataframe.DataFrame, bigframes.series.Series]
) -> bigframes.dataframe.DataFrame
```
Predict the result from input DataFrame.
Parameter

| Name | Description |
| --- | --- |
| X | bigframes.dataframe.DataFrame or bigframes.series.Series. Input DataFrame or Series, which must contain a column named "content". Only this column is used as input. Content can include preamble, questions, suggestions, instructions, or examples. |

Returns

| Type | Description |
| --- | --- |
| bigframes.dataframe.DataFrame | Output DataFrame with a single column containing the embedding results. |
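An end-to-end sketch of predict: build a small DataFrame with the required "content" column and pass it to the model. The text values are illustrative, and the sketch assumes an active session and a usable BigQuery connection.

```python
import bigframes.pandas as bpd
from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

# The input must contain a "content" column; any other columns are ignored.
df = bpd.DataFrame(
    {"content": ["What is BigQuery DataFrames?", "How do text embeddings work?"]}
)

model = PaLM2TextEmbeddingGenerator(model_name="textembedding-gecko")
embeddings = model.predict(df)  # DataFrame with a single embedding column
print(embeddings.head())
```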
register
```python
register(vertex_ai_model_id: typing.Optional[str] = None) -> bigframes.ml.base._T
```
Register the model to Vertex AI.
After registering, go to the Google Cloud Console (https://console.cloud.google.com/vertex-ai/models) to manage the model registry. Refer to https://cloud.google.com/vertex-ai/docs/model-registry/introduction for more options.
Parameter

| Name | Description |
| --- | --- |
| vertex_ai_model_id | Optional[str], default None. Optional string ID to use as the model ID in Vertex AI. If not set, it defaults to 'bigframes_{bq_model_id}'. The Vertex AI model ID will be truncated to 63 characters due to its length limit. |
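A brief sketch of register, assuming a model created as in the earlier examples; "my_text_embedder" is a hypothetical Vertex AI model ID.

```python
from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

# Assumes a default session and connection are available.
model = PaLM2TextEmbeddingGenerator()

# Registers the underlying BQ ML model in the Vertex AI Model Registry;
# "my_text_embedder" is a placeholder (IDs longer than 63 characters are truncated).
model.register(vertex_ai_model_id="my_text_embedder")
```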