Class CodeChatModel (1.95.1)
CodeChatModel(model_id: str, endpoint_name: typing.Optional[str] = None)
CodeChatModel represents a model that is capable of completing code.
Examples:

    code_chat_model = CodeChatModel.from_pretrained("codechat-bison@001")

    code_chat = code_chat_model.start_chat(
        context="I'm writing a large-scale enterprise application.",
        max_output_tokens=128,
        temperature=0.2,
    )

    code_chat.send_message("Please help write a function to calculate the min of two numbers")
Methods
CodeChatModel
CodeChatModel(model_id: str, endpoint_name: typing.Optional[str] = None)
Creates a LanguageModel.
This constructor should not be called directly.
Use CodeChatModel.from_pretrained(model_name=...) instead.
from_pretrained
from_pretrained(model_name: str) -> vertexai._model_garden._model_garden_models.T
Loads a _ModelGardenModel.
Exceptions

| Type | Description |
|---|---|
| ValueError | If model_name is unknown. |
| ValueError | If model does not support this class. |
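Both error cases can be mirrored locally. A minimal sketch, assuming a hypothetical check_model_name helper and an illustrative set of supported model IDs (neither is part of the SDK):

```python
# Hypothetical sketch of the validation behind from_pretrained's
# ValueError cases; the helper and the model set are illustrative,
# not part of the Vertex AI SDK.
SUPPORTED_MODELS = {"codechat-bison@001", "codechat-bison@002"}

def check_model_name(model_name: str) -> str:
    """Raise ValueError for an unknown model name, as from_pretrained does."""
    if model_name not in SUPPORTED_MODELS:
        raise ValueError(f"Unknown model name: {model_name!r}")
    return model_name
```

In application code, wrapping CodeChatModel.from_pretrained(...) in a try/except ValueError handles both documented cases the same way.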
get_tuned_model
get_tuned_model(
tuned_model_name: str,
) -> vertexai.language_models._language_models._LanguageModel
Loads the specified tuned language model.
list_tuned_model_names
list_tuned_model_names() -> typing.Sequence[str]
Lists the names of tuned models.
start_chat
start_chat(
*,
context: typing.Optional[str] = None,
max_output_tokens: typing.Optional[int] = None,
temperature: typing.Optional[float] = None,
message_history: typing.Optional[
typing.List[vertexai.language_models.ChatMessage]
] = None,
stop_sequences: typing.Optional[typing.List[str]] = None
) -> vertexai.language_models.CodeChatSession
Starts a chat session with the code chat model.
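The stop_sequences parameter ends generation at the first occurrence of any listed string. A local sketch of that truncation behaviour (the apply_stop_sequences helper is illustrative, not part of the SDK):

```python
from typing import List

def apply_stop_sequences(text: str, stop_sequences: List[str]) -> str:
    """Truncate text at the earliest occurrence of any stop sequence,
    mirroring how stop_sequences bounds model output."""
    cut = len(text)
    for seq in stop_sequences:
        idx = text.find(seq)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]
```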
tune_model
tune_model(
training_data: typing.Union[str, pandas.core.frame.DataFrame],
*,
train_steps: typing.Optional[int] = None,
learning_rate_multiplier: typing.Optional[float] = None,
tuning_job_location: typing.Optional[str] = None,
tuned_model_location: typing.Optional[str] = None,
model_display_name: typing.Optional[str] = None,
default_context: typing.Optional[str] = None,
accelerator_type: typing.Optional[typing.Literal["TPU", "GPU"]] = None,
tuning_evaluation_spec: typing.Optional[
vertexai.language_models.TuningEvaluationSpec
] = None
) -> vertexai.language_models._language_models._LanguageModelTuningJob
Tunes a model based on training data.
This method launches and returns an asynchronous model tuning job.
Usage:

    tuning_job = model.tune_model(...)
    # ... do some other work
    tuned_model = tuning_job.get_tuned_model()  # Blocks until tuning is complete.
Exceptions

| Type | Description |
|---|---|
| ValueError | If the "tuning_job_location" value is not supported. |
| ValueError | If the "tuned_model_location" value is not supported. |
| RuntimeError | If the model does not support tuning. |
| AttributeError | If any attribute in the "tuning_evaluation_spec" is not supported. |
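training_data accepts either a pandas DataFrame or a string (typically a Cloud Storage URI of a JSONL file of input/output pairs). A minimal sketch of building such a file locally — the input_text/output_text field names are the commonly documented schema for Vertex AI language model tuning, so verify them against the current tuning docs:

```python
import json

# Illustrative tuning examples; the field names assume the documented
# input_text/output_text schema for Vertex AI language model tuning.
examples = [
    {
        "input_text": "Write a function that returns the min of two numbers.",
        "output_text": "def min2(a, b):\n    return a if a < b else b",
    },
]

# Write one JSON object per line (JSONL), one object per tuning example.
with open("tuning_data.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```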
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-07 UTC.