Streaming import lets you make real-time updates to feature values. This
method is useful when having the latest available data for online serving is a
priority. For example, you can import streaming event data and, within a few seconds, Vertex AI Feature Store (Legacy) makes that data available for online serving scenarios.
If you must backfill data or if you compute feature values in bulk, use batch import. Compared to streaming import requests, batch
import requests can handle larger payloads but take longer to complete.
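The trade-off above can be sketched as a small decision helper. This is illustrative only; the flags are hypothetical and not Vertex AI API parameters:

```python
def choose_import_method(needs_low_latency: bool, is_backfill: bool) -> str:
    """Illustrative only: pick an import path per the guidance above.

    `needs_low_latency` and `is_backfill` are hypothetical flags,
    not Vertex AI API parameters.
    """
    # Backfills and bulk feature computation go through batch import.
    if is_backfill:
        return "batch"
    # Otherwise, prefer streaming when freshness for online serving matters.
    return "streaming" if needs_low_latency else "batch"
```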
For information about the oldest feature value timestamp that you can import, see Vertex AI Feature Store (Legacy) in Quotas and limits.
You can't import feature values whose timestamps indicate future dates or times.
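As an illustration, a client-side check of these timestamp rules might look like the following sketch. The `max_age_days` window is a placeholder assumption; the actual oldest-timestamp bound is documented in Quotas and limits:

```python
from datetime import datetime, timedelta, timezone


def validate_feature_timestamp(ts: str, max_age_days: int = 365) -> None:
    """Reject timestamps the feature store would not accept.

    `max_age_days` is a placeholder; see Quotas and limits for the real
    oldest-timestamp bound. `ts` must be RFC3339 UTC,
    for example "2024-04-01T00:00:00Z".
    """
    parsed = datetime.fromisoformat(ts.replace("Z", "+00:00"))
    now = datetime.now(timezone.utc)
    # Future-dated feature values are rejected outright.
    if parsed > now:
        raise ValueError("feature value timestamps must not be in the future")
    # Values older than the quota-documented window can't be imported.
    if parsed < now - timedelta(days=max_age_days):
        raise ValueError("timestamp is older than the importable window")
```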
Example use case
An online retail organization might provide a personalized shopping experience by using a user's current activity. As users navigate the website, you can capture their activity into a featurestore and then, soon after, serve all of that information for online predictions. This real-time import and serving can help you show useful, relevant recommendations to customers during their shopping session.
Online storage node usage
Writing feature values to an online store uses the featurestore's CPU resources (online storage nodes). Monitor your CPU usage to check that demand doesn't exceed supply, which can lead to serving errors. We recommend a usage rate of about 70% or lower to avoid these errors. If you regularly exceed that value, you can update your featurestore to increase the number of nodes or use autoscaling. For more information, see Manage featurestores.
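The 70% guideline can be expressed as a simple check over utilization samples. The values here are made up for illustration; in practice you would read node CPU utilization from Cloud Monitoring:

```python
def needs_more_nodes(cpu_utilizations: list[float], threshold: float = 0.70) -> bool:
    """Return True if average CPU usage across online storage nodes
    exceeds the recommended threshold.

    `cpu_utilizations` are fractions in [0, 1], one per sampling period;
    in practice they would come from Cloud Monitoring, not hardcoded values.
    """
    if not cpu_utilizations:
        return False
    return sum(cpu_utilizations) / len(cpu_utilizations) > threshold


# Example: sustained ~85% usage suggests adding nodes or enabling autoscaling.
```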
Streaming import
Write a value to a particular feature. The feature value must be included as part of the import request. You can't stream data directly from a data source.
If you're writing to a recently created feature, wait a few minutes before doing so, because the new feature might not have propagated yet. If you don't wait, you might see a resource not found error.
You can import feature values for only one entity per write. For any given project and region, you can simultaneously write feature values for multiple entities within a maximum of ten different entity types. This limit includes streaming import requests to all featurestores in a given project and region. If you exceed this limit, Vertex AI Feature Store (Legacy) might not write all of your data to the offline store. If this occurs, Vertex AI Feature Store (Legacy) logs the error in the Logs Explorer. For more information, see Monitor offline storage write errors for streaming import.
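To stay within these limits client-side, you could group pending records so that each request writes a single entity, and fail fast when more than ten entity types would be written at once. This is an illustrative sketch, not part of the Vertex AI SDK:

```python
from collections import defaultdict

# Documented limit: at most ten entity types written simultaneously
# per project and region, across all featurestores.
MAX_ENTITY_TYPES = 10


def plan_write_requests(records):
    """Group (entity_type_id, entity_id, feature_values) records into
    one write per entity, enforcing the concurrent entity-type limit.

    Returns a dict mapping (entity_type_id, entity_id) to merged feature values.
    """
    entity_types = {r[0] for r in records}
    if len(entity_types) > MAX_ENTITY_TYPES:
        raise ValueError(
            f"{len(entity_types)} entity types exceeds the limit of {MAX_ENTITY_TYPES}"
        )
    requests = defaultdict(dict)
    for entity_type_id, entity_id, feature_values in records:
        # Each streaming write request targets exactly one entity.
        requests[(entity_type_id, entity_id)].update(feature_values)
    return dict(requests)
```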
REST
To import feature values for existing features, send a POST request by using the featurestores.entityTypes.writeFeatureValues method. If the names of the source data columns and the destination feature IDs are different, include the sourceField parameter. Note that featurestores.entityTypes.writeFeatureValues lets you import feature values for only one entity at a time.
Before using any of the request data, make the following replacements:

- LOCATION: Region where the featurestore is created. For example, us-central1.
- PROJECT: Your project ID.
- FEATURESTORE_ID: ID of the featurestore.
- ENTITY_TYPE_ID: ID of the entity type.
- FEATURE_ID: ID of an existing feature in the featurestore to write values for.
- VALUE_TYPE: The value type of the feature.
- VALUE: Value for the feature.
- TIME_STAMP (optional): The time at which the feature was generated. The timestamp must be in the RFC3339 UTC format.

HTTP method and URL:

```
POST https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT/locations/LOCATION/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:writeFeatureValues
```

Request JSON body:

```
{
  "payloads": [
    {
      "entityId": "ENTITY_ID",
      "featureValues": {
        "FEATURE_ID": {
          "VALUE_TYPE": VALUE,
          "metadata": {"generate_time": "TIME_STAMP"}
        }
      }
    }
  ]
}
```

To send your request, choose one of these options.

curl

Note: The following commands assume that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

```
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT/locations/LOCATION/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:writeFeatureValues"
```

PowerShell

Save the request body in a file named request.json, and execute the following command:

```
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
  -Method POST `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -InFile request.json `
  -Uri "https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT/locations/LOCATION/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:writeFeatureValues" | Select-Object -Expand Content
```

You should receive a successful status code (2xx) and an empty response.

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

```
from google.cloud import aiplatform


def write_feature_values_sample(
    project: str, location: str, entity_type_id: str, featurestore_id: str
):
    aiplatform.init(project=project, location=location)

    my_entity_type = aiplatform.featurestore.EntityType(
        entity_type_name=entity_type_id, featurestore_id=featurestore_id
    )

    my_data = {
        "movie_01": {
            "title": "The Shawshank Redemption",
            "average_rating": 4.7,
            "genre": "Drama",
        },
    }

    my_entity_type.write_feature_values(instances=my_data)
```

Additional languages

You can install and use the following Vertex AI client libraries to call the Vertex AI API. Cloud Client Libraries provide an optimized developer experience by using the natural conventions and styles of each supported language.

- Java
- Node.js

What's next

- Learn how to monitor offline storage write errors for streaming import.
- Learn how to serve features through online serving or batch serving.
- Troubleshoot common Vertex AI Feature Store (Legacy) issues.
- To see an end-to-end example, run the "Example Feature Store workflow with sample data" notebook: https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/feature_store_legacy/sdk-feature-store.ipynb

Last updated 2025-09-02 UTC.