This page shows you an example of using Apache Hive with a Dataproc Metastore service. In this example, you launch a Hive session on a Dataproc cluster,
and then run sample commands to create a database and a table.
To start using Hive, use SSH to connect to the Dataproc cluster that's associated with your Dataproc Metastore service.
Once connected, you can run Hive commands from the SSH terminal window in your browser to manage your metadata.
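If you prefer to connect from your own terminal instead of the browser-based SSH window, a minimal sketch using the gcloud CLI is shown below; CLUSTER_NAME-m (the cluster's master node) and ZONE are placeholders that you replace with your own cluster name and zone.

    # Connect over SSH to the Dataproc cluster's master node.
    # Replace CLUSTER_NAME and ZONE with your cluster's name and zone.
    gcloud compute ssh CLUSTER_NAME-m --zone=ZONE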
To connect to Hive
In the Google Cloud console, go to the VM
Instances page.
In the list of virtual machine instances, click SSH in the row of the
Dataproc VM instance that you want to connect to.
A browser window opens in your home directory on the node, with output similar to the following:
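    Connected, host fingerprint: ssh-rsa ...
    Linux cluster-1-m 3.16.0-0.bpo.4-amd64 ...
    ...
    example-cluster@cluster-1-m:~$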
To start Hive and create a database and table, run the following commands
in the SSH session:
Start Hive.
hive
Create a database named myDatabase.
create database myDatabase;
Show the database that you created.
show databases;
Use the database that you created.
use myDatabase;
Create a table named myTable.
create table myTable(id int,name string);
List the tables under myDatabase.
show tables;
Describe the table that you created to show its columns.
desc myTable;
Running these commands generates output similar to the following:
$hive
hive> show databases;
OK
default
hive> create database myDatabase;
OK
hive> use myDatabase;
OK
hive> create table myTable(id int,name string);
OK
hive> show tables;
OK
myTable
hive> desc myTable;
OK
id int
name string
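As an optional variation on the interactive session above, you can run the same statements non-interactively from the SSH session by passing them to the Hive CLI with the -e flag. This is a minimal sketch that assumes the myDatabase and myTable names used earlier:

    # Run the example statements in a single non-interactive Hive invocation.
    hive -e "create database if not exists myDatabase;
             use myDatabase;
             create table if not exists myTable(id int, name string);
             show tables;
             desc myTable;"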
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-08-27 UTC."],[[["\u003cp\u003eThis guide demonstrates how to use Apache Hive with a Dataproc Metastore service by launching a Hive session on a Dataproc cluster.\u003c/p\u003e\n"],["\u003cp\u003eYou must first create a Dataproc Metastore service and attach it to a Dataproc cluster, including setting the Hive metastore password.\u003c/p\u003e\n"],["\u003cp\u003eConnect to Hive through an SSH connection to the associated Dataproc cluster VM instance.\u003c/p\u003e\n"],["\u003cp\u003eOnce connected, you can run Hive commands to create and manage databases and tables, as demonstrated with the \u003ccode\u003emyDatabase\u003c/code\u003e and \u003ccode\u003emyTable\u003c/code\u003e examples.\u003c/p\u003e\n"],["\u003cp\u003eThe guide also presents examples of how to perform commands such as showing databases and tables, creating databases and tables, and showing descriptions of the tables.\u003c/p\u003e\n"]]],[],null,["# Use Apache Hive with Dataproc Metastore\n\nThis page shows you an example of using Apache Hive with a Dataproc Metastore\nservice. In this example, you launch a Hive session on a Dataproc cluster,\nand then run sample commands to create a database and table.\n\nBefore you begin\n----------------\n\n- Create a [Dataproc Metastore service](/dataproc-metastore/docs/create-service).\n- Attach the [Dataproc Metastore service to a Dataproc cluster](/dataproc-metastore/docs/attach-dataproc).\n - [Set the Hive metastore password](/dataproc/docs/concepts/configuring-clusters/secure-hive-metastore#set_the_hive_metastore_password).\n\nConnect to Apache Hive\n----------------------\n\nTo start using Hive, use SSH to connect to the Dataproc\ncluster that's associated with your Dataproc Metastore service.\nOnce connected, you can run Hive commands from the SSH terminal window\nin your browser to manage your metadata.\n\n**To connect to Hive**\n\n1. In the Google Cloud console, go to the [VM\n Instances](https://console.cloud.google.com/compute/instances) page.\n2. In the list of virtual machine instances, click **SSH** in the row of the Dataproc VM instance that you want to connect to.\n\nA browser window opens in your home directory on the node with an output similar\nto the following: \n\n Connected, host fingerprint: ssh-rsa ...\n Linux cluster-1-m 3.16.0-0.bpo.4-amd64 ...\n ...\n example-cluster@cluster-1-m:~$\n\nTo start Hive and create a database and table, run the following commands\nin the SSH session:\n\n1. Start Hive.\n\n hive\n\n2. Create a database named `myDatabase`.\n\n create database myDatabase;\n\n3. Show the database that you created.\n\n show databases;\n\n4. Use the database that you created.\n\n use myDatabase;\n\n5. Create a table named `myTable`.\n\n create table myTable(id int,name string);\n\n6. List the tables under `myDatabase`.\n\n show tables;\n\n7. 
Show the table rows in the table that you created.\n\n desc MyTable;\n\nRunning the following commands generates output similar to the following: \n\n $hive\n\n hive\u003e show databases;\n OK\n default\n hive\u003e create database myDatabase;\n OK\n hive\u003e use myDatabase;\n OK\n hive\u003e create table myTable(id int,name string);\n OK\n hive\u003e show tables;\n OK\n myTable\n hive\u003e desc myTable;\n OK\n id int \n name string \n\nWhat's next\n-----------\n\n- [Import metadata](/dataproc-metastore/docs/import-metadata)\n- [Export metadata](/dataproc-metastore/docs/export-metadata)\n- [Use Spark SQL](/dataproc-metastore/docs/use-spark)"]]