Migrate to the Storage Transfer Service Cloud Client Library

To ensure high quality and consistency with our other Cloud libraries, the Storage Transfer Service documentation now uses the Cloud Client Libraries instead of the Google API Client Libraries. For more information about the two options, see Client libraries explained.

The Google API Client Library continues to receive updates, but it is no longer referenced in the documentation.

This guide describes the major differences when working with the Storage Transfer Service and provides instructions for updating your clients when you migrate to the Cloud Client Library.

Java

Updating dependencies

To switch to the new library, replace the dependency on google-api-services-storagetransfer with google-cloud-storage-transfer.

If you are using Maven without the BOM, add the following to your dependencies:

<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
    <version>0.2.3</version>
</dependency>

If you are using Gradle without the BOM, add the following to your dependencies:

implementation 'com.google.cloud:google-cloud-storage-transfer:0.2.3'

If you are using Maven with the BOM, add the following to your pom.xml:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>24.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
  </dependency>
</dependencies>

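If you are using Gradle with the BOM, the equivalent is a platform dependency. A minimal sketch, assuming the same libraries-bom version as above:

implementation platform('com.google.cloud:libraries-bom:24.1.0')
implementation 'com.google.cloud:google-cloud-storage-transfer'
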
In most cases, code converts from the API Client Library to the Cloud Client Library fairly directly. The following are some key differences between the two Java clients.

Client instantiation

The Cloud Client Library cuts down the boilerplate associated with client instantiation by handling it behind the scenes.

API Client Library

GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
  credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
    new Storagetransfer.Builder(
            Utils.getDefaultTransport(),
            Utils.getDefaultJsonFactory(),
            new HttpCredentialsAdapter(credential))
        .build();

Cloud Client Library

StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
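
The generated client holds background resources, so close it when you are done with it. A minimal sketch using try-with-resources, assuming no custom client settings:

// Sketch: StorageTransferServiceClient implements AutoCloseable, so
// try-with-resources releases its background resources automatically.
try (StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create()) {
  // ... issue requests with storageTransfer ...
}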

Builders for model classes

Model classes in the Cloud Client Library are constructed with builders instead of constructors.

API Client Library

TransferJob transferJob =
    new TransferJob()
        .setStatus("ENABLED");

Cloud Client Library

TransferJob transferJob =
    TransferJob.newBuilder()
        .setStatus(Status.ENABLED)
        .build();
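
Because the generated model classes are immutable protobuf messages, modifying an existing object also goes through a builder. This is the standard protobuf pattern rather than anything specific to this API; a short sketch:

// Sketch: derive a modified copy of an existing message via toBuilder().
TransferJob disabledJob =
    transferJob.toBuilder()
        .setStatus(Status.DISABLED)
        .build();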

List operations return iterables

List operations in the Cloud Client Library return simple iterables instead of the paginated results of the API Client Library.

API Client Library

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.ListTransferJobsResponse;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import java.io.IOException;
import java.security.GeneralSecurityException;

public class StoragetransferExample {
  public static void main(String[] args) throws IOException, GeneralSecurityException {
    Storagetransfer storagetransferService = createStoragetransferService();
    Storagetransfer.TransferJobs.List request = storagetransferService.transferJobs().list();

    ListTransferJobsResponse response;
    do {
      response = request.execute();
      if (response.getTransferJobs() != null) {
        for (TransferJob transferJob : response.getTransferJobs()) {
          System.out.println(transferJob);
        }
      }
      request.setPageToken(response.getNextPageToken());
    } while (response.getNextPageToken() != null);
  }

  public static Storagetransfer createStoragetransferService()
      throws IOException, GeneralSecurityException {
    HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
    JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();

    GoogleCredential credential = GoogleCredential.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }

    return new Storagetransfer.Builder(httpTransport, jsonFactory, credential)
        .build();
  }
}

Cloud Client Library

import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.ListTransferJobsRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;

public class StoragetransferExample {
  public static void main(String[] args) throws Exception {
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
    ListTransferJobsRequest request = ListTransferJobsRequest.newBuilder().build();
    for (TransferJob job : storageTransfer.listTransferJobs(request).iterateAll()) {
      System.out.println(job);
    }
  }
}
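
iterateAll() fetches additional pages lazily as you iterate. If you need page-level access, the paged response also exposes pages; a sketch, assuming the standard GAPIC paging surface and Java 10+ for var:

// Sketch: iterate page by page instead of element by element.
for (var page : storageTransfer.listTransferJobs(request).iteratePages()) {
  for (TransferJob job : page.getValues()) {
    System.out.println(job);
  }
}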

Comparing samples

The following shows the old API Client Library samples alongside their equivalents written with the Cloud Client Library. If you have used these samples before, this comparison shows how to move your code to the new Cloud Client Library.

Transfer from Amazon S3

API Client Library


import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAwsApiary {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        new TransferSpec()
            .setAwsS3DataSource(
                new AwsS3Data()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        new AwsAccessKey()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket));

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay startTime =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));
    Schedule schedule =
        new Schedule()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime);

    // Set up the transfer job
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toPrettyString());
  }
}

Cloud Client Library


import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsAccessKey;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsS3Data;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAws {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        TransferSpec.newBuilder()
            .setAwsS3DataSource(
                AwsS3Data.newBuilder()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        AwsAccessKey.newBuilder()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsSinkBucket))
            .build();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay startTime =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();
    Schedule schedule =
        Schedule.newBuilder()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime)
            .build();

    // Set up the transfer job
    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toString());
  }
}
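
A hypothetical invocation of the sample above; the project, buckets, and date are placeholders, mirroring the commented-out values in the sample:

import java.text.SimpleDateFormat;

public class TransferFromAwsRunner {
  public static void main(String[] args) throws Exception {
    // Placeholder values; replace with your own project and buckets.
    long startDateTime =
        new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();
    TransferFromAws.transferFromAws(
        "my-project-id",
        "Sample transfer job from S3 to GCS.",
        "my-s3-source-bucket",
        "my-gcs-sink-bucket",
        startDateTime);
  }
}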

Transfer to Nearline

API Client Library

import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearlineApiary {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearlineApiary(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay time =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));

    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        new ObjectConditions()
                            .setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
                    .setTransferOptions(
                        new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toPrettyString());
  }
}

Cloud Client Library

import com.google.protobuf.Duration;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.ObjectConditions;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOptions;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearline {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearline(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay time =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();

    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                TransferSpec.newBuilder()
                    .setGcsDataSource(GcsData.newBuilder().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        ObjectConditions.newBuilder()
                            .setMinTimeElapsedSinceLastModification(
                                Duration.newBuilder().setSeconds(2592000 /* 30 days */)))
                    .setTransferOptions(
                        TransferOptions.newBuilder().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(Schedule.newBuilder().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toString());
  }
}

Check the latest transfer operation

API Client Library


import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Operation;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;

public class CheckLatestTransferOperationApiary {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperationApiary(String projectId, String jobName)
      throws IOException {
    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "myJob/1234567890";

    // Create Storage Transfer client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Get transfer job and check latest operation
    TransferJob transferJob = storageTransfer.transferJobs().get(jobName, projectId).execute();
    String latestOperationName = transferJob.getLatestOperationName();

    if (latestOperationName != null) {
      Operation latestOperation =
          storageTransfer.transferOperations().get(latestOperationName).execute();
      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toPrettyString());

    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " does not have an operation scheduled yet,"
              + " try again once the job starts running.");
    }
  }
}

Cloud Client Library


import com.google.longrunning.Operation;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.GetTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOperation;
import java.io.IOException;

public class CheckLatestTransferOperation {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperation(String projectId, String jobName)
      throws IOException {
    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "myJob/1234567890";

    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Get transfer job and check latest operation
    TransferJob transferJob =
        storageTransfer.getTransferJob(
            GetTransferJobRequest.newBuilder().setJobName(jobName).setProjectId(projectId).build());
    String latestOperationName = transferJob.getLatestOperationName();

    if (!latestOperationName.isEmpty()) {
      Operation operation = storageTransfer.getOperationsClient().getOperation(latestOperationName);
      TransferOperation latestOperation =
          TransferOperation.parseFrom(operation.getMetadata().getValue());

      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toString());

    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " hasn't run yet,"
              + " try again once the job starts running.");
    }
  }
}

Python

Updating dependencies

To use the new library, add a dependency on google-cloud-storage-transfer. It is used in place of the discovery client from google-api-python-client.

pip install --upgrade google-cloud-storage-transfer

Client instantiation

Use the storage_transfer module instead of googleapiclient.discovery.

API Client Library

"""A sample for creating a Storage Transfer Service client."""

import googleapiclient.discovery


def create_transfer_client():
    return googleapiclient.discovery.build("storagetransfer", "v1")

Cloud Client Library

"""A sample for creating a Storage Transfer Service client."""

from google.cloud import storage_transfer


def create_transfer_client():
    return storage_transfer.StorageTransferServiceClient()
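
The pagination difference described in the Java section applies in Python as well: list calls return an auto-paging iterable instead of explicit page tokens. A minimal sketch, assuming list_transfer_jobs with its required JSON filter (the project ID is a placeholder):

from google.cloud import storage_transfer


def list_jobs(project_id: str):
    """Prints every transfer job in a project, fetching pages as needed."""
    client = storage_transfer.StorageTransferServiceClient()

    # The filter is a JSON string and must include the project ID.
    jobs = client.list_transfer_jobs({"filter": f'{{"projectId": "{project_id}"}}'})
    for job in jobs:
        print(job.name)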

Comparing samples

To illustrate the differences between the two libraries, the old API client samples are shown here alongside their Cloud Client Library equivalents.

Transfer from Amazon S3

API Client Library

import json

import googleapiclient.discovery


def main(
    description,
    project_id,
    start_date,
    start_time,
    source_bucket,
    access_key_id,
    secret_access_key,
    sink_bucket,
):
    """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "scheduleEndDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "awsS3DataSource": {
                "bucketName": source_bucket,
                "awsAccessKey": {
                    "accessKeyId": access_key_id,
                    "secretAccessKey": secret_access_key,
                },
            },
            "gcsDataSink": {"bucketName": sink_bucket},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))

Cloud Client Library

from datetime import datetime

from google.cloud import storage_transfer


def create_one_time_aws_transfer(
    project_id: str,
    description: str,
    source_bucket: str,
    aws_access_key_id: str,
    aws_secret_access_key: str,
    sink_bucket: str,
):
    """Creates a one-time transfer job from Amazon S3 to Google Cloud
    Storage."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # AWS S3 source bucket name
    # source_bucket = 'my-s3-source-bucket'

    # AWS Access Key ID
    # aws_access_key_id = 'AKIA...'

    # AWS Secret Access Key
    # aws_secret_access_key = 'HEAoMK2.../...ku8'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    now = datetime.utcnow()
    # Setting the start date and the end date as
    # the same time creates a one-time transfer
    one_time_schedule = {"day": now.day, "month": now.month, "year": now.year}

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": one_time_schedule,
                    "schedule_end_date": one_time_schedule,
                },
                "transfer_spec": {
                    "aws_s3_data_source": {
                        "bucket_name": source_bucket,
                        "aws_access_key": {
                            "access_key_id": aws_access_key_id,
                            "secret_access_key": aws_secret_access_key,
                        },
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

Transfer to Nearline

API Client Library

import json

import googleapiclient.discovery


def main(description, project_id, start_date, start_time, source_bucket, sink_bucket):
    """Create a daily transfer from Standard to Nearline Storage class."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "gcsDataSource": {"bucketName": source_bucket},
            "gcsDataSink": {"bucketName": sink_bucket},
            "objectConditions": {
                "minTimeElapsedSinceLastModification": "2592000s"  # 30 days
            },
            "transferOptions": {"deleteObjectsFromSourceAfterTransfer": "true"},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))

Cloud Client Library

Note the import of google.protobuf.duration_pb2.Duration.

from datetime import datetime

from google.cloud import storage_transfer
from google.protobuf.duration_pb2 import Duration


def create_daily_nearline_30_day_migration(
    project_id: str,
    description: str,
    source_bucket: str,
    sink_bucket: str,
    start_date: datetime,
):
    """Create a daily migration from a GCS bucket to a Nearline GCS bucket
    for objects untouched for 30 days."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # Google Cloud Storage source bucket name
    # source_bucket = 'my-gcs-source-bucket'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": {
                        "day": start_date.day,
                        "month": start_date.month,
                        "year": start_date.year,
                    }
                },
                "transfer_spec": {
                    "gcs_data_source": {
                        "bucket_name": source_bucket,
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                    "object_conditions": {
                        "min_time_elapsed_since_last_modification": Duration(
                            seconds=2592000  # 30 days
                        )
                    },
                    "transfer_options": {
                        "delete_objects_from_source_after_transfer": True
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

Check the latest transfer operation

API Client Library


"""Command-line sample that checks the latest operation of a transfer.
This sample is used on this page:
    https://cloud.google.com/storage/transfer/create-transfer
For more information, see README.md.
"""

import argparse
import json

import googleapiclient.discovery


def check_latest_transfer_operation(project_id, job_name):
    """Check the latest transfer operation associated with a transfer job."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    transferJob = (
        storagetransfer.transferJobs()
        .get(projectId=project_id, jobName=job_name)
        .execute()
    )
    latestOperationName = transferJob.get("latestOperationName")

    if latestOperationName:
        result = (
            storagetransfer.transferOperations().get(name=latestOperationName).execute()
        )
        print(
            "The latest operation for job"
            + job_name
            + " is: {}".format(json.dumps(result, indent=4, sort_keys=True))
        )

    else:
        print(
            "Transfer job "
            + job_name
            + " does not have an operation scheduled yet, "
            + "try again once the job starts running."
        )


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter
    )
    parser.add_argument("project_id", help="Your Google Cloud project ID.")
    parser.add_argument("job_name", help="Your job name.")

    args = parser.parse_args()

    check_latest_transfer_operation(args.project_id, args.job_name)

Cloud Client Library

Note the use of storage_transfer.TransferOperation.deserialize.

from google.cloud import storage_transfer


def check_latest_transfer_operation(project_id: str, job_name: str):
    """Checks the latest transfer operation for a given transfer job."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # Storage Transfer Service job name
    # job_name = 'transferJobs/1234567890'

    transfer_job = client.get_transfer_job(
        {
            "project_id": project_id,
            "job_name": job_name,
        }
    )

    if transfer_job.latest_operation_name:
        response = client.transport.operations_client.get_operation(
            transfer_job.latest_operation_name
        )
        operation = storage_transfer.TransferOperation.deserialize(
            response.metadata.value
        )

        print(f"Latest transfer operation for `{job_name}`: {operation}")
    else:
        print(f"Transfer job {job_name} has not ran yet.")