Migrating Storage Transfer Service to the Cloud client library

To ensure high quality and consistency with the other Cloud libraries, the Storage Transfer Service documentation now uses the Cloud Client Libraries instead of the Google API Client Libraries. For more information about the two options, see Client libraries explained.

The Google API Client Library continues to receive updates, but it is no longer referenced in the documentation.

This guide describes the main differences in working with Storage Transfer Service and provides instructions for updating your clients when migrating to the Cloud client library.

Java

Updating dependencies

To switch to the new library, replace the dependency on google-api-services-storagetransfer with google-cloud-storage-transfer. If you are using Maven without the BOM, add the following to your dependencies:

<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
    <version>0.2.3</version>
</dependency>

If you are using Maven with the BOM, add the following to your pom.xml:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>24.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage-transfer</artifactId>
  </dependency>
</dependencies>

If you are using Gradle without the BOM, add the following to your dependencies:

implementation 'com.google.cloud:google-cloud-storage-transfer:0.2.3'

For the most part, code can be converted from the API client library to the Cloud client library fairly easily. Below are some key differences between the two Java clients.

Client instantiation

The Cloud client library removes much of the boilerplate associated with client initialization by handling it behind the scenes.

API client library

GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
if (credential.createScopedRequired()) {
  credential = credential.createScoped(StoragetransferScopes.all());
}
Storagetransfer storageTransfer =
    new Storagetransfer.Builder(
            Utils.getDefaultTransport(),
            Utils.getDefaultJsonFactory(),
            new HttpCredentialsAdapter(credential))
        .build();

Cloud client library

StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
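The Cloud client keeps background gRPC resources open, so release it when you no longer need it. A minimal sketch, assuming the standard AutoCloseable behavior of generated Cloud clients:

// Generated Cloud clients implement AutoCloseable; try-with-resources
// shuts down the underlying channels when the block exits.
try (StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create()) {
  // Use storageTransfer here.
}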

Builders for model classes

Model classes in the Cloud client library use builders rather than constructors.

API client library

TransferJob transferJob =
   new TransferJob()
       .setStatus("ENABLED");

Cloud client library

TransferJob transferJob =
    TransferJob.newBuilder()
        .setStatus(Status.ENABLED)
        .build();
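Because the generated model classes are protocol buffer messages, instances are immutable once built. To change a field on an existing instance, convert it back into a builder; a short sketch:

// Proto messages are immutable; toBuilder() copies the message into a new
// builder so that individual fields can be modified.
TransferJob updatedJob =
    transferJob.toBuilder()
        .setStatus(Status.DISABLED)
        .build();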

List operations return iterables

List operations in the Cloud client library return simple iterables instead of the paginated results of the API client library.

API client library

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.ListTransferJobsResponse;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import java.io.IOException;
import java.security.GeneralSecurityException;

public class StoragetransferExample {
  public static void main(String[] args) throws IOException, GeneralSecurityException {
    Storagetransfer storagetransferService = createStoragetransferService();
    Storagetransfer.TransferJobs.List request = storagetransferService.transferJobs().list();

    ListTransferJobsResponse response;
    do {
      response = request.execute();
      if (response.getTransferJobs() == null) {
        continue;
      }
      for (TransferJob transferJob : response.getTransferJobs()) {
        System.out.println(transferJob);
      }
      request.setPageToken(response.getNextPageToken());
    } while (response.getNextPageToken() != null);
  }

  public static Storagetransfer createStoragetransferService()
      throws IOException, GeneralSecurityException {
    HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
    JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();

    GoogleCredential credential = GoogleCredential.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }

    return new Storagetransfer.Builder(httpTransport, jsonFactory, credential)
        .build();
  }
}

Cloud client library

import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.ListTransferJobsRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;

public class StoragetransferExample {
  public static void main(String[] args) throws Exception {
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();
    ListTransferJobsRequest request = ListTransferJobsRequest.newBuilder().build();
    for (TransferJob job : storageTransfer.listTransferJobs(request).iterateAll()) {
      System.out.println(job);
    }
  }
}
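If you still need page-level access, for example to process results one page at a time, the paged response also exposes the pages themselves. A sketch, assuming the usual paging surface of generated Java Cloud clients:

// iterateAll() hides pagination entirely; iteratePages() instead yields
// one page of results at a time.
for (StorageTransferServiceClient.ListTransferJobsPage page :
    storageTransfer.listTransferJobs(request).iteratePages()) {
  for (TransferJob job : page.getValues()) {
    System.out.println(job);
  }
}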

Sample comparisons

The following sections show samples from the old API client library alongside their equivalents using the Cloud client library. If you have used these samples before, this comparison can help you understand how to move your code to the new Cloud client library.

Transfer from Amazon S3

API client library


import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.AwsAccessKey;
import com.google.api.services.storagetransfer.v1.model.AwsS3Data;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAwsApiary {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        new TransferSpec()
            .setAwsS3DataSource(
                new AwsS3Data()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        new AwsAccessKey()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket));

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay startTime =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));
    Schedule schedule =
        new Schedule()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime);

    // Set up the transfer job
    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toPrettyString());
  }
}

Cloud client library


import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsAccessKey;
import com.google.storagetransfer.v1.proto.TransferTypes.AwsS3Data;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferFromAws {

  // Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  public static void transferFromAws(
      String projectId,
      String jobDescription,
      String awsSourceBucket,
      String gcsSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job from S3 to GCS.";

    // The name of the source AWS bucket to transfer data from
    // String awsSourceBucket = "yourAwsSourceBucket";

    // The name of the GCS bucket to transfer data to
    // String gcsSinkBucket = "your-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // The ID used to access your AWS account. Should be accessed via environment variable.
    String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

    // The Secret Key used to access your AWS account. Should be accessed via environment variable.
    String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

    // Set up source and sink
    TransferSpec transferSpec =
        TransferSpec.newBuilder()
            .setAwsS3DataSource(
                AwsS3Data.newBuilder()
                    .setBucketName(awsSourceBucket)
                    .setAwsAccessKey(
                        AwsAccessKey.newBuilder()
                            .setAccessKeyId(awsAccessKeyId)
                            .setSecretAccessKey(awsSecretAccessKey)))
            .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsSinkBucket))
            .build();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date startDate =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay startTime =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();
    Schedule schedule =
        Schedule.newBuilder()
            .setScheduleStartDate(startDate)
            .setScheduleEndDate(startDate)
            .setStartTimeOfDay(startTime)
            .build();

    // Set up the transfer job
    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(transferSpec)
            .setSchedule(schedule)
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from AWS to GCS:");
    System.out.println(response.toString());
  }
}
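A hypothetical invocation of the sample above; all argument values are placeholders:

// Start a one-off S3-to-GCS transfer as soon as possible. The project ID
// and bucket names below are placeholders, not real resources.
TransferFromAws.transferFromAws(
    "my-project-id",
    "Sample transfer job from S3 to GCS.",
    "my-s3-bucket",
    "my-gcs-bucket",
    System.currentTimeMillis());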

Transfer to Nearline

API client library

import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.ObjectConditions;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TimeOfDay;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferOptions;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearlineApiary {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearlineApiary(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        new Date()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH));
    TimeOfDay time =
        new TimeOfDay()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND));

    TransferJob transferJob =
        new TransferJob()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                new TransferSpec()
                    .setGcsDataSource(new GcsData().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(new GcsData().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        new ObjectConditions()
                            .setMinTimeElapsedSinceLastModification("2592000s" /* 30 days */))
                    .setTransferOptions(
                        new TransferOptions().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(new Schedule().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus("ENABLED");

    // Create a Transfer Service client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Create the transfer job
    TransferJob response = storageTransfer.transferJobs().create(transferJob).execute();

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toPrettyString());
  }
}

Cloud client library

Note the use of com.google.protobuf.Duration in setMinTimeElapsedSinceLastModification(), in place of the string-encoded duration ("2592000s") used by the API client library.

import com.google.protobuf.Duration;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.CreateTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.GcsData;
import com.google.storagetransfer.v1.proto.TransferTypes.ObjectConditions;
import com.google.storagetransfer.v1.proto.TransferTypes.Schedule;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob.Status;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOptions;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferSpec;
import com.google.type.Date;
import com.google.type.TimeOfDay;
import java.io.IOException;
import java.util.Calendar;

public class TransferToNearline {
  /**
   * Creates a one-off transfer job that transfers objects in a standard GCS bucket that are more
   * than 30 days old to a Nearline GCS bucket.
   */
  public static void transferToNearline(
      String projectId,
      String jobDescription,
      String gcsSourceBucket,
      String gcsNearlineSinkBucket,
      long startDateTime)
      throws IOException {

    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // A short description of this job
    // String jobDescription = "Sample transfer job of old objects to a Nearline GCS bucket.";

    // The name of the source GCS bucket to transfer data from
    // String gcsSourceBucket = "your-gcs-source-bucket";

    // The name of the Nearline GCS bucket to transfer old objects to
    // String gcsSinkBucket = "your-nearline-gcs-bucket";

    // What day and time in UTC to start the transfer, expressed as an epoch date timestamp.
    // If this is in the past relative to when the job is created, it will run the next day.
    // long startDateTime =
    //     new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2000-01-01 00:00:00").getTime();

    // Parse epoch timestamp into the model classes
    Calendar startCalendar = Calendar.getInstance();
    startCalendar.setTimeInMillis(startDateTime);
    // Note that this is a Date from the model class package, not a java.util.Date
    Date date =
        Date.newBuilder()
            .setYear(startCalendar.get(Calendar.YEAR))
            .setMonth(startCalendar.get(Calendar.MONTH) + 1)
            .setDay(startCalendar.get(Calendar.DAY_OF_MONTH))
            .build();
    TimeOfDay time =
        TimeOfDay.newBuilder()
            .setHours(startCalendar.get(Calendar.HOUR_OF_DAY))
            .setMinutes(startCalendar.get(Calendar.MINUTE))
            .setSeconds(startCalendar.get(Calendar.SECOND))
            .build();

    TransferJob transferJob =
        TransferJob.newBuilder()
            .setDescription(jobDescription)
            .setProjectId(projectId)
            .setTransferSpec(
                TransferSpec.newBuilder()
                    .setGcsDataSource(GcsData.newBuilder().setBucketName(gcsSourceBucket))
                    .setGcsDataSink(GcsData.newBuilder().setBucketName(gcsNearlineSinkBucket))
                    .setObjectConditions(
                        ObjectConditions.newBuilder()
                            .setMinTimeElapsedSinceLastModification(
                                Duration.newBuilder().setSeconds(2592000 /* 30 days */)))
                    .setTransferOptions(
                        TransferOptions.newBuilder().setDeleteObjectsFromSourceAfterTransfer(true)))
            .setSchedule(Schedule.newBuilder().setScheduleStartDate(date).setStartTimeOfDay(time))
            .setStatus(Status.ENABLED)
            .build();

    // Create a Transfer Service client
    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Create the transfer job
    TransferJob response =
        storageTransfer.createTransferJob(
            CreateTransferJobRequest.newBuilder().setTransferJob(transferJob).build());

    System.out.println("Created transfer job from standard bucket to Nearline bucket:");
    System.out.println(response.toString());
  }
}

Check the latest transfer operation

API client library


import com.google.api.client.googleapis.util.Utils;
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.StoragetransferScopes;
import com.google.api.services.storagetransfer.v1.model.Operation;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;

public class CheckLatestTransferOperationApiary {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperationApiary(String projectId, String jobName)
      throws IOException {
    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "transferJobs/1234567890";

    // Create Storage Transfer client
    GoogleCredentials credential = GoogleCredentials.getApplicationDefault();
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(StoragetransferScopes.all());
    }
    Storagetransfer storageTransfer =
        new Storagetransfer.Builder(
                Utils.getDefaultTransport(),
                Utils.getDefaultJsonFactory(),
                new HttpCredentialsAdapter(credential))
            .build();

    // Get transfer job and check latest operation
    TransferJob transferJob = storageTransfer.transferJobs().get(jobName, projectId).execute();
    String latestOperationName = transferJob.getLatestOperationName();

    if (latestOperationName != null) {
      Operation latestOperation =
          storageTransfer.transferOperations().get(latestOperationName).execute();
      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toPrettyString());

    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " does not have an operation scheduled yet,"
              + " try again once the job starts running.");
    }
  }
}

Cloud client library


import com.google.longrunning.Operation;
import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto.GetTransferJobRequest;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferJob;
import com.google.storagetransfer.v1.proto.TransferTypes.TransferOperation;
import java.io.IOException;

public class CheckLatestTransferOperation {

  // Gets the requested transfer job and checks its latest operation
  public static void checkLatestTransferOperation(String projectId, String jobName)
      throws IOException {
    // Your Google Cloud Project ID
    // String projectId = "your-project-id";

    // The name of the job to check
    // String jobName = "transferJobs/1234567890";

    StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create();

    // Get transfer job and check latest operation
    TransferJob transferJob =
        storageTransfer.getTransferJob(
            GetTransferJobRequest.newBuilder().setJobName(jobName).setProjectId(projectId).build());
    String latestOperationName = transferJob.getLatestOperationName();

    if (!latestOperationName.isEmpty()) {
      Operation operation = storageTransfer.getOperationsClient().getOperation(latestOperationName);
      TransferOperation latestOperation =
          TransferOperation.parseFrom(operation.getMetadata().getValue());

      System.out.println("The latest operation for transfer job " + jobName + " is:");
      System.out.println(latestOperation.toString());

    } else {
      System.out.println(
          "Transfer job "
              + jobName
              + " hasn't run yet,"
              + " try again once the job starts running.");
    }
  }
}
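The operation metadata here is a protobuf Any, which the sample decodes by parsing its raw value. An equivalent sketch, assuming standard com.google.protobuf.Any semantics, that also verifies the payload type:

// Any.unpack() checks the type URL before parsing and throws
// InvalidProtocolBufferException if the metadata is not a TransferOperation.
TransferOperation latestOperation =
    operation.getMetadata().unpack(TransferOperation.class);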

Python

Updating dependencies

To use the new library, add a dependency on google-cloud-storage-transfer. It is used in place of the discovery-based client from google-api-python-client.

pip install --upgrade google-cloud-storage-transfer

Client instantiation

Use the storage_transfer module instead of googleapiclient.discovery.

API client library

"""A sample for creating a Storage Transfer Service client."""

import googleapiclient.discovery


def create_transfer_client():
    return googleapiclient.discovery.build("storagetransfer", "v1")

Cloud client library

"""A sample for creating a Storage Transfer Service client."""

from google.cloud import storage_transfer


def create_transfer_client():
    return storage_transfer.StorageTransferServiceClient()

Sample comparisons

To illustrate the differences between the two libraries, the following sections show the old API client samples alongside their equivalents in the Cloud client library.

Transfer from Amazon S3

API client library

import json

import googleapiclient.discovery


def main(
    description,
    project_id,
    start_date,
    start_time,
    source_bucket,
    access_key_id,
    secret_access_key,
    sink_bucket,
):
    """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "scheduleEndDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "awsS3DataSource": {
                "bucketName": source_bucket,
                "awsAccessKey": {
                    "accessKeyId": access_key_id,
                    "secretAccessKey": secret_access_key,
                },
            },
            "gcsDataSink": {"bucketName": sink_bucket},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))

Cloud client library

from datetime import datetime

from google.cloud import storage_transfer


def create_one_time_aws_transfer(
    project_id: str,
    description: str,
    source_bucket: str,
    aws_access_key_id: str,
    aws_secret_access_key: str,
    sink_bucket: str,
):
    """Creates a one-time transfer job from Amazon S3 to Google Cloud
    Storage."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # AWS S3 source bucket name
    # source_bucket = 'my-s3-source-bucket'

    # AWS Access Key ID
    # aws_access_key_id = 'AKIA...'

    # AWS Secret Access Key
    # aws_secret_access_key = 'HEAoMK2.../...ku8'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    now = datetime.utcnow()
    # Setting the start date and the end date as
    # the same time creates a one-time transfer
    one_time_schedule = {"day": now.day, "month": now.month, "year": now.year}

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": one_time_schedule,
                    "schedule_end_date": one_time_schedule,
                },
                "transfer_spec": {
                    "aws_s3_data_source": {
                        "bucket_name": source_bucket,
                        "aws_access_key": {
                            "access_key_id": aws_access_key_id,
                            "secret_access_key": aws_secret_access_key,
                        },
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

Transfer to Nearline

API client library

import json

import googleapiclient.discovery


def main(description, project_id, start_date, start_time, source_bucket, sink_bucket):
    """Create a daily transfer from Standard to Nearline Storage class."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    # Edit this template with desired parameters.
    transfer_job = {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            "scheduleStartDate": {
                "day": start_date.day,
                "month": start_date.month,
                "year": start_date.year,
            },
            "startTimeOfDay": {
                "hours": start_time.hour,
                "minutes": start_time.minute,
                "seconds": start_time.second,
            },
        },
        "transferSpec": {
            "gcsDataSource": {"bucketName": source_bucket},
            "gcsDataSink": {"bucketName": sink_bucket},
            "objectConditions": {
                "minTimeElapsedSinceLastModification": "2592000s"  # 30 days
            },
            "transferOptions": {"deleteObjectsFromSourceAfterTransfer": "true"},
        },
    }

    result = storagetransfer.transferJobs().create(body=transfer_job).execute()
    print("Returned transferJob: {}".format(json.dumps(result, indent=4)))

Cloud client library

Note the import of google.protobuf.duration_pb2.Duration.

from datetime import datetime

from google.cloud import storage_transfer
from google.protobuf.duration_pb2 import Duration


def create_daily_nearline_30_day_migration(
    project_id: str,
    description: str,
    source_bucket: str,
    sink_bucket: str,
    start_date: datetime,
):
    """Create a daily migration from a GCS bucket to a Nearline GCS bucket
    for objects untouched for 30 days."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A useful description for your transfer job
    # description = 'My transfer job'

    # Google Cloud Storage source bucket name
    # source_bucket = 'my-gcs-source-bucket'

    # Google Cloud Storage destination bucket name
    # sink_bucket = 'my-gcs-destination-bucket'

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": {
                        "day": start_date.day,
                        "month": start_date.month,
                        "year": start_date.year,
                    }
                },
                "transfer_spec": {
                    "gcs_data_source": {
                        "bucket_name": source_bucket,
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                    "object_conditions": {
                        "min_time_elapsed_since_last_modification": Duration(
                            seconds=2592000  # 30 days
                        )
                    },
                    "transfer_options": {
                        "delete_objects_from_source_after_transfer": True
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

Check the latest transfer operation

API client library


"""Command-line sample that checks the latest operation of a transfer.
This sample is used on this page:
    https://cloud.google.com/storage/transfer/create-transfer
For more information, see README.md.
"""

import argparse
import json

import googleapiclient.discovery


def check_latest_transfer_operation(project_id, job_name):
    """Check the latest transfer operation associated with a transfer job."""
    storagetransfer = googleapiclient.discovery.build("storagetransfer", "v1")

    transferJob = (
        storagetransfer.transferJobs()
        .get(projectId=project_id, jobName=job_name)
        .execute()
    )
    latestOperationName = transferJob.get("latestOperationName")

    if latestOperationName:
        result = (
            storagetransfer.transferOperations().get(name=latestOperationName).execute()
        )
        print(
            "The latest operation for job"
            + job_name
            + " is: {}".format(json.dumps(result, indent=4, sort_keys=True))
        )

    else:
        print(
            "Transfer job "
            + job_name
            + " does not have an operation scheduled yet, "
            + "try again once the job starts running."
        )


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description=__doc__, formatter_class=argparse.RawDescriptionHelpFormatter
    )
    parser.add_argument("project_id", help="Your Google Cloud project ID.")
    parser.add_argument("job_name", help="Your job name.")

    args = parser.parse_args()

    check_latest_transfer_operation(args.project_id, args.job_name)

Cloud client library

Note the use of storage_transfer.TransferOperation.deserialize.

from google.cloud import storage_transfer


def check_latest_transfer_operation(project_id: str, job_name: str):
    """Checks the latest transfer operation for a given transfer job."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # Storage Transfer Service job name
    # job_name = 'transferJobs/1234567890'

    transfer_job = client.get_transfer_job(
        {
            "project_id": project_id,
            "job_name": job_name,
        }
    )

    if transfer_job.latest_operation_name:
        response = client.transport.operations_client.get_operation(
            transfer_job.latest_operation_name
        )
        operation = storage_transfer.TransferOperation.deserialize(
            response.metadata.value
        )

        print(f"Latest transfer operation for `{job_name}`: {operation}")
    else:
        print(f"Transfer job {job_name} has not ran yet.")