MutationsBatcher(table, flush_count=100, max_row_bytes=20971520, flush_interval=1)
A MutationsBatcher is used in batch cases where the number of mutations is large or unknown. It will store DirectRow objects in memory until one of the size limits is reached, or an explicit call to flush() is performed. When a flush event occurs, the DirectRow objects in memory will be sent to Cloud Bigtable. Batching mutations is more efficient than sending individual requests.
This class is not suited for usage in systems where each mutation must be guaranteed to be sent, since calling mutate() may only result in an in-memory change. In the case of a system crash, any DirectRow objects remaining in memory will not necessarily be sent to the service, even after the completion of the mutate() method.
Note on thread safety: the same MutationsBatcher cannot be shared by multiple end-user threads.
Parameters

Name | Description
---|---
`table` | `Table` instance to which the batched mutations are applied.
`flush_count` | `int` (Optional) Max number of rows to buffer. When this many rows are buffered, the current row batch is mutated. Default is FLUSH_COUNT (100 rows).
`max_row_bytes` | `int` (Optional) Max total size, in bytes, of buffered row mutations. When this size is reached, the current row batch is mutated. Default is MAX_ROW_BYTES (20 MB).
`flush_interval` | `float` (Optional) The interval (in seconds) between asynchronous flushes. Default is 1 second.
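For illustration, a minimal construction sketch; the project, instance, and table names below are placeholders, not part of this API:

```python
from google.cloud import bigtable
from google.cloud.bigtable.batcher import MutationsBatcher

# Placeholder project/instance/table names; substitute your own.
client = bigtable.Client(project="my-project")
instance = client.instance("my-instance")
table = instance.table("my-table")

# Flush after 100 buffered rows, 20 MB of buffered mutations,
# or 1 second, whichever comes first (the defaults shown above).
batcher = MutationsBatcher(
    table,
    flush_count=100,
    max_row_bytes=20 * 1024 * 1024,
    flush_interval=1,
)
```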
Methods
__enter__
__enter__()
Start the MutationsBatcher as a context manager.
__exit__
__exit__(exc_type, exc_value, exc_traceback)
Clean up resources. Flush remaining rows and shut down the ThreadPoolExecutor.
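A sketch of context-manager usage, reusing the placeholder `table` from the construction sketch above; the row key and column family `cf1` are assumptions:

```python
with MutationsBatcher(table, flush_count=100) as batcher:
    row = table.direct_row(b"greeting#0")        # placeholder row key
    row.set_cell("cf1", b"greeting", b"hello")   # assumes a column family "cf1"
    batcher.mutate(row)
# On exit: remaining rows are flushed and the ThreadPoolExecutor is shut down.
```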
close
close()
Clean up resources. Flush remaining rows and shut down the ThreadPoolExecutor. Any errors will be raised.
Exceptions

Type | Description
---|---
`batcher.MutationsBatchError` | If there is any error in the mutations.
flush
flush()
Sends the current batch to Cloud Bigtable synchronously. For example:
.. literalinclude:: snippets_table.py :start-after: [START bigtable_api_batcher_flush] :end-before: [END bigtable_api_batcher_flush] :dedent: 4
Exceptions

Type | Description
---|---
`batcher.MutationsBatchError` | If there is any error in the mutations.
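A minimal sketch of an explicit, synchronous flush, assuming the placeholder `table` and column family `cf1` from above:

```python
batcher = table.mutations_batcher()

for i in range(3):
    row = table.direct_row("greeting#{}".format(i).encode())
    row.set_cell("cf1", b"greeting", "hello {}".format(i).encode())
    batcher.mutate(row)

# Send everything still buffered to Cloud Bigtable now, synchronously.
# Raises MutationsBatchError if any mutation failed.
batcher.flush()
```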
mutate
mutate(row)
Add a row to the batch. If the current batch meets one of the size limits, the batch is sent asynchronously.
For example:
.. literalinclude:: snippets_table.py :start-after: [START bigtable_api_batcher_mutate] :end-before: [END bigtable_api_batcher_mutate] :dedent: 4
Parameter

Name | Description
---|---
`row` | `DirectRow` The row to add to the batch.
Exceptions

Type | Description
---|---
`.table._BigtableRetryableError` | If any row returned a transient error.
`RuntimeError` | If the number of responses doesn't match the number of rows that were retried.
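A minimal sketch of buffering a single row with mutate(), again assuming the placeholder `table` and column family `cf1`:

```python
batcher = table.mutations_batcher(flush_count=100)

row = table.direct_row(b"user#1234")      # placeholder row key
row.set_cell("cf1", b"name", b"Ada")      # assumes a column family "cf1"
row.set_cell("cf1", b"city", b"London")
batcher.mutate(row)                       # may only buffer the row in memory

batcher.close()                           # flush remaining rows and release resources
```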
mutate_rows
mutate_rows(rows)
Add multiple rows to the batch. If the current batch meets one of the size limits, the batch is sent asynchronously.
For example:
.. literalinclude:: snippets_table.py :start-after: [START bigtable_api_batcher_mutate_rows] :end-before: [END bigtable_api_batcher_mutate_rows] :dedent: 4
Parameter

Name | Description
---|---
`rows` | list of `DirectRow` The rows to add to the batch.
Exceptions

Type | Description
---|---
`.table._BigtableRetryableError` | If any row returned a transient error.
`RuntimeError` | If the number of responses doesn't match the number of rows that were retried.
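A minimal sketch of batching many rows with mutate_rows(), using placeholder row keys and values:

```python
batcher = table.mutations_batcher(max_row_bytes=20 * 1024 * 1024)

rows = []
for i in range(500):
    row = table.direct_row("event#{:05d}".format(i).encode())
    row.set_cell("cf1", b"payload", b"x" * 1024)   # assumes a column family "cf1"
    rows.append(row)

batcher.mutate_rows(rows)  # batches are sent asynchronously as limits are reached
batcher.close()            # flush whatever is left and surface any errors
```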