Usage Guide
Writing log entries
To write log entries, first create a Logger, passing the “log name” with which to associate the entries:
logger = client.logger(LOG_NAME)
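The snippets in this guide assume an existing client. A minimal sketch of constructing one (the project ID below is a placeholder; omitting it lets the client infer it from the environment):
import google.cloud.logging
client = google.cloud.logging.Client(project="my-project") # placeholder project ID; optional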
Write a simple text entry to the logger.
logger.log_text("A simple entry") # API call
Write a dictionary entry to the logger.
logger.log_struct(
    {"message": "My second entry", "weather": "partly cloudy"}
) # API call
Write a structured entry with an associated resource to the logger.
Supported resource types are listed at Monitored Resource Types.
from google.cloud.logging.resource import Resource
res = Resource(
    type="generic_node",
    labels={
        "location": "us-central1-a",
        "namespace": "default",
        "node_id": "10.10.10.1",
    },
)
logger.log_struct(
    {"message": "My first entry", "weather": "partly cloudy"}, resource=res
)
Retrieving log entries
Fetch entries for the default project.
for entry in client.list_entries(): # API call(s)
    do_something_with(entry)
Entries returned by Client.list_entries or Logger.list_entries will be instances of one of the following classes:
TextEntry
StructEntry
ProtobufEntry
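For example, a minimal sketch of dispatching on the entry type; it assumes the entry classes can be imported from google.cloud.logging.entries and reuses the example payloads written above:
from google.cloud.logging.entries import ProtobufEntry, StructEntry, TextEntry
for entry in client.list_entries(): # API call(s)
    if isinstance(entry, TextEntry):
        print(entry.payload) # plain string payload
    elif isinstance(entry, StructEntry):
        print(entry.payload.get("message")) # dict payload
    elif isinstance(entry, ProtobufEntry):
        print(entry.payload) # protobuf message payload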
Fetch entries across multiple projects.
resource_names = ["projects/one-project", "projects/another-project"]
for entry in client.list_entries(resource_names=resource_names): # API call(s)
    do_something_with(entry)
Filter entries retrieved using the Advanced Logs Filters syntax.
Fetch entries for the default project.
FILTER = "logName:log_name AND textPayload:simple"
for entry in client.list_entries(filter_=FILTER): # API call(s)
    do_something_with(entry)
Sort entries in descending timestamp order.
from google.cloud.logging import DESCENDING
for entry in client.list_entries(order_by=DESCENDING): # API call(s)
    do_something_with(entry)
Retrieve entries in batches of 10, iterating until done.
iterator = client.list_entries(page_size=10)
pages = iterator.pages
page1 = next(pages) # API call
for entry in page1:
    do_something_with(entry)
page2 = next(pages) # API call
for entry in page2:
    do_something_with(entry)
Retrieve entries for a single logger, sorting in descending timestamp order:
from google.cloud.logging import DESCENDING
for entry in logger.list_entries(order_by=DESCENDING): # API call(s)
    do_something_with(entry)
Delete all entries for a logger
logger.delete() # API call
Manage log metrics
Metrics are counters of entries which match a given filter. They can be used within Cloud Monitoring to create charts and alerts.
List all metrics for a project:
for metric in client.list_metrics(): # API call(s)
    do_something_with(metric)
Create a metric:
metric = client.metric(METRIC_NAME, filter_=FILTER, description=DESCRIPTION)
assert not metric.exists() # API call
metric.create() # API call
assert metric.exists() # API call
Refresh local information about a metric:
existing_metric = client.metric(METRIC_NAME)
existing_metric.reload() # API call
Update a metric:
existing_metric.filter_ = UPDATED_FILTER
existing_metric.description = UPDATED_DESCRIPTION
existing_metric.update() # API call
Delete a metric:
metric.delete() # API call
Export log entries using sinks
Sinks allow exporting entries which match a given filter to Cloud Storage buckets, BigQuery datasets, or Cloud Pub/Sub topics.
Export to Cloud Storage
Make sure that the storage bucket to which you want to export logs has cloud-logs@google.com as the owner. See Setting permissions for Cloud Storage.
Add cloud-logs@google.com as the owner of the bucket:
bucket.acl.reload() # API call
logs_group = bucket.acl.group("cloud-logs@google.com")
logs_group.grant_owner()
bucket.acl.add_entity(logs_group)
bucket.acl.save() # API call
Create a Cloud Storage sink:
DESTINATION = "storage.googleapis.com/%s" % (bucket.name,)
sink = client.sink(SINK_NAME, filter_=FILTER, destination=DESTINATION)
assert not sink.exists() # API call
sink.create() # API call
assert sink.exists() # API call
Export to BigQuery
To export logs to BigQuery you must log into the Cloud Platform Console and add cloud-logs@google.com to a dataset. See: Setting permissions for BigQuery.
from google.cloud.bigquery.dataset import AccessGrant
grants = dataset.access_grants
grants.append(AccessGrant("WRITER", "groupByEmail", "cloud-logs@google.com"))
dataset.access_grants = grants
dataset.update() # API call
Create a BigQuery sink:
DESTINATION = "bigquery.googleapis.com%s" % (dataset.path,)
sink = client.sink(SINK_NAME, filter_=FILTER, destination=DESTINATION)
assert not sink.exists() # API call
sink.create() # API call
assert sink.exists() # API call
Export to Pub/Sub
To export logs to Pub/Sub you must log into the Cloud Platform Console and add cloud-logs@google.com to a topic. See: Setting permissions for Pub/Sub.
policy = topic.get_iam_policy() # API call
policy.owners.add(policy.group("cloud-logs@google.com"))
topic.set_iam_policy(policy) # API call
Create a Cloud Pub/Sub sink:
DESTINATION = "pubsub.googleapis.com/%s" % (topic.full_name,)
sink = client.sink(SINK_NAME, filter_=FILTER, destination=DESTINATION)
assert not sink.exists() # API call
sink.create() # API call
assert sink.exists() # API call
Manage Sinks
List all sinks for a project:
for sink in client.list_sinks(): # API call(s)
    do_something_with(sink)
Refresh local information about a sink:
existing_sink = client.sink(SINK_NAME)
existing_sink.reload() # API call
Update a sink:
existing_sink.filter_ = UPDATED_FILTER
existing_sink.update() # API call
Delete a sink:
sink.delete() # API call
Integration with Python logging module
It’s possible to tie the Python logging module directly into Google Cloud Logging. There are different handler options to accomplish this. To automatically pick the default for your current environment, use get_default_handler().
import logging
handler = client.get_default_handler()
cloud_logger = logging.getLogger("cloudLogger")
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.error("bad news")
It is also possible to attach the handler to the root Python logger, so that, for example, a plain logging.warn call would be sent to Cloud Logging, as would calls on any other loggers you create. A helper method setup_logging() is provided to configure this automatically.
client.setup_logging(log_level=logging.INFO)
NOTE: To reduce cost and quota usage, do not enable Cloud Logging handlers while testing locally.
You can also exclude certain loggers:
client.setup_logging(log_level=logging.INFO, excluded_loggers=("werkzeug",))
Cloud Logging Handler
If you prefer not to use get_default_handler(), you can create a CloudLoggingHandler instance yourself; it writes directly to the API.
from google.cloud.logging.handlers import CloudLoggingHandler
handler = CloudLoggingHandler(client)
cloud_logger = logging.getLogger("cloudLogger")
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.error("bad news")
NOTE: This handler by default uses an asynchronous transport that sends log entries on a background thread. However, the API call will still be made in the same process. For other transport options, see the transports section.
All logs will go to a single custom log, which defaults to “python”. The name of the Python logger will be included in the structured log entry under the “python_logger” field. You can change it by providing a name to the handler:
handler = CloudLoggingHandler(client, name="mycustomlog")
Cloud Logging Handler transports
The CloudLoggingHandler logging handler can use different transports. The default is BackgroundThreadTransport.
BackgroundThreadTransport: this is the default. It writes entries on a background python.threading.Thread.
SyncTransport: this handler does a direct API call on each logging statement to write the entry.
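For example, a minimal sketch of selecting SyncTransport explicitly when constructing the handler, assuming the transport classes are importable from google.cloud.logging.handlers.transports:
from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging.handlers.transports import SyncTransport
# Each logging call now performs a blocking API request instead of queueing on a background thread.
handler = CloudLoggingHandler(client, name="mycustomlog", transport=SyncTransport)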
fluentd logging handlers
Besides CloudLoggingHandler, which writes directly to the API, two other handlers are provided: AppEngineHandler, which is recommended when running on the Google App Engine Flexible vanilla runtime (i.e. your app.yaml contains runtime: python), and ContainerEngineHandler, which is recommended when running on Google Kubernetes Engine with the Cloud Logging plugin enabled.
get_default_handler() and setup_logging() will attempt to use the environment to automatically detect whether the code is running on one of these platforms and use the appropriate handler.
In both cases, the fluentd agent is configured to automatically parse log files in an expected format and forward them to Cloud Logging. The handlers provided help set the correct metadata such as log level so that logs can be filtered accordingly.
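If auto-detection does not pick the handler you want, one can be attached explicitly. A minimal sketch for Kubernetes Engine (the AppEngineHandler constructor arguments have varied between releases, so it is omitted here):
import logging
from google.cloud.logging.handlers import ContainerEngineHandler
handler = ContainerEngineHandler() # formats records for the GKE Cloud Logging agent to parse
root_logger = logging.getLogger()
root_logger.addHandler(handler)
root_logger.setLevel(logging.INFO)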