Usage Guide
Writing log entries
To write log entries, first create a Logger, passing the “log name” with which to associate the entries:
logger = client.logger(log_name)
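The client used above is a google.cloud.logging.Client; a minimal sketch of creating one, assuming Application Default Credentials are configured and using an illustrative log name:
import google.cloud.logging
client = google.cloud.logging.Client()  # picks up project and credentials from the environment
log_name = "my-log"  # illustrative log name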
Write a simple text entry to the logger.
logger.log_text("A simple entry") # API call
Write a dictionary entry to the logger.
logger.log_struct(
    {"message": "My second entry", "weather": "partly cloudy"}
)  # API call
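Entries can also carry metadata such as a severity level or labels; a hedged sketch (the values shown are illustrative):
logger.log_text("Disk usage above threshold", severity="WARNING")  # API call
logger.log_struct(
    {"message": "Job finished"},
    labels={"job_id": "1234"},  # illustrative label
)  # API call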
Write an entry with an explicit resource to the logger. Supported resource types are listed at Monitored Resource Types.
from google.cloud.logging import Resource
res = Resource(
    type="generic_node",
    labels={
        "location": "us-central1-a",
        "namespace": "default",
        "node_id": "10.10.10.1",
    },
)
logger.log_struct(
    {"message": "My first entry", "weather": "partly cloudy"}, resource=res
)
Retrieving log entries
Fetch entries for the default project.
for entry in client.list_entries():  # API call(s)
    do_something_with(entry)
Entries returned by Client.list_entries or Logger.list_entries will be instances of one of the following classes:
TextEntry
StructEntry
ProtobufEntry
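A sketch of a do_something_with helper that branches on the entry type; it assumes these classes are importable from google.cloud.logging and that each entry exposes its data via the payload attribute:
from google.cloud.logging import ProtobufEntry, StructEntry, TextEntry

def do_something_with(entry):
    if isinstance(entry, TextEntry):
        print("text payload:", entry.payload)
    elif isinstance(entry, StructEntry):
        print("struct payload:", entry.payload)  # a dict
    elif isinstance(entry, ProtobufEntry):
        print("protobuf payload:", entry.payload)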
Filter entries retrieved using the Advanced Logs Filters syntax
Fetch filtered entries for the default project.
filter_str = "logName:log_name AND textPayload:simple"
for entry in client.list_entries(filter_=filter_str):  # API call(s)
    do_something_with(entry)
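The filter syntax also supports severity and timestamp comparisons; a hedged example that restricts results to warnings from the last day (the threshold is illustrative):
from datetime import datetime, timedelta, timezone
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
filter_str = 'severity>=WARNING AND timestamp>="%s"' % yesterday.strftime("%Y-%m-%dT%H:%M:%SZ")
for entry in client.list_entries(filter_=filter_str):  # API call(s)
    do_something_with(entry)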
Sort entries in descending timestamp order.
from google.cloud.logging import DESCENDING
for entry in client.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)
Retrieve entries for a single logger, sorting in descending timestamp order:
from google.cloud.logging import DESCENDING
for entry in logger.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)
Delete all entries for a logger:
logger.delete() # API call
Manage log metrics
Metrics are counters of entries which match a given filter. They can be used within Cloud Monitoring to create charts and alerts.
List all metrics for a project:
for metric in client.list_metrics():  # API call(s)
    do_something_with(metric)
Create a metric:
metric = client.metric(metric_name, filter_=filter, description=description)
assert not metric.exists() # API call
metric.create() # API call
assert metric.exists() # API call
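The metric_name, filter, and description used above are placeholders; a metric that counts error-level entries might use values like these (all illustrative):
metric_name = "error-count"  # illustrative metric name
filter = "severity>=ERROR"  # count only error-level entries
description = "Count of error-level log entries"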
Refresh local information about a metric:
existing_metric = client.metric(metric_name)
existing_metric.reload() # API call
Update a metric:
existing_metric.filter_ = updated_filter
existing_metric.description = updated_description
existing_metric.update() # API call
Delete a metric:
metric.delete()
Export log entries using sinks
Sinks allow exporting entries which match a given filter to Cloud Storage buckets, BigQuery datasets, or Cloud Pub/Sub topics.
Export to Cloud Storage
Make sure that the storage bucket you want to export logs to has cloud-logs@google.com as the owner. See Setting permissions for Cloud Storage.
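The bucket object in the snippet below comes from the google.cloud.storage client library; a minimal sketch of obtaining it (the bucket name is illustrative):
from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.get_bucket("my-log-export-bucket")  # illustrative bucket name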
Add cloud-logs@google.com as the owner of the bucket:
bucket.acl.reload() # API call
logs_group = bucket.acl.group("cloud-logs@google.com")
logs_group.grant_owner()
bucket.acl.add_entity(logs_group)
bucket.acl.save() # API call
Create a Cloud Storage sink:
destination = "storage.googleapis.com/%s" % (bucket.name,)
sink = client.sink(sink_name, filter_=filter, destination=destination)
assert not sink.exists() # API call
sink.create() # API call
assert sink.exists() # API call
Export to BigQuery
To export logs to BigQuery you must log into the Cloud Platform Console and add cloud-logs@google.com to a dataset. See: Setting permissions for BigQuery
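The dataset and client in the snippet below refer to the google.cloud.bigquery library (here client is a BigQuery client, not the logging client; the sink-creation step further down uses the logging client again). A minimal sketch, with an illustrative dataset ID:
from google.cloud import bigquery
client = bigquery.Client()
dataset = client.get_dataset("my_log_dataset")  # illustrative dataset ID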
from google.cloud.bigquery.dataset import AccessEntry
entry_list = dataset.access_entries
entry_list.append(AccessEntry("WRITER", "groupByEmail", "cloud-logs@google.com"))
dataset.access_entries = entry_list
client.update_dataset(dataset, ["access_entries"]) # API call
Create a BigQuery sink:
destination = "bigquery.googleapis.com%s" % (dataset.path,)
sink = client.sink(sink_name, filter_=filter_str, destination=destination)
assert not sink.exists() # API call
sink.create() # API call
assert sink.exists() # API call
Export to Pub/Sub
To export logs to Cloud Pub/Sub you must log into the Cloud Platform Console and add cloud-logs@google.com to a topic. See: Setting permissions for Pub/Sub
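The client in the topic-creation snippet below is a Pub/Sub publisher client rather than the logging client (the sink-creation step afterwards switches back to the logging client); a minimal sketch with illustrative IDs:
from google.cloud import pubsub_v1
client = pubsub_v1.PublisherClient()
project_id = "my-project"  # illustrative project ID
topic_id = "logs-export"  # illustrative topic ID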
topic_path = client.topic_path(project_id, topic_id)
topic = client.create_topic(request={"name": topic_path})
policy = client.get_iam_policy(request={"resource": topic_path}) # API call
policy.bindings.add(role="roles/owner", members=["group:cloud-logs@google.com"])
client.set_iam_policy(
    request={"resource": topic_path, "policy": policy}
)  # API call
Create a Cloud Pub/Sub sink:
destination = "pubsub.googleapis.com/%s" % (topic.name,)
sink = client.sink(sink_name, filter_=filter_str, destination=destination)
assert not sink.exists() # API call
sink.create() # API call
assert sink.exists() # API call
Manage Sinks
List all sinks for a project:
for sink in client.list_sinks():  # API call(s)
    do_something_with(sink)
Refresh local information about a sink:
existing_sink = client.sink(sink_name)
existing_sink.reload()
Update a sink:
existing_sink.filter_ = updated_filter
existing_sink.update()
Delete a sink:
sink.delete()
Integration with Python logging module
It’s possible to tie the Python logging module directly into Google Cloud Logging. There are different handler options to accomplish this. To automatically pick the default for your current environment, use get_default_handler().
import logging
handler = client.get_default_handler()
cloud_logger = logging.getLogger("cloudLogger")
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.error("bad news")
It is also possible to attach the handler to the root Python logger, so that for example a plain logging.warn call would be sent to Cloud Logging, as would calls from any other loggers created. A helper method setup_logging() is provided to configure this automatically.
client.setup_logging(log_level=logging.INFO)
NOTE: To reduce cost and quota usage, do not enable Cloud Logging handlers while testing locally.
You can also exclude certain loggers:
client.setup_logging(log_level=logging.INFO, excluded_loggers=("werkzeug",))
Cloud Logging Handler
If you prefer not to use get_default_handler(), you can create a CloudLoggingHandler instance yourself, which writes directly to the API.
import logging
from google.cloud.logging.handlers import CloudLoggingHandler
handler = CloudLoggingHandler(client)
cloud_logger = logging.getLogger("cloudLogger")
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.error("bad news")
NOTE: This handler by default uses an asynchronous transport that sends log entries on a background thread. However, the API call will still be made in the same process. For other transport options, see the transports section.
All logs will go to a single custom log, which defaults to “python”. The name of the Python logger will be included in the structured log entry under the “python_logger” field. You can change it by providing a name to the handler:
handler = CloudLoggingHandler(client, name="mycustomlog")
Cloud Logging Handler transports
The CloudLoggingHandler logging handler can use different transports. The default is BackgroundThreadTransport.
BackgroundThreadTransport: the default; it writes entries on a background python threading.Thread.
SyncTransport: this transport makes a direct API call on each logging statement to write the entry.
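A hedged sketch of selecting the synchronous transport explicitly; it assumes the transport classes are importable from google.cloud.logging.handlers.transports and that CloudLoggingHandler accepts a transport argument:
from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging.handlers.transports import SyncTransport
handler = CloudLoggingHandler(client, name="mycustomlog", transport=SyncTransport)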
fluentd logging handlers
Besides CloudLoggingHandler, which writes directly to the API, two other handlers are provided: AppEngineHandler, recommended when running on the Google App Engine Flexible vanilla runtimes (i.e. your app.yaml contains runtime: python), and ContainerEngineHandler, recommended when running on Google Kubernetes Engine with the Cloud Logging plugin enabled.
get_default_handler() and setup_logging() will attempt to use the environment to automatically detect whether the code is running on these platforms and use the appropriate handler.
In both cases, the fluentd agent is configured to automatically parse log files in an expected format and forward them to Cloud Logging. The handlers provided help set the correct metadata such as log level so that logs can be filtered accordingly.