Using the API#

Authentication and Configuration#

  • For an overview of authentication in google-cloud-python, see Authentication.

  • In addition to any authentication configuration, you should also set the GOOGLE_CLOUD_PROJECT environment variable for the project you’d like to interact with. If you are running on Google App Engine or Google Compute Engine, the project will be detected automatically.

  • The library now enables the gRPC transport for the logging API by default, assuming that the required dependencies are installed and importable. To disable this transport, set the GOOGLE_CLOUD_DISABLE_GRPC environment variable to a non-empty string, e.g.: $ export GOOGLE_CLOUD_DISABLE_GRPC=true.

  • After configuring your environment, create a Client:

    from google.cloud import logging
    client = logging.Client()
    

    or pass in credentials and project explicitly:

    from google.cloud import logging
    client = logging.Client(project='my-project', credentials=credentials)
    
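    Here credentials is a google-auth credentials object. A minimal
    sketch of loading one from a service-account JSON key file (the
    path below is a placeholder):

    from google.oauth2 import service_account
    credentials = service_account.Credentials.from_service_account_file(
        '/path/to/keyfile.json')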

Writing log entries#

To write log entries, first create a Logger, passing the “log name” with which to associate the entries:

logger = client.logger(LOG_NAME)

Write a simple text entry to the logger.

logger.log_text("A simple entry")  # API call

Write a dictionary entry to the logger.

logger.log_struct({
    'message': 'My second entry',
    'weather': 'partly cloudy',
})  # API call
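
The logging methods also accept optional entry metadata. A hedged
example, assuming the severity keyword argument of this version's
Logger API:

logger.log_text("Something failed", severity='ERROR')  # API call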

Retrieving log entries#

Fetch entries for the default project.

for entry in client.list_entries():  # API call(s)
    do_something_with(entry)

Fetch entries across multiple projects.

PROJECT_IDS = ['one-project', 'another-project']
for entry in client.list_entries(project_ids=PROJECT_IDS):  # API call(s)
    do_something_with(entry)

Entries can be filtered using the Advanced Logs Filters syntax.

Fetch entries matching a filter for the default project.

FILTER = 'logName:log_name AND textPayload:simple'
for entry in client.list_entries(filter_=FILTER):  # API call(s)
    do_something_with(entry)
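
Filters can also match entry metadata. A hedged time-based example,
assuming the syntax's RFC 3339 timestamp comparisons (the cutoff
value here is illustrative):

FILTER = 'timestamp>="2016-11-29T00:00:00Z"'
for entry in client.list_entries(filter_=FILTER):  # API call(s)
    do_something_with(entry)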

Sort entries in descending timestamp order.

from google.cloud.logging import DESCENDING
for entry in client.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)

Retrieve entries in batches of 10, iterating until done.

iterator = client.list_entries(page_size=10)
pages = iterator.pages

page1 = next(pages)  # API call
for entry in page1:
    do_something_with(entry)

page2 = next(pages)  # API call
for entry in page2:
    do_something_with(entry)
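
To process every page until the iterator is exhausted, loop over its
pages directly:

for page in client.list_entries(page_size=10).pages:  # API call per page
    for entry in page:
        do_something_with(entry)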

Retrieve entries for a single logger, sorting in descending timestamp order:

from google.cloud.logging import DESCENDING
for entry in logger.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)

Delete all entries for a logger#

logger.delete()  # API call

Manage log metrics#

Metrics are counters of entries which match a given filter. They can be used within Stackdriver Monitoring to create charts and alerts.

List all metrics for a project:

for metric in client.list_metrics():  # API call(s)
    do_something_with(metric)

Create a metric:

metric = client.metric(
    METRIC_NAME, filter_=FILTER, description=DESCRIPTION)
assert not metric.exists()  # API call
metric.create()  # API call
assert metric.exists()  # API call

Refresh local information about a metric:

existing_metric = client.metric(METRIC_NAME)
existing_metric.reload()  # API call

Update a metric:

existing_metric.filter_ = UPDATED_FILTER
existing_metric.description = UPDATED_DESCRIPTION
existing_metric.update()  # API call

Delete a metric:

metric.delete()  # API call

Export log entries using sinks#

Sinks allow exporting entries which match a given filter to Cloud Storage buckets, BigQuery datasets, or Cloud Pub/Sub topics.

Export to Cloud Storage#

Make sure that the storage bucket to which you want to export logs has cloud-logs@google.com as the owner. See Setting permissions for Cloud Storage.

Add cloud-logs@google.com as the owner of the bucket:

bucket.acl.reload()  # API call
logs_group = bucket.acl.group('cloud-logs@google.com')
logs_group.grant_owner()
bucket.acl.add_entity(logs_group)
bucket.acl.save()  # API call
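
The snippet above assumes an existing bucket object. A minimal sketch
of obtaining one with google-cloud-storage (BUCKET_NAME is a
placeholder for your bucket's name):

from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.get_bucket(BUCKET_NAME)  # API call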

Create a Cloud Storage sink:

DESTINATION = 'storage.googleapis.com/%s' % (bucket.name,)
sink = client.sink(SINK_NAME, filter_=FILTER, destination=DESTINATION)
assert not sink.exists()  # API call
sink.create()  # API call
assert sink.exists()  # API call

Export to BigQuery#

To export logs to BigQuery, you must log into the Cloud Platform Console and add cloud-logs@google.com to a dataset.

See: Setting permissions for BigQuery

from google.cloud.bigquery.dataset import AccessGrant
grants = dataset.access_grants
grants.append(AccessGrant(
    'WRITER', 'groupByEmail', 'cloud-logs@google.com'))
dataset.access_grants = grants
dataset.update()  # API call
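
The snippet assumes an existing dataset object. A hedged sketch of
obtaining one with the google-cloud-bigquery API of the same era
(DATASET_NAME is a placeholder; the dataset API changed in later
releases):

from google.cloud import bigquery
bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset(DATASET_NAME)
dataset.reload()  # API call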

Create a BigQuery sink:

DESTINATION = 'bigquery.googleapis.com%s' % (dataset.path,)
sink = client.sink(SINK_NAME, filter_=FILTER, destination=DESTINATION)
assert not sink.exists()  # API call
sink.create()  # API call
assert sink.exists()  # API call

Export to Pub/Sub#

To export logs to Cloud Pub/Sub, you must log into the Cloud Platform Console and add cloud-logs@google.com to a topic.

See: Setting permissions for Pub/Sub

policy = topic.get_iam_policy()  # API call
policy.owners.add(policy.group('cloud-logs@google.com'))
topic.set_iam_policy(policy)  # API call
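
The snippet assumes an existing topic object. A hedged sketch of
obtaining one with the google-cloud-pubsub API of the same era
(TOPIC_NAME is a placeholder):

from google.cloud import pubsub
pubsub_client = pubsub.Client()
topic = pubsub_client.topic(TOPIC_NAME)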

Create a Cloud Pub/Sub sink:

DESTINATION = 'pubsub.googleapis.com/%s' % (topic.full_name,)
sink = client.sink(SINK_NAME, filter_=FILTER, destination=DESTINATION)
assert not sink.exists()  # API call
sink.create()  # API call
assert sink.exists()  # API call

Manage Sinks#

List all sinks for a project:

for sink in client.list_sinks():  # API call(s)
    do_something_with(sink)

Refresh local information about a sink:

existing_sink = client.sink(SINK_NAME)
existing_sink.reload()  # API call

Update a sink:

existing_sink.filter_ = UPDATED_FILTER
existing_sink.update()  # API call

Delete a sink:

sink.delete()  # API call

Integration with Python logging module#

It’s possible to tie the Python logging module directly into Google Stackdriver Logging. There are different handler options to accomplish this. To automatically pick the default for your current environment, use get_default_handler().

import logging
handler = client.get_default_handler()
cloud_logger = logging.getLogger('cloudLogger')
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.error('bad news')

It is also possible to attach the handler to the root Python logger, so that, for example, a plain logging.warn call would be sent to Stackdriver Logging, as would calls on any other loggers created. A helper method setup_logging() is provided to configure this automatically.

client.setup_logging(log_level=logging.INFO)

Note

To reduce cost and quota usage, do not enable Stackdriver logging handlers while testing locally.

You can also exclude certain loggers:

client.setup_logging(log_level=logging.INFO,
                     excluded_loggers=('werkzeug',))

Cloud Logging Handler#

If you prefer not to use get_default_handler(), you can create a CloudLoggingHandler instance yourself, which writes directly to the API.

import logging

from google.cloud.logging.handlers import CloudLoggingHandler
handler = CloudLoggingHandler(client)
cloud_logger = logging.getLogger('cloudLogger')
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.error('bad news')

Note

This handler by default uses an asynchronous transport that sends log entries on a background thread. However, the API call will still be made in the same process. For other transport options, see the transports section.

All logs will go to a single custom log, which defaults to “python”. The name of the Python logger will be included in the structured log entry under the “python_logger” field. You can change it by providing a name to the handler:

handler = CloudLoggingHandler(client, name='mycustomlog')

fluentd logging handlers#

Besides CloudLoggingHandler, which writes directly to the API, two other handlers are provided: AppEngineHandler, recommended when running on the Google App Engine Flexible vanilla runtime (i.e. your app.yaml contains runtime: python), and ContainerEngineHandler, recommended when running on Google Container Engine with the Stackdriver Logging plugin enabled.

get_default_handler() and setup_logging() will attempt to use the environment to automatically detect whether the code is running on one of these platforms and use the appropriate handler.

In both cases, the fluentd agent is configured to automatically parse log files in an expected format and forward them to Stackdriver logging. The handlers provided help set the correct metadata such as log level so that logs can be filtered accordingly.
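
If you need to know which handler was selected, a sketch using the
handler classes named above:

from google.cloud.logging.handlers import (
    AppEngineHandler, ContainerEngineHandler)

handler = client.get_default_handler()
if isinstance(handler, AppEngineHandler):
    print('detected App Engine Flexible')
elif isinstance(handler, ContainerEngineHandler):
    print('detected Container Engine')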

Cloud Logging Handler transports#

The CloudLoggingHandler logging handler can use different transports. The default is BackgroundThreadTransport.

  1. BackgroundThreadTransport: the default. It writes entries on a background threading.Thread.
  2. SyncTransport: makes a direct API call for each logging statement to write the entry.
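
A sketch of selecting a transport explicitly, assuming the handler
accepts a transport class via its transport keyword and that the
transport classes live in google.cloud.logging.handlers.transports:

from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging.handlers.transports import SyncTransport

handler = CloudLoggingHandler(client, transport=SyncTransport)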