Cloud Logging API overview
The Cloud Logging API lets you programmatically accomplish logging-related tasks,
including reading and writing log entries, creating log-based metrics, and
managing sinks to route logs.
See the following reference documentation for the Logging API:
- For the REST version of the API, see the REST reference.
- For the gRPC version of the API, see the gRPC reference.
For details on the limits that apply to your usage of the Logging API,
see Logging API quotas and limits.
Enable the Logging API
The Logging API must be enabled before it can be used. For
instructions, see
Enable the Logging API.
Access the Logging API
You can indirectly invoke the Logging API by using a
command-line interface or a client library written to support a
high-level programming language. For more information, see
the following reference documentation:
- For the command-line interface to the Logging API, see the gcloud logging command.
- To learn how to set up client libraries and authorize the
Logging API, with sample code, see Client libraries.
- To try the API without writing any code, use the APIs Explorer. The APIs
Explorer appears on REST API method reference pages in a panel titled
Try this API. For instructions, see Using the API Explorer.
Optimize usage of the Logging API
Following are some tips for using the Logging API effectively.
Read and list logs efficiently
To efficiently use your entries.list quota, try the following:
- Set a large pageSize: In the request body, you can set the pageSize
parameter up to and including the maximum value of an int32
(2,147,483,647). Setting the pageSize parameter to a higher value lets
Logging return more entries per query, reducing the number of queries
needed to retrieve the full set of entries that you're targeting.
- Set a large deadline: When a query nears its deadline, Logging
prematurely terminates and returns the log entries scanned thus far. If you
set a large deadline, then Logging can retrieve more entries
per query.
- Retry quota errors with exponential backoff: If your use case isn't
time-sensitive, then you can wait for the quota to replenish before
retrying your query. The pageToken parameter is still valid after a delay.
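The paging and backoff tips above can be sketched as a single read loop. The snippet below is a minimal illustration, not the official client library: fetch_page stands in for a call to entries.list, and QuotaExceeded stands in for the quota error the API returns.

```python
import time

class QuotaExceeded(Exception):
    """Stand-in for the quota error returned by entries.list."""

def list_all_entries(fetch_page, page_size=1000, max_retries=5, initial_delay=1.0):
    """Collect every entry by following pageToken, backing off on quota errors.

    fetch_page(page_size, page_token) -> (entries, next_page_token)
    """
    entries, page_token = [], None
    retries, delay = 0, initial_delay
    while True:
        try:
            page, page_token = fetch_page(page_size, page_token)
        except QuotaExceeded:
            if retries >= max_retries:
                raise
            time.sleep(delay)       # pageToken is still valid after a delay
            delay *= 2              # exponential backoff
            retries += 1
            continue
        retries, delay = 0, initial_delay
        entries.extend(page)
        if not page_token:          # no next page: the result set is complete
            return entries
```

A large page_size keeps the number of calls to fetch_page small, and quota errors only delay the loop rather than losing the cursor.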
Write logs efficiently
To efficiently use your entries.write quota, increase your batch volume so
that each request carries a larger number of log entries, which reduces the
number of write requests you make. Logging supports requests with up to
10MB of data.
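As a sketch of the batching idea (a hypothetical helper, not part of the client library): group entries so that each entries.write call stays under the 10MB request limit, assuming the serialized size of each entry can be measured.

```python
def batch_entries(entries, max_bytes=10_000_000, size_of=len):
    """Yield lists of entries whose combined serialized size stays under max_bytes.

    size_of(entry) must return the entry's serialized size in bytes;
    the default, len, works for pre-serialized strings or bytes.
    """
    batch, batch_size = [], 0
    for entry in entries:
        entry_size = size_of(entry)
        if batch and batch_size + entry_size > max_bytes:
            yield batch                 # flush: the next entry would exceed the limit
            batch, batch_size = [], 0
        batch.append(entry)
        batch_size += entry_size
    if batch:
        yield batch
```

Each yielded batch would then be sent in a single entries.write request, so fewer, fuller requests consume the same quota as many small ones.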
Bulk retrieval of log entries
To retrieve log entries, you use the entries.list method, but this method
isn't intended for high-volume retrieval of log entries. Using it that way
might quickly exhaust your quota for read requests.
If you need contemporary or continuous querying, or bulk retrieval of log
entries, then
configure sinks to send your
log entries to
Pub/Sub. When you create a Pub/Sub sink, you send the log
entries that you want to process to a Pub/Sub topic, and then
consume the log entries from there.
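For example, a Pub/Sub sink can be created with the gcloud CLI; the sink, project, topic, and filter names below are placeholders:

```shell
# Route matching log entries to a Pub/Sub topic (placeholder names).
gcloud logging sinks create my-sink \
    pubsub.googleapis.com/projects/my-project/topics/my-topic \
    --log-filter='severity>=ERROR'
```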
This approach has the following advantages:
- It doesn't exhaust your read-request quota. For more on quotas, see
Logging usage limits.
- It captures log entries that might have been written out of order, without
workarounds to seek back and re-read recent entries to ensure nothing was
missed.
- It automatically buffers the log entries if the consumer becomes
unavailable.
- The log entries don't count towards your free allotment because they
aren't stored in log buckets.
You can create Pub/Sub sinks to route log entries to a variety of
analytics platforms. For an example, see
Scenarios for routing Cloud Logging data: Splunk.
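When a Pub/Sub sink routes entries, each Pub/Sub message's data field carries one LogEntry serialized as JSON. A minimal consumer-side parse might look like the following sketch (the field names follow the LogEntry format; the helper itself is hypothetical):

```python
import json

def parse_log_entry(message_data: bytes) -> dict:
    """Decode one Pub/Sub message body into a few common LogEntry fields."""
    entry = json.loads(message_data.decode("utf-8"))
    return {
        "log_name": entry.get("logName"),
        "severity": entry.get("severity", "DEFAULT"),
        "timestamp": entry.get("timestamp"),
        # An entry carries textPayload, jsonPayload, or protoPayload.
        "payload": entry.get("textPayload") or entry.get("jsonPayload"),
    }
```

In a real consumer, this function would run inside your Pub/Sub subscriber's message callback, with the message acknowledged after the entry is processed.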
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-28 UTC.