Some or all of the information on this page might not apply to Trusted Cloud by S3NS.
# Query and view log entries
This document describes how you query, view, and analyze log entries by using the Trusted Cloud console. There are two interfaces available to you, the Logs Explorer and Log Analytics. You can query, view, and analyze logs with both interfaces; however, they use different query languages and they have different capabilities. For troubleshooting and exploration of log data, we recommend using the Logs Explorer. To generate insights and trends, we recommend that you use Log Analytics.

You can query your logs and save your queries by issuing [Logging API](/logging/docs/reference/v2/rest/v2/entries/list) commands. You can also query your logs by using the [Google Cloud CLI](/logging/docs/api/gcloud-logging#reading_log_entries).
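For example, the following Google Cloud CLI invocation is a minimal sketch that reads recent error entries; the project ID, filter values, and limit are illustrative placeholders:

```shell
# Read the ten most recent entries at ERROR severity or higher that were
# written after the given timestamp. Replace my-project with your project ID.
gcloud logging read 'severity>=ERROR AND timestamp>="2025-08-01T00:00:00Z"' \
    --project=my-project \
    --limit=10 \
    --format=json
```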
Logs Explorer
-------------

The Logs Explorer is designed to help you troubleshoot and analyze the performance of your services and applications. For example, a histogram displays the rate of errors. If you see a spike in errors or something that is interesting, you can locate and view the corresponding log entries. When a log entry is associated with an [error group](/error-reporting/docs/grouping-errors), the log entry is annotated with a menu of options that let you access more information about the error group.
The same [query language](/logging/docs/view/logging-query-language) is supported by the Cloud Logging API, the Google Cloud CLI, and the Logs Explorer. To simplify query construction when you are using the Logs Explorer, you can [build queries](/logging/docs/view/building-queries) by using menus, by entering text, and, in some cases, by using options included with the display of an individual log entry.
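For example, the following sketch in the Logging query language matches error entries from a hypothetical Compute Engine instance; the instance ID and message text are placeholders:

```
resource.type="gce_instance"
resource.labels.instance_id="1234567890"
severity>=ERROR
textPayload:"connection refused"
```

Lines in a query are implicitly joined with `AND`; the `:` operator performs a substring match, while `=` requires an exact match.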
La page "Explorateur de journaux" ne prend pas en charge les opérations d'agrégation, comme le décompte du nombre d'entrées de journal contenant un modèle spécifique.
[[["Facile à comprendre","easyToUnderstand","thumb-up"],["J'ai pu résoudre mon problème","solvedMyProblem","thumb-up"],["Autre","otherUp","thumb-up"]],[["Il n'y a pas l'information dont j'ai besoin","missingTheInformationINeed","thumb-down"],["Trop compliqué/Trop d'étapes","tooComplicatedTooManySteps","thumb-down"],["Obsolète","outOfDate","thumb-down"],["Problème de traduction","translationIssue","thumb-down"],["Mauvais exemple/Erreur de code","samplesCodeIssue","thumb-down"],["Autre","otherDown","thumb-down"]],["Dernière mise à jour le 2025/08/11 (UTC)."],[],[],null,["# Query and view log entries\n\nThis document describes how you query, view, and analyze log entries by using\nthe Google Cloud console. There are two interfaces available to you, the\nLogs Explorer and Log Analytics. You can query, view, and analyze\nlogs with both interfaces; however, they use different query languages and they\nhave different capabilities.\nFor troubleshooting and exploration of log data, we recommend using the\nLogs Explorer. To generate insights and trends, we recommend that you\nuse Log Analytics.\nYou can query your logs and save your queries by issuing\n[Logging API](/logging/docs/reference/v2/rest/v2/entries/list) commands.\nYou can also query your logs by using\n[Google Cloud CLI](/logging/docs/api/gcloud-logging#reading_log_entries).\n\nLogs Explorer\n-------------\n\nThe Logs Explorer is designed to help you troubleshoot and analyze the\nperformance of your services and applications. For example, a histogram\ndisplays the rate of errors. If you see a spike in errors or something that\nis interesting, you can locate and view the\ncorresponding log entries. When a log entry is associated with an\n[error group](/error-reporting/docs/grouping-errors), the log entry is\nannotated with a\nmenu of options that let you access more information about the error group.\n\nThe same [query language](/logging/docs/view/logging-query-language) is\nsupported by the Cloud Logging API, the Google Cloud CLI,\nand the Logs Explorer.\nTo simplify query construction when you are using the Logs Explorer, you can\n[build queries](/logging/docs/view/building-queries) by using menus, by\nentering text, and, in some cases, by using options included with the display\nof an individual log entry.\n\nThe Logs Explorer doesn't support aggregate operations,\nlike counting the number of log entries that contain a specific pattern.\nTo perform aggregate operations, enable analytics on the log bucket and then use\nLog Analytics.\n\nFor details about searching and viewing logs with the Logs Explorer, see\n[View logs by using the Logs Explorer](/logging/docs/view/logs-explorer-interface).\n\nLog Analytics\n-------------\n\nUsing Log Analytics, you can run queries that analyze your log data, and\nthen you can view or [chart the query results](/logging/docs/analyze/charts). Charts let\nyou identify patterns and trends in your logs over time. 
For example, suppose that you are troubleshooting a problem and you want to know the average latency for HTTP requests issued to a specific URL over time. When a log bucket is upgraded to use Log Analytics, you can write a [SQL](/bigquery/docs/reference/standard-sql/query-syntax) query or use the query builder to query logs stored in your log bucket. These SQL queries can also include [pipe syntax](/bigquery/docs/pipe-syntax-guide). By grouping and aggregating your logs, you can gain insights into your log data, which can help you reduce time spent troubleshooting.
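As a minimal sketch, the following query counts error entries per hour over the last day. The table path `my-project.global._Default._AllLogs` is a placeholder; substitute your own project ID, bucket location, bucket name, and log view:

```sql
-- Count ERROR entries per hour for the last 24 hours.
-- The table path below is a hypothetical log view; replace it with yours.
SELECT
  TIMESTAMP_TRUNC(timestamp, HOUR) AS hour,
  COUNT(*) AS error_count
FROM `my-project.global._Default._AllLogs`
WHERE severity = 'ERROR'
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
GROUP BY hour
ORDER BY hour;
```

The same query can be expressed with pipe syntax, which chains operators with `|>`:

```sql
-- Pipe-syntax variant of the query above (same placeholder table path).
FROM `my-project.global._Default._AllLogs`
|> WHERE severity = 'ERROR'
   AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
|> EXTEND TIMESTAMP_TRUNC(timestamp, HOUR) AS hour
|> AGGREGATE COUNT(*) AS error_count GROUP BY hour
|> ORDER BY hour;
```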
Log Analytics lets you query [log views](/logging/docs/logs-views) or an [analytics view](/logging/docs/analyze/about-analytics-views). Log views have a fixed schema, which corresponds to the [`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) data structure. Because the creator of an analytics view determines the schema, one use case for analytics views is to transform log data from the `LogEntry` format into a format that is more suitable for you.

You can also use [BigQuery](/bigquery/docs/introduction) to query your data. For example, suppose that you want to use BigQuery to compare URLs in your logs with a public dataset of known malicious URLs. To make your log data visible to BigQuery, upgrade your bucket to use Log Analytics and then [create a linked dataset](/logging/docs/buckets#link-bq-dataset).
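As an illustrative sketch, assuming placeholder bucket and link names, the upgrade and the linked dataset can also be managed with the Google Cloud CLI:

```shell
# Upgrade an existing log bucket to use Log Analytics.
gcloud logging buckets update my-bucket --location=global --enable-analytics

# Create a BigQuery linked dataset for the upgraded bucket.
gcloud logging links create my-link --bucket=my-bucket --location=global
```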
You can continue to troubleshoot issues and view individual log entries in upgraded log buckets by using the Logs Explorer.

### Restrictions

- To upgrade an existing log bucket to use Log Analytics, the following restrictions apply:

  - The log bucket was created at the Google Cloud project level.
  - The log bucket is [unlocked](/logging/docs/buckets#locking-logs-buckets) unless it is the `_Required` bucket.
  - There aren't pending updates to the bucket.

- Log entries written before a bucket is upgraded aren't immediately available. However, when the backfill operation completes, you can analyze these log entries. The backfill process might take several hours.

- You can't use the **Log Analytics** page to query log views when the log bucket has [field-level access controls](/logging/docs/field-level-acl) configured. However, you can issue queries through the **Logs Explorer** page, and you can query a [linked BigQuery dataset](/logging/docs/buckets#link-bq-dataset). Because BigQuery doesn't honor field-level access controls, if you query a linked dataset, then you can query all fields in the log entries.

- If you query multiple log buckets that are configured with different Cloud KMS keys, then the query fails unless the following constraints are met:

  - The log buckets are in the same location.
  - A folder or organization that is a parent resource of the log buckets is [configured with a default key](/logging/docs/routing/managed-encryption).
  - The default key is in the same location as the log buckets.

  When the previous constraints are satisfied, the parent's Cloud KMS key encrypts any temporary data generated by a Log Analytics query.

- Duplicate log entries aren't removed before a query is run. This behavior is different than when you query log entries by using the Logs Explorer, which removes duplicate entries by comparing the log names, timestamps, and insert ID fields. For more information, see [Troubleshoot: There are duplicate log entries in my Log Analytics results](/logging/docs/analyze/troubleshoot#duplicate-analytics).

| **Note:** If your data is managed through an [Assured Workloads environment](/assured-workloads/docs/key-concepts), then this feature might be impacted or restricted. For information, see [Restrictions and limitations in Assured Workloads](/assured-workloads/docs/eu-sovereign-controls-restrictions-limitations#features_logging).

Pricing
-------

Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges. With the exception of the `_Required` log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs, for creating [log scopes](/logging/docs/log-scope/create-and-manage) or [analytics views](/logging/docs/analyze/about-analytics-views), or for queries issued through the **Logs Explorer** or **Log Analytics** pages.

For more information, see the following documents:

- The Cloud Logging sections of the [Google Cloud Observability pricing](https://cloud.google.com/stackdriver/pricing) page.
- Costs when routing log data to other Google Cloud services:

  - [Cloud Storage pricing](https://cloud.google.com/storage/pricing)
  - [BigQuery pricing](https://cloud.google.com/bigquery/pricing#data_ingestion_pricing)
  - [Pub/Sub pricing](https://cloud.google.com/pubsub/pricing)
- [VPC flow log generation charges](https://cloud.google.com/vpc/network-pricing#network-telemetry) apply when you send and then exclude your Virtual Private Cloud flow logs from Cloud Logging.

There are no BigQuery ingestion or storage costs when you upgrade a bucket to use Log Analytics and then create a [linked dataset](/bigquery/docs/analytics-hub-introduction#linked_datasets). When you create a linked dataset for a log bucket, you don't ingest your log data into BigQuery. Instead, you get read access to the log data stored in your log bucket through the linked dataset.

BigQuery analysis charges apply when you run SQL queries on BigQuery linked datasets, which includes using the **BigQuery Studio** page, the BigQuery API, and the BigQuery command-line tool.

Blogs
-----

For more information about Log Analytics, see the following blog posts:

- For an overview of Log Analytics, see [Log Analytics in Cloud Logging is now GA](/blog/products/devops-sre/log-analytics-in-cloud-logging-is-now-ga).
- To learn about creating charts generated by Log Analytics queries and saving those charts to custom dashboards, see [Announcing Log Analytics charts and dashboards in Cloud Logging in public preview](/blog/products/management-tools/new-log-analytics-charts-and-dashboards-in-cloud-logging).
- To learn about analyzing audit logs by using Log Analytics, see [Gleaning security insights from audit logs with Log Analytics](/blog/products/identity-security/gleaning-security-insights-from-audit-logs-with-log-analytics).
- If you route logs to BigQuery and want to understand the difference between that solution and using Log Analytics, then see [Moving to Log Analytics for BigQuery export users](/blog/products/data-analytics/moving-to-log-analytics-for-bigquery-export-users).

What's next
-----------

- [Create a log bucket and upgrade it to use Log Analytics](/logging/docs/buckets#create_bucket)
- [Upgrade an existing bucket to use Log Analytics](/logging/docs/buckets#upgrade-bucket)
- Query and view logs:

  - [Log Analytics: Query and analyze logs](/logging/docs/analyze/query-and-view)
  - [Logs Explorer: Query and view logs](/logging/docs/view/logs-explorer-interface)
- Sample queries:

  - [Log Analytics: SQL examples](/logging/docs/analyze/examples)
  - [Logs Explorer: Logging query language examples](/logging/docs/view/query-library)