Stream logs in real-time to Google BigQuery, so you can analyze log data and get immediate insights.
Export logs to Google Cloud Storage (including Nearline), so you can archive logs data for longer periods to meet backup and compliance requirements.
Today we’re expanding Cloud Logging capabilities with the beta of the Cloud Logging Connector, which allows you to stream logs to Google Cloud Pub/Sub. With this capability you can stream log data to your own endpoints and further expand how you make big data useful. For example, you can now transform and enrich the data in Cloud Dataflow before sending it to BigQuery for analysis. It also provides easy real-time access to all your log data, so you can export it to your private cloud or any third-party application.
Google Cloud Pub/Sub delivers real-time, reliable messaging in one global, managed service that helps you create simpler, more reliable, and more flexible applications. By providing many-to-many, asynchronous messaging that decouples senders and receivers, it allows for secure and highly available communication between independently written applications. With Cloud Pub/Sub, you can push your log events to a webhook, or pull them as they happen. For more information, check out our Google Cloud Pub/Sub documentation.
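To make the push model concrete, here is a minimal sketch of decoding a Pub/Sub push delivery in Python. Pub/Sub POSTs a JSON envelope to your endpoint with the message payload base64-encoded in the `data` field; the log-entry fields shown below are illustrative, not the exact LogEntry schema.

```python
import base64
import json

def decode_push_message(request_body):
    """Decode a Cloud Pub/Sub push-delivery envelope.

    Pub/Sub POSTs a JSON envelope to the push endpoint; the message
    payload arrives base64-encoded in message.data.
    """
    envelope = json.loads(request_body)
    payload = base64.b64decode(envelope["message"]["data"])
    return json.loads(payload.decode("utf-8"))

# Example envelope, as a push subscriber would receive it
# (the log-entry fields here are illustrative):
entry = {"severity": "INFO", "textPayload": "GET /index.html 200"}
body = json.dumps({
    "message": {
        "data": base64.b64encode(
            json.dumps(entry).encode("utf-8")).decode("ascii"),
        "messageId": "1234567890",
    },
    "subscription": "projects/my-project/subscriptions/my-log-subscription",
})
print(decode_push_message(body)["severity"])  # prints "INFO"
```

The same decoded dictionary is what a pull subscriber would reconstruct after acknowledging the message.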
High-Level Pub/Sub Schema
Configuring Export to Cloud Pub/Sub
Configuring export of logs to Cloud Pub/Sub is easy and can be done from the Logs Viewer user interface. To get to the export configuration UI, start in the Developers Console, go to Logs under Monitoring, and then click Exports on the top menu. Export configuration is currently supported for Google App Engine and Google Compute Engine logs.
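In later releases of the Cloud SDK, a log sink can also be created from the command line. A sketch of such an invocation, with the project, topic, sink name, and filter all as placeholders:

```shell
# Create a sink that routes matching log entries to a Pub/Sub topic.
# my-project, my-log-topic, my-pubsub-sink, and the filter are placeholders.
gcloud logging sinks create my-pubsub-sink \
    pubsub.googleapis.com/projects/my-project/topics/my-log-topic \
    --log-filter='resource.type="gae_app"'
```

The UI and the CLI configure the same underlying sink, so either approach works.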
One Click Export Configuration in the Developers Console
Transforming Log Data in Dataflow
Google Cloud Dataflow allows you to build, deploy, and run data processing pipelines at any scale. It enables reliable execution for large-scale data processing scenarios such as ETL and analytics, and allows pipelines to execute in either streaming or batch mode. You choose.
You can use the Cloud Pub/Sub export mechanism to stream your log data to Cloud Dataflow and dynamically generate fields, combine different log tables for correlation, and parse and enrich the data for custom needs. Here are a few examples of what you can achieve with log data in Cloud Dataflow:
Sometimes it is useful to see data only for the key applications of your top customers. In Cloud Dataflow, you can group logs by customer ID or application ID, filter out specific logs, and then aggregate system-level or application-level metrics.
On the flip side, sometimes you want to enrich the log data to make it easier to analyze, for example by appending marketing campaign information to customer interaction logs, or other user profile info. Cloud Dataflow lets you do this on the fly.
In addition to preparing the data for further analysis, Cloud Dataflow also lets you perform analysis in real time. So you can look for anomalies, detect security intrusions, generate alerts, keep a real-time dashboard updated, etc.
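The group-filter-aggregate example above can be sketched in plain Python over a handful of in-memory records; in practice this logic would run inside a Dataflow pipeline over the Pub/Sub stream, and the field names (app_id, severity, latency_ms) are illustrative, not an actual log schema.

```python
from collections import defaultdict

# Illustrative log entries; field names are assumptions for this sketch.
log_entries = [
    {"app_id": "checkout", "severity": "ERROR", "latency_ms": 840},
    {"app_id": "checkout", "severity": "INFO",  "latency_ms": 95},
    {"app_id": "search",   "severity": "INFO",  "latency_ms": 40},
    {"app_id": "checkout", "severity": "ERROR", "latency_ms": 910},
]

key_apps = {"checkout"}  # keep only the key applications

# Group by application ID, filter to errors, and aggregate a count
# as an application-level metric.
errors_per_app = defaultdict(int)
for entry in log_entries:
    if entry["app_id"] in key_apps and entry["severity"] == "ERROR":
        errors_per_app[entry["app_id"]] += 1

print(dict(errors_per_app))  # prints {'checkout': 2}
```

Enrichment works the same way: a lookup table keyed by customer or campaign ID can be joined into each record as it streams through, before the result is written out for analysis.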
Cloud Dataflow can stream the processed data to BigQuery, so you can analyze your enriched data. For more details, please see the Google Cloud Dataflow documentation.
If you’re a current Google Cloud Platform user, the capability to stream logs to Cloud Pub/Sub is available to you at no additional charge; standard charges for using Cloud Pub/Sub and Cloud Dataflow still apply. For more information, visit the Cloud Logging documentation page and share your feedback.
-Posted by Deepak Tiwari, Product Manager
Feed Source: Google Cloud Platform Blog
Article Source: Take your logs data to new places with streaming export to Cloud Pub/Sub