By Scott Ellis and Subhasish Chakraborty, GCP Product Management
This post is the second in a new “taking charge of your security” series, providing advice and best practices for ensuring security in the cloud. Check out the first post in the series, “Help keep your Google Cloud service account keys safe.”
Cloud storage is well-suited to many use cases, from serving data, to data analytics, to data archiving. Here at Google Cloud, we work hard to make Google Cloud Storage the best and safest repository for your sensitive data: for example, we run Cloud Storage on hardened backend infrastructure, monitor that infrastructure for threats and automatically encrypt customer data at rest.
Nevertheless, as more organizations use various public cloud storage platforms, we hear increasingly frequent reports of sensitive data being inadvertently exposed. It’s important to note that these “breaches” are often the result of misconfigurations that inadvertently grant access to more users than was intended. The good news is that with the right tools and processes in place, you can help protect your data from unintended exposure.
Security in the cloud is a shared responsibility, and we're here to help you, as a Cloud Storage user, do your part: below are tips on how to set up appropriate access controls, locate sensitive data and keep your data more secure with tools included in Google Cloud Platform (GCP).
Check for appropriate permissions
The first step to securing a Cloud Storage bucket is to make sure that only the right individuals or groups have access. By default, access to Cloud Storage buckets is restricted, but owners and admins often make the buckets or objects public. While there are legitimate reasons to do this, making buckets public can open avenues for unintended exposure of data, and should be approached with caution.
The preferred method for controlling access to buckets and objects is to use Identity and Access Management (IAM) permissions. IAM allows you to implement fine-grained access control to your storage buckets right out of the gate. Learn how to manage access to Cloud Storage buckets with this how-to guide. Just be sure that you understand what permissions you are granting to which users or groups. For example, granting access to a group that contains a large number of users can create significant unintended exposure. You can also use Cloud Resource Manager to centrally manage and control your projects and resources.
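To make the permissions check concrete, here is a minimal sketch in Python of how you might flag public access in a bucket's IAM policy. The policy dict below mimics the shape returned by the Cloud Storage JSON API's `buckets.getIamPolicy`; in practice you would fetch the real policy with the `google-cloud-storage` client, and the function name here is illustrative, not part of any Google library.

```python
# Illustrative sketch: flag IAM policy bindings that grant public access.
# "allUsers" and "allAuthenticatedUsers" are the two special members that
# make a bucket or object readable beyond your own users and groups.

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def find_public_bindings(policy: dict) -> list:
    """Return (role, member) pairs that expose the bucket publicly."""
    exposed = []
    for binding in policy.get("bindings", []):
        for member in binding.get("members", []):
            if member in PUBLIC_MEMBERS:
                exposed.append((binding["role"], member))
    return exposed

# Example policy with one overly broad binding.
policy = {
    "bindings": [
        {"role": "roles/storage.admin",
         "members": ["user:admin@example.com"]},
        {"role": "roles/storage.objectViewer",
         "members": ["allUsers"]},
    ]
}

print(find_public_bindings(policy))
# [('roles/storage.objectViewer', 'allUsers')]
```

An empty result means no binding grants public access, though you may still want to review broad group grants by hand.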
Check for sensitive data
Even if you’ve set the appropriate permissions, it’s important to know whether there’s sensitive data stored in a Cloud Storage bucket. Enter the Cloud Data Loss Prevention (DLP) API. The DLP API uses more than 40 predefined detectors to classify, quickly and at scale, sensitive data elements such as payment card numbers, names, personal identification numbers, telephone numbers and more. Here’s a how-to guide that shows you how to inspect your GCS buckets using the DLP API.
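To illustrate the idea behind inspection (not the DLP API itself), here is a hedged, purely local sketch: two regex "detectors" standing in for the 40+ predefined infoType detectors. The detector names and the `inspect_text` helper are this example's own inventions; a real scan goes through the `google-cloud-dlp` client against your bucket.

```python
import re

# Local stand-in for DLP-style inspection: each "detector" maps an
# infoType-like name to a pattern, and findings are (info_type, match)
# pairs, loosely mirroring the shape of DLP inspection results.

DETECTORS = {
    "EMAIL_ADDRESS": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def inspect_text(text: str) -> list:
    """Run every detector over the text and collect findings."""
    findings = []
    for info_type, pattern in DETECTORS.items():
        for match in pattern.findall(text):
            findings.append((info_type, match))
    return findings

sample = "Contact jane@example.com, SSN 123-45-6789."
print(inspect_text(sample))
# [('EMAIL_ADDRESS', 'jane@example.com'), ('US_SSN', '123-45-6789')]
```

The real DLP API handles context, likelihood scores and many more detectors; the point here is simply what "classify sensitive data elements" looks like in code.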
If you find sensitive data in buckets that are shared too broadly, you should take appropriate steps to resolve this quickly. You can:
- Make the public buckets or objects private again
- Restrict access to the bucket (see Using IAM)
- Remove the sensitive file or object from the bucket
- Use the Cloud DLP API to redact sensitive content
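The last remediation step can be sketched locally as well. The actual redaction feature lives in the Cloud DLP API; the regex pass below is a simplified stand-in to show the shape of the transformation, and its pattern names and `redact` helper are assumptions for illustration only.

```python
import re

# Simplified redaction pass: replace each detected element with a
# [REDACTED:<infoType>] marker, mimicking what a DLP de-identification
# request does server-side.

PATTERNS = {
    "PHONE_NUMBER": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace every match of every pattern with a redaction marker."""
    for info_type, pattern in PATTERNS.items():
        text = pattern.sub("[REDACTED:%s]" % info_type, text)
    return text

print(redact("Call 555-867-5309; SSN 123-45-6789."))
# Call [REDACTED:PHONE_NUMBER]; SSN [REDACTED:US_SSN].
```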
You should also avoid naming a storage bucket that may contain sensitive data in a way that reveals its contents.
Protecting sensitive data is not a one-time exercise. Permissions change, new data is added and new buckets can crop up without the right permissions in place. As a best practice, set up a regular schedule to check for inappropriate permissions, scan for sensitive data and take the appropriate follow-up actions.
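A recurring audit might look like the following sketch. Buckets are represented as plain dicts here for illustration; in practice you would list them with the `google-cloud-storage` client and trigger the run on a schedule (for example with Cloud Scheduler or cron). The `audit` function and the bucket record shape are this example's assumptions.

```python
# Recurring-audit sketch: walk every bucket and flag any whose IAM policy
# grants access to the public members "allUsers" or "allAuthenticatedUsers".

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def audit(buckets: list) -> list:
    """Return the names of buckets whose policy grants public access."""
    flagged = []
    for bucket in buckets:
        members = {
            m
            for binding in bucket["policy"].get("bindings", [])
            for m in binding.get("members", [])
        }
        if members & PUBLIC_MEMBERS:
            flagged.append(bucket["name"])
    return flagged

buckets = [
    {"name": "internal-reports",
     "policy": {"bindings": [{"role": "roles/storage.objectViewer",
                              "members": ["group:finance@example.com"]}]}},
    {"name": "website-assets",
     "policy": {"bindings": [{"role": "roles/storage.objectViewer",
                              "members": ["allUsers"]}]}},
]

print(audit(buckets))  # ['website-assets']
```

Pairing a check like this with a scheduled DLP scan of the flagged buckets covers both halves of the routine: who can get in, and what they would find.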
Tools like IAM and DLP help make it easy to secure your data in the cloud. Watch this space for more ways to prevent unintended access, automate data protection and protect other GCP datastores and assets.
Feed Source: Google Cloud Platform Blog
Article Source: 4 steps for hardening your Cloud Storage buckets: taking charge of your security