Provide permissions for log storage in cloud buckets (AWS, Azure, and GCP)
Article ID: 262713
Products
Cloud Secure Web Gateway - Cloud SWG
Endpoint Security
Issue/Introduction
Broadcom has developed a cloud data forwarding solution to push events to your organization's existing buckets in Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). A future phase of this solution will support Kafka Topics as a delivery mechanism.
This solution will be implemented with multiple Symantec SaaS products to provide a consistent experience. Currently, Cloud Secure Web Gateway (Cloud SWG) has a preview of the Event Streaming feature, which leverages the cloud data forwarding solution. See the Cloud SWG documentation for more information:
To push events to your cloud storage buckets, you must provide the correct permissions. See the Resolution section in this article for configuration steps.
Environment
Supported Symantec cloud service (currently, Cloud SWG)
Existing cloud storage account in AWS, Azure, or GCP
Resolution
Configure accounts and credentials for the cloud storage provider and use the credentials to configure channels in your Symantec SaaS products.
AWS
Integration with AWS requires access keys for authentication.
When generating the access keys, follow the best practices for accessing AWS S3 buckets:
Do not use root user access keys.
Create a user with least privileges that allow the user to perform limited actions on AWS resources.
For the S3 bucket where events are to be uploaded, update the bucket policy as shown in the Sample S3 bucket permission policy JSON snippet below. In the JSON, only the s3:PutObject action is required for event streaming. The Principal defines the AWS identity (for example, the ARN of the user) and the Effect defines whether that identity is allowed or denied access.
To create a user and generate the keys in the AWS console:
From the menu at the top right, select Security credentials.
In the left pane, select Users.
Create a user with the minimal set of privileges that allows the user to perform required actions on AWS resources.
Click the Security credentials tab.
To generate access keys, click Create access key.
Copy the generated Access key ID and Secret access key and use them to configure your event channels for AWS S3 buckets.
In AWS > S3, click the Permissions tab. Edit the bucket policy section and add the policy JSON.
Sample S3 bucket permission policy JSON
{
  "Version": "2012-10-17",                              //Version is a restricted field; no update is needed here.
  "Statement": [
    {
      "Sid": "LimitedAccessS3",                         //Any unique ID can be provided.
      "Principal": {
        "AWS": [
          "arn:aws:iam::123456789011:user/dummy-value"  //ARN of the user created for S3 access
        ]
      },
      "Effect": "Allow",                                //Whether the above-mentioned user is allowed or denied
      "Action": [
        "s3:PutObject"                                  //List of actions to be provided here
      ],
      "Resource": "arn:aws:s3:::(bucket-name)/*"        //Replace (bucket-name) with the actual bucket name
    }
  ]
}
Azure
Integration with Azure requires access keys for authentication.
When generating the access key, use the built-in Storage Blob Data Contributor role for least privilege.
On the Properties tab, under the Security section, make sure that Storage account access key is enabled for the created account.
In the left pane, under Security + networking, select Access keys.
Copy the Storage account name and the generated Key to configure the channels for the Azure blob storage.
Enter the values that you copied into the respective fields of the Add Event Stream dialog in the cloud console:
Key: <storage_account_name>
Secret: <key>
Note: Use an Azure built-in role with least privileges, such as Storage Blob Data Contributor, to generate the access keys.
Select the Compression checkbox to send the event files compressed as GZIP. If you do not select this checkbox, the files are stored as JSON.
In the Query Filter section, search and filter the event type_ids in the Event Type ID list to include the corresponding events in the event stream. Note: If the event type_ids that you want to include are already selected and queried by other streams, you must enable the event stream that you are creating so that events continue to stream with no data loss.
To verify that the connection with the cloud storage account is established, click Test Connection.
Click Create. The created event stream is added to the Event Stream grid view.
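If you want to check the storage account name and key outside of the Test Connection button, a short upload test such as the following Python sketch can help. It uses the azure-storage-blob package, which is not part of the Cloud SWG procedure; the container and blob names are placeholders for an existing container in your storage account.

# Minimal sketch: verify that the storage account name and key can upload a blob.
# Requires the azure-storage-blob package; all values below are placeholders.
from azure.storage.blob import BlobServiceClient

account_name = "<storage_account_name>"   # entered in the Key field
account_key = "<key>"                     # entered in the Secret field

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)

# Upload a small test blob to an existing container.
container = service.get_container_client("<container-name>")
container.upload_blob("connectivity-check.json", b"{}", overwrite=True)
print("Blob upload succeeded")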
GCP
Integration with GCP requires a service account key for authentication.
When generating the service account key, follow least-privilege principles. In GCP, create a principal with the Storage Object Creator role, which grants permission to create objects in the bucket. To provide limited access only, grant the following permissions to the role:
Create a key with the JSON type. This step creates the private key for the service account.
Download the key JSON and use it to configure the event streaming channels for GCP.
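As an optional check that the downloaded key JSON can create objects in the target bucket, you can run a small upload test such as the following Python sketch. It uses the google-cloud-storage package, which is not part of the Cloud SWG procedure; the key file path and bucket name are placeholders.

# Minimal sketch: verify that the service account key can create objects in the bucket.
# Requires the google-cloud-storage package; all values below are placeholders.
from google.cloud import storage

# Authenticate with the downloaded service account key JSON.
client = storage.Client.from_service_account_json("<path-to-key>.json")

# bucket() does not call the API, so no bucket-level read permission is needed.
bucket = client.bucket("<bucket-name>")
blob = bucket.blob("connectivity-check/test.json")

# Requires only storage.objects.create, which Storage Object Creator grants.
blob.upload_from_string("{}", content_type="application/json")
print("Object upload succeeded")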