Provide permissions for log storage in cloud buckets (AWS, Azure, and GCP)

Article ID: 262713

Products

Cloud Secure Web Gateway - Cloud SWG Endpoint Security

Issue/Introduction

Broadcom has developed a cloud data forwarding solution to push events to your organization's existing buckets in Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). A future phase of this solution will support Kafka Topics as a delivery mechanism.

This solution will be implemented with multiple Symantec SaaS products to provide a consistent experience. Currently, Cloud Secure Web Gateway (Cloud SWG) has a preview of the Event Streaming feature, which leverages the cloud data forwarding solution. See the Cloud SWG documentation for more information:

https://techdocs.broadcom.com/us/en/symantec-security-software/web-and-network-security/cloud-swg/help/about_reporting_co/event-streaming.html

To push events to your cloud storage buckets, you must provide the correct permissions. See the Resolution section in this article for configuration steps.

Environment

  • Supported Symantec cloud service (currently, Cloud SWG)
  • Existing cloud storage account in AWS, Azure, or GCP

Resolution

Configure accounts and credentials for the cloud storage provider and use the credentials to configure channels in your Symantec SaaS products. 

AWS

Integration with AWS requires access keys for authentication. 

When generating the access keys, follow the best practices for accessing AWS S3 buckets:

  • Do not use root user access keys. 
  • Create a user with the least privileges needed to perform the required actions on AWS resources.
  • For the S3 bucket where events are to be uploaded, update the bucket policy as shown in the Sample S3 bucket permission policy JSON snippet below.
    In the JSON, s3:PutObject is the only action required for event streaming. The Principal identifies the AWS resource, such as the ARN of a user, and the Effect specifies whether that principal is allowed or denied.

To create a user and generate the keys in the AWS console:

  1. Log in to https://console.aws.amazon.com.
  2. From the menu at the top right, select Security credentials.
  3. In the left pane, select Users.
  4. Create a user with the minimal set of privileges that allows the user to perform the required actions on AWS resources.
  5. Click the Security credentials tab.
  6. To generate access keys, click Create access key.
  7. Copy the generated Access key ID and Secret access key, and use them to configure your event channels for AWS S3 buckets.
  8. In AWS > S3, click the Permissions tab. Edit the Bucket policy section and add the policy JSON shown below. A verification sketch follows the sample policy.

Sample S3 bucket permission policy JSON

{
  "Version": "2012-10-17",  // Restricted field; no update needed here.
  "Statement": [
    {
      "Sid": "LimitedAccessS3",  // Any unique ID can be provided.
      "Principal": {
        "AWS": [
          "arn:aws:iam::123456789011:user/dummy-value"  // ARN of the user created for S3 access
        ]
      },
      "Effect": "Allow",  // Whether the above user is allowed or denied
      "Action": [
        "s3:PutObject"  // List the required actions here
      ],
      "Resource": "arn:aws:s3:::(bucket-name)/*"  // Replace (bucket-name) with the actual bucket name
    }
  ]
}

Note: The // comments are explanatory only. Remove them before saving, because AWS requires the bucket policy to be valid JSON.
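
Before configuring the channel, you can confirm that the generated keys and bucket policy allow uploads by pushing a small test object with the AWS SDK. The following is a minimal sketch in Python using boto3; the access keys, bucket name, and object key are placeholders, not values from this article.

# Verify that the least-privilege user can upload to the events bucket.
import json

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",       # Access key ID from step 7 (placeholder)
    aws_secret_access_key="...",       # Secret access key from step 7 (placeholder)
)

try:
    # s3:PutObject is the only action the bucket policy must allow.
    s3.put_object(
        Bucket="bucket-name",          # Replace with the actual bucket name
        Key="connectivity-test/test.json",
        Body=json.dumps({"test": "event-streaming"}).encode("utf-8"),
    )
    print("PutObject succeeded; the policy allows event uploads.")
except ClientError as err:
    print(f"PutObject failed: {err}")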

 

Azure

Integration with Azure requires access keys for authentication. 

When generating the access key, use the built-in Storage Blob Data Contributor role for least privilege.

To generate the key in the Azure console:

  1. Log in to https://portal.azure.com/.
  2. From All Services, navigate to Storage Account.
  3. On the Properties tab, under the Security section, make sure that Storage account key access is enabled for the account.
  4. In the left pane, under Security + networking, select Access keys.
  5. Copy the Storage account name and the generated Key to configure the channels for the Azure blob storage. 
  6. Enter the values that you copied into the respective fields of the Add Event Stream dialog in the cloud console:
    • Key: <storage_account_name>
    • Secret: <key>

    Note: Use an Azure built-in role with least privileges, such as Storage Blob Data Contributor, to generate the access keys.

  7. Select the Compression checkbox to send the events file compressed as GZIP. If you do not select this checkbox, the file is stored as JSON.
  8. In the Query Filter section, search for and select the event type_ids in the Event Type ID list to include the corresponding events in the event stream.
    Note: If the event type_ids that you want to include are already selected and queried by other streams, you must enable the event stream that you are creating so that events continue to stream with no data loss.
  9. To verify that the connection with the cloud storage account is established, click Test Connection. You can also verify the credentials outside the console; see the sketch after these steps.
  10. Click Create. The created event stream is added to the Event Stream grid view.
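
Before you rely on Test Connection, you can confirm the storage account name and key independently by uploading a test blob with the Azure SDK. The following is a minimal sketch in Python using the azure-storage-blob package; the account name, key, and container name are placeholders, and the container must already exist.

# Verify that the account key grants write access to blob storage.
from azure.storage.blob import BlobServiceClient

account_name = "yourstorageaccount"    # Storage account name from step 5 (placeholder)
account_key = "..."                    # Key copied in step 5 (placeholder)

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,            # The account key authenticates the client
)

# Upload a small test blob to the container used by the event stream.
blob = service.get_blob_client(container="your-container", blob="connectivity-test.json")
blob.upload_blob(b'{"test": "event-streaming"}', overwrite=True)
print("Upload succeeded; the account key grants write access.")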

GCP

Integration with GCP requires a service account key for authentication. 

When generating the service account key, apply least-privilege principles. In GCP, create a principal with the Storage Object Creator role, which grants write access for bucket objects. To provide limited access only, grant just the following permissions to the role:

  • storage.objects.create
  • storage.objects.delete

To generate keys in the GCP console:

  1. Log in to https://console.cloud.google.com.
  2. Go to Service Accounts.
  3. For the service account, select the Keys tab.
  4. Click ADD KEY.
  5. Create a key with the JSON type. This step creates the private key for the service account.
  6. Download the key JSON and use it to configure the event streaming channels for GCP. A verification sketch follows these steps.
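
Before configuring the channel, you can confirm that the downloaded key works by creating a test object with the Google Cloud Storage client library. The following is a minimal sketch in Python using google-cloud-storage; the key file path and bucket name are placeholders.

# Verify that the service account can create objects in the bucket.
from google.cloud import storage

# Authenticate with the service account key JSON downloaded in step 6.
client = storage.Client.from_service_account_json("service-account-key.json")

bucket = client.bucket("bucket-name")  # Replace with the actual bucket name
blob = bucket.blob("connectivity-test/test.json")

# This call exercises storage.objects.create; overwriting an existing
# object additionally requires storage.objects.delete.
blob.upload_from_string('{"test": "event-streaming"}', content_type="application/json")
print("Upload succeeded; the service account can create objects.")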
