
AWS Control Tower

note

This integration is currently released as a beta version.

AWS Control Tower is a service that gives you a top-level summary of the policies applied to your AWS environment. This integration ships logs from the S3 buckets that AWS Control Tower automatically creates in your AWS environment.

Deploy an S3 Hook Lambda function

Create Stack in the relevant region

note

The stacks must be deployed in the same region as the S3 buckets.

To deploy this project, click the button that matches the region you wish to deploy your Stack to:

| Region | Deployment |
| --- | --- |
| us-east-1 | Deploy to AWS |
| us-east-2 | Deploy to AWS |
| us-west-1 | Deploy to AWS |
| us-west-2 | Deploy to AWS |
| eu-central-1 | Deploy to AWS |
| eu-north-1 | Deploy to AWS |
| eu-west-1 | Deploy to AWS |
| eu-west-2 | Deploy to AWS |
| eu-west-3 | Deploy to AWS |
| sa-east-1 | Deploy to AWS |
| ap-northeast-1 | Deploy to AWS |
| ap-northeast-2 | Deploy to AWS |
| ap-northeast-3 | Deploy to AWS |
| ap-south-1 | Deploy to AWS |
| ap-southeast-1 | Deploy to AWS |
| ap-southeast-2 | Deploy to AWS |
| ca-central-1 | Deploy to AWS |

Specify stack details

Specify the stack details per the table below, select the acknowledgement checkboxes, and choose Create stack.

| Parameter | Description | Required/Default |
| --- | --- | --- |
| logzioListener | The Logz.io listener URL for your region. (For more details, see the regions page.) | Required |
| logzioToken | Your Logz.io log shipping token. | Required |
| logLevel | Log level for the Lambda function. Can be one of: debug, info, warn, error, fatal, panic. | Default: info |
| logType | The log type you'll use with this Lambda. This is shown in your logs under the type field in Kibana. Logz.io applies parsing based on the log type. | Default: s3_hook |
| includePathsRegexes | Comma-separated list of regexes that match the paths you'd like to pull logs from. This field is mutually exclusive with excludePathsRegexes. | - |
| excludePathsRegexes | Comma-separated list of regexes that match the paths you don't want to pull logs from. This field is mutually exclusive with includePathsRegexes. | - |
| pathToFields | Fields from the path to your logs directory that you want to add to the logs. For example, org-id/aws-type/account-id will add the fields org-id, aws-type, and account-id to logs fetched from the directory this path refers to. | - |
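
If you prefer to script stack creation instead of using the console, the same parameters can be passed through the AWS SDK. Below is a minimal sketch using boto3; the stack name and template URL are placeholders (use the template URL behind the Deploy to AWS button for your region), and the capabilities mirror the acknowledgement checkboxes:

```python
import boto3

# Placeholder -- substitute the template URL behind the
# "Deploy to AWS" button for your region.
TEMPLATE_URL = "https://<logzio-templates-bucket>.s3.amazonaws.com/s3-hook.yaml"

cf = boto3.client("cloudformation", region_name="us-east-1")

cf.create_stack(
    StackName="logzio-s3-hook",
    TemplateURL=TEMPLATE_URL,
    Parameters=[
        {"ParameterKey": "logzioListener", "ParameterValue": "https://listener.logz.io:8071"},
        {"ParameterKey": "logzioToken", "ParameterValue": "<LOG-SHIPPING-TOKEN>"},
        {"ParameterKey": "logLevel", "ParameterValue": "info"},
        {"ParameterKey": "logType", "ParameterValue": "s3_hook"},
    ],
    # These correspond to the acknowledgement checkboxes in the console.
    Capabilities=["CAPABILITY_NAMED_IAM", "CAPABILITY_AUTO_EXPAND"],
)
```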

Add trigger

Give the stack a few minutes to deploy.

Once your Lambda function is ready, you'll need to add a trigger manually. This is due to CloudFormation limitations.

Go to the function's page, and click on Add trigger.


Then, choose S3 as a trigger, and fill in:

  • Bucket: Your bucket name.
  • Event type: Choose All object create events.
  • Prefix and Suffix should be left empty.

Confirm the checkbox, and click Add.
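
If you'd rather script this step, the same trigger can be added with the AWS SDK. A sketch with boto3 follows; the bucket and function names are placeholders. Note that put_bucket_notification_configuration replaces the bucket's existing notification configuration, so merge with any existing configuration first:

```python
import boto3

# Placeholders -- substitute your own bucket and function.
BUCKET = "<your-control-tower-bucket>"
FUNCTION_ARN = "arn:aws:lambda:<region>:<account-id>:function:<s3-hook-function>"

aws_lambda = boto3.client("lambda")
s3 = boto3.client("s3")

# Allow S3 to invoke the function (the console does this for you).
aws_lambda.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="s3-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{BUCKET}",
)

# Notify the function on all object-create events,
# with no prefix/suffix filter.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": FUNCTION_ARN,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```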


Deploy the Control Tower stack

This stack creates a Lambda function, an EventBridge rule, and IAM roles to automatically add triggers to the S3 Hook Lambda function as Control Tower creates new buckets.
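
The actual rule is defined by the stack's template, but conceptually it matches S3 CreateBucket API calls recorded by CloudTrail and routes them to the trigger-adding Lambda. A hypothetical sketch of such a rule with boto3 (the rule name, pattern, and target ARN are illustrative only):

```python
import json
import boto3

events = boto3.client("events")

# Conceptual sketch: match CloudTrail-recorded CreateBucket calls.
events.put_rule(
    Name="logzio-add-s3-trigger",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {
            "eventSource": ["s3.amazonaws.com"],
            "eventName": ["CreateBucket"],
        },
    }),
)

# Route matched events to the Lambda that attaches the S3 trigger
# (function ARN is a placeholder).
events.put_targets(
    Rule="logzio-add-s3-trigger",
    Targets=[{
        "Id": "add-trigger-lambda",
        "Arn": "arn:aws:lambda:<region>:<account-id>:function:<trigger-adder>",
    }],
)
```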

note

The stacks must be deployed in the same region as the S3 buckets.

To deploy this project, click the button that matches the region you wish to deploy your Stack to:

| Region | Deployment |
| --- | --- |
| us-east-1 | Deploy to AWS |
| us-east-2 | Deploy to AWS |
| us-west-1 | Deploy to AWS |
| us-west-2 | Deploy to AWS |
| eu-central-1 | Deploy to AWS |
| eu-north-1 | Deploy to AWS |
| eu-west-1 | Deploy to AWS |
| eu-west-2 | Deploy to AWS |
| eu-west-3 | Deploy to AWS |
| sa-east-1 | Deploy to AWS |
| ap-northeast-1 | Deploy to AWS |
| ap-northeast-2 | Deploy to AWS |
| ap-northeast-3 | Deploy to AWS |
| ap-south-1 | Deploy to AWS |
| ap-southeast-1 | Deploy to AWS |
| ap-southeast-2 | Deploy to AWS |
| ca-central-1 | Deploy to AWS |

Specify stack details

Specify the stack details per the table below, select the acknowledgement checkboxes, and choose Create stack.

| Parameter | Description | Required/Default |
| --- | --- | --- |
| logLevel | Log level for the Lambda function. Can be one of: debug, info, warn, error, fatal, panic. | Default: info |
| s3HookArn | The ARN of your S3 Hook Lambda function. | Required |
note

It can take a few minutes after stack creation for the EventBridge rule to be triggered.

Important

If you want to delete the S3 Hook stack, you'll need to detach the LambdaAccessBuckets policy first.
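
For example, with boto3, assuming LambdaAccessBuckets is attached to the function's execution role (the role name and policy ARN below are placeholders; check the stack's resources for the actual values):

```python
import boto3

iam = boto3.client("iam")

# Placeholders -- look up the actual role name and policy ARN
# in the stack's resources before running this.
ROLE_NAME = "<s3-hook-lambda-role>"
POLICY_ARN = "arn:aws:iam::<account-id>:policy/LambdaAccessBuckets"

iam.detach_role_policy(RoleName=ROLE_NAME, PolicyArn=POLICY_ARN)
```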

Check Logz.io for your logs

Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.

If you still don't see your logs, see log shipping troubleshooting.
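
To rule out token or listener issues, one quick sanity check is to post a test log directly to the Logz.io listener. A sketch, assuming the standard Logz.io HTTP bulk endpoint on port 8071 (the token is a placeholder):

```python
import requests

# Standard Logz.io HTTP bulk listener; token is a placeholder.
url = "https://listener.logz.io:8071"
params = {"token": "<LOG-SHIPPING-TOKEN>", "type": "s3_hook"}

resp = requests.post(url, params=params, json={"message": "s3 hook test log"})
print(resp.status_code)  # 200 means the listener accepted the log
```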

Advanced settings

Automatic parsing

S3 Hook will automatically parse logs in the following cases:

  • The object's path contains the phrase cloudtrail (case insensitive).

Filtering files

To pull logs only from specific paths within the bucket, use the includePathsRegexes variable. To skip specific paths within the bucket, use the excludePathsRegexes variable instead. These variables are mutually exclusive.

Both variables take a comma-separated list of regular expressions, matching either the paths you want to pull logs from (includePathsRegexes) or the paths to exclude (excludePathsRegexes).

Important

Each new object added to your bucket triggers your Lambda function. However, if the object's key fails to match the regexes, the function exits without sending the logs.
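
A minimal sketch of this matching logic, assuming the function receives the S3 object key (the function and parameter names here are illustrative, not the integration's actual internals):

```python
import re

def should_process(key: str,
                   include_regexes: str = "",
                   exclude_regexes: str = "") -> bool:
    """Decide whether to pull logs from an object key."""
    if include_regexes:
        # Process only keys matching at least one include regex.
        patterns = include_regexes.split(",")
        return any(re.search(p, key) for p in patterns)
    if exclude_regexes:
        # Process keys that match none of the exclude regexes.
        patterns = exclude_regexes.split(",")
        return not any(re.search(p, key) for p in patterns)
    return True  # no filtering configured

# Example: only pull logs from CloudTrail paths.
print(should_process("AWSLogs/123/CloudTrail/file.log",
                     include_regexes=r"CloudTrail"))  # True
```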

Adding object path as logs field

If you want to add your objects' path segments as extra fields on your logs, use pathToFields.

For example, suppose your objects are under the path oi-3rfEFA4/AWSLogs/2378194514/file.log, where oi-3rfEFA4 is the org ID, AWSLogs is the AWS type, and 2378194514 is the account ID.

Setting pathToFields to org-id/aws-type/account-id will add the following fields to the logs: org-id: oi-3rfEFA4, aws-type: AWSLogs, account-id: 2378194514.
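
A minimal sketch of how path segments might map to fields, using the example above (the function name is illustrative; the mismatch behavior follows the note below):

```python
def fields_from_path(path_to_fields: str, object_key: str) -> dict:
    """Map each subfolder in the object key to a named field."""
    field_names = path_to_fields.split("/")
    # Directory segments only -- drop the file name.
    segments = object_key.split("/")[:-1]
    if len(field_names) != len(segments):
        # Mismatched depth: logs are sent without extra fields
        # (see the note below).
        return {}
    return dict(zip(field_names, segments))

print(fields_from_path("org-id/aws-type/account-id",
                       "oi-3rfEFA4/AWSLogs/2378194514/file.log"))
# {'org-id': 'oi-3rfEFA4', 'aws-type': 'AWSLogs', 'account-id': '2378194514'}
```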

note

If you use pathToFields, you need to add a value for each subfolder in the path. Otherwise there will be a mismatch and the logs will be sent without fields.

note

This will override a field with the same key, if it exists.

note

In order for the feature to work, you need to set pathToFields from the root of the bucket.