AWS DynamoDB
Logs
This service integration is specifically designed to work with the destination bucket to which the service writes its logs.
It is based on the service's naming convention and path structure.
If you're looking to ship the service's logs from a different bucket, please use the S3 Bucket shipping method instead.
Before you begin:
- If you plan on using an access key to authenticate your connection, you'll need to set the s3:ListBucket and s3:GetObject permissions for the required S3 bucket (see the policy sketch after this list).
- If you plan on using an IAM role to authenticate your connection, you can get the role policy by filling out the bucket information and clicking the "Get the role policy" button.
- File names must be in ascending alphanumeric order. This is important because the S3 fetcher's offset is determined by the name of the last file fetched. We recommend using standard AWS naming conventions to determine the file name ordering and to avoid log duplication.
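For reference, here is a minimal boto3 sketch of the access-key permissions described above. The IAM user name and policy name are placeholders, and the bucket ARN uses the example bucket from this page; adjust them to your own setup.

```python
import json
import boto3

# Minimal sketch: attach an inline policy granting s3:ListBucket and s3:GetObject
# on the CloudTrail bucket to the IAM user whose access key you plan to use.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::aws-cloudtrail-logs-486140753397-9f0d7dbd",
        },
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::aws-cloudtrail-logs-486140753397-9f0d7dbd/*",
        },
    ],
}

iam = boto3.client("iam")
iam.put_user_policy(
    UserName="logzio-s3-fetcher",        # placeholder IAM user
    PolicyName="logzio-cloudtrail-read", # placeholder policy name
    PolicyDocument=json.dumps(policy),
)
```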
Send your logs to an S3 bucket
Logz.io fetches your CloudTrail logs from an S3 bucket.
For help with setting up a new trail, see Overview for Creating a Trail from AWS.
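If you'd rather create the trail programmatically than through the console, a minimal boto3 sketch might look like this. The trail name is a placeholder, the bucket is the example used on this page, and the bucket must already have a policy that allows CloudTrail to write to it.

```python
import boto3

# Minimal sketch: create a multi-region trail that writes to an existing bucket,
# then start logging on it. Names are placeholders.
cloudtrail = boto3.client("cloudtrail")

cloudtrail.create_trail(
    Name="my-trail",                                           # placeholder trail name
    S3BucketName="aws-cloudtrail-logs-486140753397-9f0d7dbd",  # example bucket from this page
    IsMultiRegionTrail=True,
)
cloudtrail.start_logging(Name="my-trail")
```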
Verify bucket definition on AWS
Navigate to the location of your trail logs on AWS:
And verify that the bucket is defined under the CloudTrail path:
:::caution
Region data must be created under the CloudTrail path BEFORE the S3 bucket is defined on Logz.io. Otherwise, you won't be able to proceed with sending CloudTrail data to Logz.io.
:::
Next, note the bucket's name and the way the prefix is constructed, for example:
Bucket name: aws-cloudtrail-logs-486140753397-9f0d7dbd
Prefix name: AWSLogs/486140753397/CloudTrail/
You'll need these values when adding your S3 bucket information.
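If you prefer to look these values up programmatically, here is a hedged boto3 sketch. The trail name is a placeholder, and it assumes the trail was created without a custom S3 key prefix.

```python
import boto3

# Minimal sketch: read the trail's bucket name and build the default prefix.
# Assumes no custom S3 key prefix was set on the trail.
cloudtrail = boto3.client("cloudtrail")
sts = boto3.client("sts")

trail = cloudtrail.describe_trails(trailNameList=["my-trail"])["trailList"][0]
account_id = sts.get_caller_identity()["Account"]

bucket_name = trail["S3BucketName"]
prefix = f"AWSLogs/{account_id}/CloudTrail/"

print("Bucket name:", bucket_name)
print("Prefix name:", prefix)
```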
Add your S3 bucket information
To use the S3 fetcher, log into your Logz.io account, and go to the CloudTrail log shipping page.
- Click + Add a bucket
- Select your preferred method of authentication - an IAM role or access keys.
The configuration wizard will open.
- Provide the S3 bucket name
- Provide your Prefix. That is your CloudTrail path. See further details below.
- There is no Region selection box; Logz.io will pull data from all AWS regions for the specified bucket and account.
- Choose whether you want to include the source file path. This saves the path of the file as a field in your log.
- Save your information.
Getting the information from your CloudTrail AWS path
You may need to fill in two parameters when creating the bucket - {BUCKET_NAME} and {PREFIX}. You can find them in your CloudTrail AWS path. The AWS path structure for CloudTrail looks like the example below:
{BUCKET_NAME}/{PREFIX_IF_EXISTS}/cloudtrail/AWSLogs/{AWS_ACCOUNT_ID}/CloudTrail/
{BUCKET_NAME} is your S3 bucket name.
{PREFIX} is your CloudTrail path. The prefix is generated by default and represents the complete path inside the bucket up until the regions section. It should look like this:
AWSLogs/{AWS_ACCOUNT_ID}/CloudTrail/
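As a quick illustration (using the example bucket from this page and assuming no extra prefix inside the bucket), splitting the full path at the first slash gives you both wizard parameters:

```python
# Quick illustration only: derive the two wizard parameters from the full path.
# Uses the example bucket from this page and assumes no extra prefix in the bucket.
full_path = "aws-cloudtrail-logs-486140753397-9f0d7dbd/AWSLogs/486140753397/CloudTrail/"

bucket_name, _, prefix = full_path.partition("/")
print(bucket_name)  # aws-cloudtrail-logs-486140753397-9f0d7dbd
print(prefix)       # AWSLogs/486140753397/CloudTrail/
```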
Logz.io only fetches logs that are generated after the S3 bucket is configured. Logz.io cannot fetch past logs retroactively.
Check Logz.io for your logs
Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.
If you still don't see your logs, see log shipping troubleshooting.
Troubleshooting
Problem: Failed to save bucket configuration
The following error appears when you're trying to create a bucket:
AWS failed to create cloudtrail bucket. Exception AWS bucket is empty: 403.
Possible cause
The bucket's location is incorrect or might be missing the correct prefix.
Suggested remedy
- Head to the CloudTrail console on AWS and check the relevant trail:
- Verify that the location of the trail is correct:
And verify that the prefix contains all parts:
In this case, the cause of the error is that the location is empty or that the prefix is wrong.
The bucket should be aws-cloudtrail-logs-486140753397-9f0d7dbd, and the prefix should be AWSLogs/486140753397/CloudTrail/. You can click on the prefix to check whether it is empty.
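To run the same check from code rather than the console, a minimal boto3 sketch could look like this. The bucket and prefix are the examples from this page, and your credentials need s3:ListBucket on the bucket.

```python
import boto3

# Minimal sketch: check whether any log objects exist under the CloudTrail prefix.
s3 = boto3.client("s3")
resp = s3.list_objects_v2(
    Bucket="aws-cloudtrail-logs-486140753397-9f0d7dbd",  # example bucket from this page
    Prefix="AWSLogs/486140753397/CloudTrail/",           # example prefix from this page
    MaxKeys=1,
)

if resp.get("KeyCount", 0) == 0:
    print("The prefix is empty - no CloudTrail objects were found.")
else:
    print("Found at least one object:", resp["Contents"][0]["Key"])
```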
Once you fix these issues, you can return to Logz.io to create the CloudTrail bucket.
Metrics
For a much easier and more efficient way to collect and send metrics, consider using the Logz.io telemetry collector.
Deploy this integration to send your Amazon DynamoDB metrics to Logz.io.
This integration creates a Kinesis Data Firehose delivery stream that links to your Amazon DynamoDB metrics stream and then sends the metrics to your Logz.io account. It also creates a Lambda function that adds AWS namespaces to the metric stream, and a Lambda function that collects and ships the resources' tags.
Log in to your Logz.io account and navigate to the current instructions page inside the Logz.io app. Install the pre-built dashboard to enhance the observability of your metrics.
To view the metrics on the main dashboard, log in to your Logz.io Metrics account, and open the Logz.io Metrics tab.
Before you begin, you'll need:
- An active account with Logz.io
Configure AWS to forward metrics to Logz.io
Create Stack in the relevant region
To deploy this project, click the button that matches the region you wish to deploy your Stack to:
Specify stack details
Specify the stack details as per the table below, check the acknowledgement checkboxes, and select Create stack.
| Parameter | Description | Required/Default |
|---|---|---|
| logzioListener | The Logz.io listener URL for your region. (For more details, see the regions page.) For example: https://listener.logz.io:8053 | Required |
| logzioToken | Your Logz.io metrics shipping token. | Required |
| awsNamespaces | Comma-separated list of the AWS namespaces you want to monitor. See this list of namespaces. If you want to automatically add all namespaces, use the value all-namespaces. | Required |
| logzioDestination | Your Logz.io destination URL. | Required |
| httpEndpointDestinationIntervalInSeconds | The length of time, in seconds, that Kinesis Data Firehose buffers incoming data before delivering it to the destination. | 60 |
| httpEndpointDestinationSizeInMBs | The size of the buffer, in MBs, that Kinesis Data Firehose uses for incoming data before delivering it to the destination. | 5 |
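The documented path is the Create Stack button above, but if you script your deployments, a hedged boto3 sketch might look like the following. The TemplateURL, region, and parameter values are placeholders - substitute the template for your region and your own token and destination URL.

```python
import boto3

# Minimal sketch: create the stack with the parameters listed in the table above.
# TemplateURL is a placeholder, not the official Logz.io template URL.
cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.create_stack(
    StackName="logzio-metric-stream",
    TemplateURL="https://example.com/logzio-metric-stream-template.yaml",  # placeholder
    Parameters=[
        {"ParameterKey": "logzioListener", "ParameterValue": "https://listener.logz.io:8053"},
        {"ParameterKey": "logzioToken", "ParameterValue": "<<METRICS-SHIPPING-TOKEN>>"},
        {"ParameterKey": "awsNamespaces", "ParameterValue": "AWS/DynamoDB"},
        {"ParameterKey": "logzioDestination", "ParameterValue": "<<LOGZIO-DESTINATION-URL>>"},
    ],
    Capabilities=["CAPABILITY_NAMED_IAM"],  # the stack creates IAM roles and Lambda functions
)
```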
Check Logz.io for your metrics
Give your data some time to get from your system to ours, then log in to your Logz.io Metrics account, and open the Logz.io Metrics tab.