GCP Datastream
Google Datastream is a serverless and easy-to-use change data capture (CDC) and replication service.
Logs
This integration is based on logzio-google-pubsub.
Before you begin, you'll need:
- To be logged in to your GCP account.
Run Google Cloud Shell configuration
Click this link to clone the solution's repo and use it in your Google Cloud Shell.
If a pop-up window appears, check the **Trust repo** box and press **Confirm**.
Run setup script in Google Cloud Shell
Copy the following snippet and paste it into your Google Cloud Shell:

```shell
./run.sh --listener_url=<<LISTENER-HOST>> --token=<<LOG-SHIPPING-TOKEN>> --gcp_region=<<GCP-REGION>> --log_type=<<LOG-TYPE>> --function_name=<<FUNCTION-NAME>> --telemetry_list=<<TELEMETRY-LIST>>
```
When you run this script, you will be prompted to choose the project ID in which to run the integration.
Replace the variables as per the table below:
Parameter | Description
---|---
`<<LISTENER-HOST>>` | The listener URL specific to the region of your Logz.io account. You can look it up here.
`<<LOG-SHIPPING-TOKEN>>` | The log shipping token of the account you want to ship to.
`<<GCP-REGION>>` | The region in which to deploy the Cloud Function.
`<<LOG-TYPE>>` | The log type, used to classify logs for parsing. (Default: `gcp-pubsub`)
`<<FUNCTION-NAME>>` | The name to use for the Google Cloud Function. (Default: `logzioHandler`)
`<<TELEMETRY-LIST>>` | A comma-separated list of Google resource types whose logs will be sent (e.g. `pubsub_topic,pubsub_subscription`). A detailed list can be found here. To send logs from all services, use `all_services`.
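As a concrete illustration, the sketch below assembles the `run.sh` arguments for a Datastream logs setup and prints the resulting command rather than executing it. The region, log type, and function name are illustrative placeholder values, not values prescribed by this integration, and the token placeholder must be replaced with your own.

```shell
#!/bin/sh
# Sketch only: builds the run.sh command for the Datastream logs setup
# and prints it. Replace the placeholder values with your own before
# actually executing run.sh in Google Cloud Shell.
listener_host="listener.logz.io"   # region-specific listener URL (assumed)
token="<<LOG-SHIPPING-TOKEN>>"     # your Logz.io log shipping token
gcp_region="us-central1"           # illustrative region
log_type="gcp-datastream"          # illustrative log type
function_name="logzioHandler"      # the default function name
telemetry_list="datastream"        # must include datastream for this integration

cmd="./run.sh --listener_url=$listener_host --token=$token \
--gcp_region=$gcp_region --log_type=$log_type \
--function_name=$function_name --telemetry_list=$telemetry_list"

# Print the assembled command for review instead of running it.
echo "$cmd"
```

Printing the command first lets you verify the substituted values before handing real credentials to the script.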
Updating `telemetry_list` after creation
To update the resources monitored by the function, follow these steps:
- Go to the Log Router page.
- Choose the `logzioHandler-sink-logs-to-logzio` sink.
- Edit the sink.
- Update the query that filters for the resource types to monitor.
For this integration, the telemetry list needs to include `datastream`.
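For illustration, an updated sink filter that adds Datastream alongside existing resource types might look like the line below. The exact filter generated by the script, and the precise resource type names in your project, may differ, so treat this purely as a sketch:

```
resource.type=("datastream" OR "pubsub_topic" OR "pubsub_subscription")
```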
Check Logz.io for your logs
Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.
Metrics
This integration is based on logzio-google-metrics.
Before you begin, you'll need:
- To be logged in to your GCP account.
Run Google Cloud Shell configuration
Click this link to clone the solution's repo and use it in your Google Cloud Shell.
If a pop-up window appears, check the **Trust repo** checkbox and press **Confirm**.
Run setup script in Google Cloud Shell
Copy the following snippet and paste it into your Google Cloud Shell:

```shell
./run.sh --listener_url=<<LISTENER-HOST>> --token=<<PROMETHEUS-METRICS-SHIPPING-TOKEN>> --gcp_region=<<GCP-REGION>> --function_name=<<FUNCTION-NAME-PREFIX>> --telemetry_list=<<TELEMETRY-LIST>>
```
When you run this script, you will be prompted to choose the project ID in which to run the integration.
Replace the variables as per the table below:
Parameter | Description
---|---
`<<LISTENER-HOST>>` | The listener URL specific to the region of your Logz.io account. You can look it up here.
`<<PROMETHEUS-METRICS-SHIPPING-TOKEN>>` | The metrics shipping token of the account you want to ship to.
`<<GCP-REGION>>` | The region in which to deploy the Cloud Function.
`<<FUNCTION-NAME-PREFIX>>` | The prefix to use for the Google Cloud Function name. (Default: `metrics_gcp`)
`<<TELEMETRY-LIST>>` | A comma-separated list of Google metric types whose metrics will be sent (e.g. `cloudfunctions.googleapis.com`). A detailed list can be found here.
For this integration, the telemetry list needs to include `datastream.googleapis.com`.
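Mirroring the logs example, the sketch below assembles the `run.sh` arguments for the metrics setup and prints the resulting command rather than executing it. The region and function name prefix are illustrative placeholders, and the token placeholder must be replaced with your own.

```shell
#!/bin/sh
# Sketch only: builds the run.sh command for the Datastream metrics setup
# and prints it. Replace the placeholders before running run.sh for real.
listener_host="listener.logz.io"               # region-specific listener URL (assumed)
token="<<PROMETHEUS-METRICS-SHIPPING-TOKEN>>"  # your metrics shipping token
gcp_region="us-central1"                       # illustrative region
function_name="metrics_gcp"                    # the default name prefix
telemetry_list="datastream.googleapis.com"     # must include Datastream

cmd="./run.sh --listener_url=$listener_host --token=$token \
--gcp_region=$gcp_region --function_name=$function_name \
--telemetry_list=$telemetry_list"

# Print the assembled command for review instead of running it.
echo "$cmd"
```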
Check Logz.io for your metrics
Give your data some time to get from your system to ours, then log in to your Logz.io Metrics account, and open the Logz.io Metrics tab.