
GCP Operations Suite (Stackdriver)

Logs

Default integration

note

This integration is based on logzio-google-pubsub.

Before you begin, you'll need:

  • To be logged in to your GCP account.

Run Google Cloud Shell configuration

Use the following link to clone the solution's repo and open it in your Google Cloud Shell:

https://ssh.cloud.google.com/cloudshell/editor?cloudshell_git_repo=https://github.com/logzio/logzio-google-pubsub

You may encounter a pop-up window. Check the Trust repo checkbox and press Confirm.

Run setup script in Google Cloud Shell

Copy the following snippet and paste in your Google Cloud Shell:

./run.sh --listener_url=<<LISTENER-HOST>> --token=<<LOG-SHIPPING-TOKEN>> --gcp_region=<<GCP-REGION>> --log_type=<<LOG-TYPE>> --function_name=<<FUNCTION-NAME>> --telemetry_list=<<TELEMETRY-LIST>>

When you run this script, you'll be prompted to choose the project ID in which to run the integration.

Replace the variables as per the table below:

| Parameter | Description |
|---|---|
| <<LISTENER-HOST>> | Use the listener URL specific to the region of your Logz.io account. You can look it up here. |
| <<LOG-SHIPPING-TOKEN>> | The log shipping token of the account you want to ship to. |
| <<GCP-REGION>> | The region where you want to deploy the Cloud Function. Required for the Deploy to Cloud option. |
| <<LOG-TYPE>> | The log type, which helps classify your logs. (Default: gcp-pubsub) |
| <<FUNCTION-NAME>> | The name to use for the Google Cloud Function. (Default: logzioHandler) |
| <<TELEMETRY-LIST>> | Sends logs that match the listed Google resource types (ex: pubsub_topic,pubsub_subscription). A detailed list is available here. To ship logs from all services, use all_services. |

For this integration, the telemetry list needs to include gce_operation.
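As a concrete illustration, a filled-in command might look like the sketch below. All values are hypothetical placeholders, not real credentials, and the script echoes the command instead of executing it, since run.sh requires an authenticated Google Cloud Shell session.

```shell
# Sketch of a filled-in invocation. All values below are hypothetical;
# replace them with your own before running the real ./run.sh.
listener_url="listener.logz.io"
token="YOUR-LOG-SHIPPING-TOKEN"
gcp_region="us-central1"

cmd="./run.sh --listener_url=${listener_url} --token=${token} --gcp_region=${gcp_region} --log_type=gcp-pubsub --function_name=logzioHandler --telemetry_list=gce_operation"

# Echo the command instead of executing it.
echo "$cmd"
```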

Check Logz.io for your logs

Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.

Integration via Google Cloud Pub/Sub

Google Cloud Platform (GCP) Stackdriver collects logs from your cloud services. You can use Google Cloud Pub/Sub to forward your logs from Stackdriver to Logz.io using a continuously running Docker container.

Before you begin, you'll need: Docker installed.

Export your logs to Stackdriver

Set up a sink to export your logs to Stackdriver.

For more information, see Exporting with the Logs Viewer from Google Cloud.

Build your credentials file

Create a working directory for this step and cd into it. You'll need to run this command as root:

mkdir /etc/logzio-pubsub && cd /etc/logzio-pubsub

Next, you'll need to build a credentials file so Pub/Sub can authenticate and get the right permissions.

You can build it through:

  • The command line
  • The Cloud console

Option 1: Build the credentials file from the command line

In this step, you'll build your credentials file using your Google Cloud project ID.

Before you begin, you'll need the gcloud command-line tool (CLI) installed. If it isn't, follow the steps to install it:

  1. Download the google-cloud-sdk to /etc/logzio-pubsub.
  2. Run source /etc/logzio-pubsub/google-cloud-sdk/path.bash.inc. If you're not already logged in to gcloud, you will be prompted to log in through your browser.

Run the following command for each project you're working with. Replace the placeholder with your project ID before running the command:

wget https://raw.githubusercontent.com/logzio/logzio-pubsub/master/create-credentials.py \
&& python create-credentials.py <<project_id>>

If you rename the file, also follow the steps under If you've renamed the credentials file below.
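If you work with several projects, the per-project command can be scripted in a loop. The project IDs below are hypothetical, and the loop echoes each command instead of running it, since the real call needs the downloaded create-credentials.py and gcloud access.

```shell
# Hypothetical project IDs; replace with your own.
projects="my-project-1 my-project-2"

generated=""
for project_id in $projects; do
  cmd="python create-credentials.py ${project_id}"
  echo "$cmd"                 # would produce ${project_id}-credentials.json
  generated="$generated ${project_id}-credentials.json"
done
echo "Expected files:$generated"
```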

Option 2: Build the credentials file in the Cloud Console

  • In the GCP Console, go to your project's page. In the left menu, select IAM & admin > Service accounts.

  • At the top of the Service accounts page, click + CREATE SERVICE ACCOUNT.

  • Give a descriptive Service account name, such as "credentials file". Click CREATE to continue to the Service account permissions page.

  • Add the role: 'Pub/Sub Editor'.

  • Click CONTINUE to continue to the Grant users access to this service account page. Click ADD KEY > CREATE NEW KEY to open the Create key panel. Select JSON and click CREATE to save the private key to your machine.

  • Click DONE to return to the Service accounts page.

  • Rename the key file to the format <project-id>-credentials.json, replacing <project-id> with your project ID. Move it to the /etc/logzio-pubsub folder you created at the beginning of this step.
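The rename-and-move step can be sketched as shell commands. The downloaded file name and project ID below are hypothetical, and a stand-in file is used instead of a real key; substitute your actual paths and use /etc/logzio-pubsub as the destination in the real setup.

```shell
# Work in a temporary directory so no real key is touched.
mkdir -p /tmp/logzio-credentials-example
cd /tmp/logzio-credentials-example

# Stand-in for the JSON key that the Cloud Console downloaded.
echo '{}' > downloaded-key.json

# Rename to the <project-id>-credentials.json format (project ID is hypothetical).
mv downloaded-key.json my-project-credentials.json
ls -l my-project-credentials.json
```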

Variation

  • If your credentials file name doesn't follow the default format <<project_id>>-credentials.json, also follow the steps under If you've renamed the credentials file below.

Build your Pub/Sub input YAML file

Create a file named pubsub-input.yml to hold your Pub/Sub input configuration. Run the following command as root to create the file, then open it in your text editor:

touch /etc/logzio-pubsub/pubsub-input.yml

Paste this code block into your file. Complete configuration instructions are below the code block. 👇

listener: <<LISTENER-HOST>>
pubsubs:
- project_id: PROJECT-1_ID
  topic_id: TOPIC-1_ID
  token: <<LOG-SHIPPING-TOKEN>>
  credentials_file: ./credentials-file.json
  subscriptions: [SUB1_ID, SUB2_ID, SUB3_ID]
  type: stackdriver

- project_id: PROJECT-1_ID
  topic_id: TOPIC-2_ID
  token: <<LOG-SHIPPING-TOKEN>>
  credentials_file: ./credentials-file.json
  subscriptions: [SUB1_ID, SUB2_ID, SUB3_ID]
  type: stackdriver

- project_id: PROJECT-3_ID
  topic_id: TOPIC-1_ID
  token: <<LOG-SHIPPING-TOKEN>>
  credentials_file: ./credentials-file.json
  subscriptions: [SUB1_ID, SUB2_ID, SUB3_ID]
  type: stackdriver

Note that YAML files are sensitive to spaces and tabs. We recommend using a YAML validator to make sure that the file structure is correct.

For more information about Filebeat for Google Cloud Pub/Sub, click here.

Configuration instructions

| Parameter | Description |
|---|---|
| listener | The Logz.io listener host for your region. |
| pubsubs | An array of one or more GCP subscriptions. For each subscription, provide the topic and subscription IDs as given by Pub/Sub. |
| token | Your Logz.io log shipping token, set per project under pubsubs. Replace <<LOG-SHIPPING-TOKEN>> with the token of the account you want to ship to. To ship to different accounts in the same region, use a different token per project. |
| credentials_file | (Not required. Default: <project_id>-credentials.json) Required only if your credentials file is named differently from the default. For an example of adding this field, see the input example file. |
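For instance, a pubsubs entry with a non-default credentials file name might look like the sketch below; the file name my-renamed-credentials.json is hypothetical.

```yaml
listener: <<LISTENER-HOST>>
pubsubs:
- project_id: PROJECT-1_ID
  topic_id: TOPIC-1_ID
  token: <<LOG-SHIPPING-TOKEN>>
  # Required here because the file name differs from the
  # default <project_id>-credentials.json (name is hypothetical).
  credentials_file: ./my-renamed-credentials.json
  subscriptions: [SUB1_ID, SUB2_ID, SUB3_ID]
  type: stackdriver
```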

Pull the Docker image

Download the logzio/logzio-pubsub image:

docker pull logzio/logzio-pubsub

Run the container

Run the following command after replacing <<PROJECT_ID>> with your project ID.

docker run --name logzio-pubsub \
-v /etc/logzio-pubsub/pubsub-input.yml:/logzio-pubsub/pubsub-input.yml \
-v /etc/logzio-pubsub/<<PROJECT_ID>>-credentials.json:/logzio-pubsub/<<PROJECT_ID>>-credentials.json \
logzio/logzio-pubsub

Variations

  • If you're working with multiple topics, add this line for every credentials file you've created, replacing <<PROJECT_ID>> with your project ID:

    -v /etc/logzio-pubsub/<<PROJECT_ID>>-credentials.json:/logzio-pubsub/<<PROJECT_ID>>-credentials-file.json \
  • If your credentials file name doesn't follow the default format <<project_id>>-credentials.json, also follow the steps under If you've renamed the credentials file below.

  • If you're using a Mac, you'll need to allow Docker to mount files from the root directory. Add the path /etc/logzio-pubsub to your Docker File Sharing settings. Click here for a guide on how to fix this issue - you can use Docker Desktop or manually edit your Docker configuration file. For more information about mounting files from the root directory, click here.
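The multi-topic variation can be sketched as follows. The project IDs are hypothetical, and the docker run command is assembled and echoed rather than executed, since it needs Docker and real credentials files.

```shell
# Hypothetical project IDs; replace with your own.
projects="project-a project-b"

# Start from the base command and add one mount per credentials file.
cmd="docker run --name logzio-pubsub -v /etc/logzio-pubsub/pubsub-input.yml:/logzio-pubsub/pubsub-input.yml"
for p in $projects; do
  cmd="$cmd -v /etc/logzio-pubsub/${p}-credentials.json:/logzio-pubsub/${p}-credentials.json"
done
cmd="$cmd logzio/logzio-pubsub"

# Echo instead of executing.
echo "$cmd"
```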

Check Logz.io for your logs

Spin up your Docker containers if you haven't done so already. Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.

If you've renamed the credentials file

The default naming convention for the credentials file is: <<project_id>>-credentials.json.

When you create the credentials file through the command line, it is automatically named as per the default.

If you create the credentials file using the GCP Console, you'll have the option to select the file name. We strongly recommend that you stick to the default format: <<project_id>>-credentials.json.

If you decide to give the credentials file another name, please follow these instructions:

  1. On step 3 - building your 'pubsub-input.yml' file, add the field 'credentials_file' with your credentials file's name as the value.

    Go to the github project to see an example of an input file.

  2. On step 5 - running the Docker container, add the following line for every credentials file you've created:

    -v /etc/logzio-pubsub/<<credentials-file-name>>.json:/logzio-pubsub/<<credentials-file-name>>.json \

    Replace <<credentials-file-name>> with your credentials file's name.

Integration via Filebeat

Google Workspace is a collection of cloud computing, productivity and collaboration tools, software and products developed and marketed by Google. You can ship Google Workspace logs to Logz.io using Filebeat and Google Reports API.

Before you begin, you'll need: Filebeat installed.

note

The GSuite module was deprecated as of Filebeat 7.12 and has been replaced with the Google Workspace module, to align with Google's current naming. The integration remains the same, requiring only that you replace "- module: gsuite" with "- module: google_workspace" in the modules block.
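Since only the module name changes, the swap can be sketched with sed. The example below works on a stand-in file under /tmp rather than your real /etc/filebeat/filebeat.yml.

```shell
# Create a stand-in config containing the deprecated module name.
printf 'filebeat.modules:\n- module: gsuite\n' > /tmp/filebeat-module-example.yml

# Swap the deprecated gsuite module for google_workspace in place.
sed -i 's/- module: gsuite/- module: google_workspace/' /tmp/filebeat-module-example.yml

cat /tmp/filebeat-module-example.yml
```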

Google Workspace setup

Set up a Service Account

Follow the official Google Workspace tutorial for setting up a service account through IAM.

Grant access to the Admin SDK API

Enable access to the following APIs and services. If you can't find an API, enter its name in the APIs & Services > Library search box.

  • Admin SDK
  • People API (If you're using a Google Workspace Migrate version earlier than 2.4.2.0, use the Contacts API instead.)
  • Google Workspace Migrate API
  • Gmail API
  • Google Calendar API
  • Google Drive API
  • Groups Migration API
  • Groups Settings API
  • Google Sheets API
  • Tasks API

Delegate domain-wide authority to your service account

Open your Google Workspace domain’s Admin console. Next, navigate to Main menu > Security > API controls.

In the Domain-wide delegation pane, select Manage Domain Wide Delegation.

note

If you can't find the Manage Domain Wide Delegation option, you will need to switch to a super-admin Google Workspace account.

Once you access the Manage Domain Wide Delegation page, click Add new and fill in the details:

  • Client ID - Enter the service account's Client ID. You can find it in the service account's details under Unique ID, and in the client_id field of the credentials file that was auto-downloaded when you created a new key for your service account.
  • OAuth Scopes - Enter the OAuth scopes required by the Admin SDK API.
  • Click Authorize to confirm your changes.

Filebeat monitoring setup

Download the Logz.io public certificate to your credentials server

For HTTPS shipping, download the Logz.io public certificate to your certificate authority folder.

sudo curl https://raw.githubusercontent.com/logzio/public-certificates/master/AAACertificateServices.crt --create-dirs -o /etc/pki/tls/certs/COMODORSADomainValidationSecureServerCA.crt
Configure Filebeat

Open the Filebeat configuration file (the default path /etc/filebeat/filebeat.yml) with your preferred text editor. Copy and paste the code block below, overwriting the previous contents.

note

Filebeat requires a file extension specified for the log input.

### Filebeat

### General
fields:
  logzio_codec: json
  # Replace <<LOG-SHIPPING-TOKEN>> with the token of the account you want to ship to.
  token: <<LOG-SHIPPING-TOKEN>>
  type: google_workspace
fields_under_root: true
encoding: utf-8
ignore_older: 3h

### Modules
filebeat.modules:
- module: google_workspace
  # Replace <<PATH_TO_CREDENTIALS_FILE>> with the path to the file. See examples below.
  # Replace <<DELEGATED_ACCOUNT_EMAIL>> with the email address of the admin (or super admin) that authorized the domain-wide delegation function.
  saml:
    enabled: true
    var.jwt_file: "<<PATH_TO_CREDENTIALS_FILE>>"
    var.delegated_account: "<<DELEGATED_ACCOUNT_EMAIL>>"
  user_accounts:
    enabled: true
    var.jwt_file: "<<PATH_TO_CREDENTIALS_FILE>>"
    var.delegated_account: "<<DELEGATED_ACCOUNT_EMAIL>>"
  login:
    enabled: true
    var.jwt_file: "<<PATH_TO_CREDENTIALS_FILE>>"
    var.delegated_account: "<<DELEGATED_ACCOUNT_EMAIL>>"
  admin:
    enabled: true
    var.jwt_file: "<<PATH_TO_CREDENTIALS_FILE>>"
    var.delegated_account: "<<DELEGATED_ACCOUNT_EMAIL>>"
  drive:
    enabled: true
    var.jwt_file: "<<PATH_TO_CREDENTIALS_FILE>>"
    var.delegated_account: "<<DELEGATED_ACCOUNT_EMAIL>>"
  groups:
    enabled: true
    var.jwt_file: "<<PATH_TO_CREDENTIALS_FILE>>"
    var.delegated_account: "<<DELEGATED_ACCOUNT_EMAIL>>"

### Input

### Registry
filebeat.registry.path: /var/lib/filebeat

### Processors
# The following processors ensure compatibility with version 7
processors:
- if:
    has_fields: ['gsuite']
  then:
  - rename:
      fields:
      - from: "source"
        to: "gsuite_source"
  - rename:
      fields:
      - from: "agent"
        to: "filebeat_agent"
      ignore_missing: true
  - rename:
      fields:
      - from: "log.file.path"
        to: "source"
      ignore_missing: true
- add_id: ~

### Output
output.logstash:
  # Replace <<LISTENER-HOST>> with the listener host for your region. For example,
  # listener.logz.io if your account is hosted on AWS US East, or listener-nl.logz.io
  # if hosted on Azure West Europe.
  hosts: ["<<LISTENER-HOST>>:5015"]
  ssl:
    certificate_authorities: ['/etc/pki/tls/certs/COMODORSADomainValidationSecureServerCA.crt']

For a full list of available Filebeat configuration options for the Google Workspace module, please see Filebeat's documentation.

Still in the same configuration file, replace the placeholders to match your specifics.


Replace <<LOG-SHIPPING-TOKEN>> with the token of the account you want to ship to.

Replace <<LISTENER-HOST>> with the listener host for your region. For example, listener.logz.io if your account is hosted on AWS US East, or listener-nl.logz.io if hosted on Azure West Europe.

  • Replace <<PATH_TO_CREDENTIALS_FILE>> with the path to the credentials file of the service account you created on GCP (for example, ./credentials_file.json). It is preferable to use the full path to the file.

  • Replace <<DELEGATED_ACCOUNT_EMAIL>> with the email address of the admin (in most cases a super admin) that authorized the domain-wide delegation function to the service account on the Google Workspace account.
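The placeholder replacement can be scripted. The sketch below works on a minimal stand-in config under /tmp and uses hypothetical values; point it at your real filebeat.yml and substitute your own token and listener host.

```shell
# Minimal stand-in for the relevant parts of filebeat.yml.
cat > /tmp/filebeat-example.yml <<'EOF'
fields:
  logzio_codec: json
  token: <<LOG-SHIPPING-TOKEN>>
output.logstash:
  hosts: ["<<LISTENER-HOST>>:5015"]
EOF

# Hypothetical values; replace with your own.
sed -i \
  -e 's|<<LOG-SHIPPING-TOKEN>>|YOUR-TOKEN|' \
  -e 's|<<LISTENER-HOST>>|listener.logz.io|' \
  /tmp/filebeat-example.yml

cat /tmp/filebeat-example.yml
```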

Start Filebeat

Start or restart Filebeat for the changes to take effect.

Check Logz.io for your logs

Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.

If you still don't see your logs, see Filebeat troubleshooting.

Metrics

note

This integration is based on logzio-google-metrics.

Before you begin, you'll need:

  • To be logged in to your GCP account.

Run Google Cloud Shell configuration

Use the following link to clone the solution's repo and open it in your Google Cloud Shell:

https://ssh.cloud.google.com/cloudshell/editor?cloudshell_git_repo=https://github.com/logzio/logzio-google-metrics

You may encounter a pop-up window. Check the Trust repo checkbox and press Confirm.

Run setup script in Google Cloud Shell

Copy the following snippet and paste in your Google Cloud Shell:

./run.sh --listener_url=<<LISTENER-HOST>> --token=<<PROMETHEUS-METRICS-SHIPPING-TOKEN>> --gcp_region=<<GCP-REGION>> --function_name=<<FUNCTION-NAME-PREFIX>> --telemetry_list=<<TELEMETRY-LIST>>

When you run this script, you'll be prompted to choose the project ID in which to run the integration.

Replace the variables as per the table below:

| Parameter | Description |
|---|---|
| <<LISTENER-HOST>> | Use the listener URL specific to the region of your Logz.io account. You can look it up here. |
| <<PROMETHEUS-METRICS-SHIPPING-TOKEN>> | The metrics shipping token of the account you want to ship to. |
| <<GCP-REGION>> | The region where you want to deploy the Cloud Function. Required for the Deploy to Cloud option. |
| <<FUNCTION-NAME-PREFIX>> | The name prefix to use for the Google Cloud Function. (Default: metrics_gcp) |
| <<TELEMETRY-LIST>> | Sends metrics that match the listed Google metric types (ex: cloudfunctions.googleapis.com). A detailed list is available here. |
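As with the logs setup, a filled-in command might look like the sketch below. All values are hypothetical, and the script echoes the command instead of executing it, since run.sh requires an authenticated Google Cloud Shell session.

```shell
# Sketch of a filled-in invocation. All values below are hypothetical;
# replace them with your own before running the real ./run.sh.
listener_url="listener.logz.io"
token="YOUR-PROMETHEUS-METRICS-SHIPPING-TOKEN"
gcp_region="us-central1"

metrics_cmd="./run.sh --listener_url=${listener_url} --token=${token} --gcp_region=${gcp_region} --function_name=metrics_gcp --telemetry_list=cloudfunctions.googleapis.com"

# Echo the command instead of executing it.
echo "$metrics_cmd"
```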

Check Logz.io for your metrics

Give your data some time to get from your system to ours, then log in to your Logz.io Metrics account, and open the Logz.io Metrics tab.