Confluent Cloud
Deploy this integration to set up and run the HTTP Sink connector in a Kafka cluster to send Confluent logs to Logz.io.
Before you begin, you'll need:
- Confluent CLI installed and logged into your Confluent account
- Access to a Kafka cluster
Using the Kafka cluster's management interface
Access connectors in your cluster
- Go to your Kafka cluster's management interface.
- Select Connectors from the menu.
Select HTTP Sink Connector
Find and select the HTTP Sink connector option.
Select topics to collect data from
Choose the Kafka topic(s) you want to collect data from.
Configure Kafka credentials
- Proceed to the Kafka credentials step.
- Select your preferred authentication method.
- Click Continue.
Set up authentication for Logz.io
In the HTTP URL field, enter:
`https://<<LOGZIO-LISTENER-HOST>>:8071/?token=<<LOGZIO-SHIPPING-TOKEN>>&type=<<YOUR-TYPE>>`
- Replace `<<LOGZIO-LISTENER-HOST>>` with the listener host for your region. The required port depends on whether HTTP or HTTPS is used: HTTP = 8070, HTTPS = 8071.
- Replace `<<LOGZIO-SHIPPING-TOKEN>>` with the token of the account you want to ship to.
- Replace `<<YOUR-TYPE>>` with the desired log type.
For Endpoint Authentication type, select None.
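Before launching the connector, you can verify the URL and token by posting a test log straight to the listener. This is a minimal sketch assuming the US-region host `listener.logz.io` and an example type of `confluent-test`; substitute your own region's host, token, and type:

```shell
# Send a one-off test event to the Logz.io listener over HTTPS (port 8071).
# listener.logz.io is the US-region host; adjust it for your region.
curl -X POST "https://listener.logz.io:8071/?token=<<LOGZIO-SHIPPING-TOKEN>>&type=confluent-test" \
  -H "Content-Type: application/json" \
  -d '{"message": "HTTP Sink connector test"}'
```

A `200 OK` response confirms the listener host and token are correct; the test event should then appear in OpenSearch Dashboards under the type you set.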
Configure the connector
- Choose JSON for Input Kafka record value format.
- Click Show advanced configurations.
- Set HTTP Request Method to POST.
- Select json for Request Body Format.
- Choose false for Batch json as array.
Size the connector
- Decide on the number of tasks for the connector.
- Click Continue.
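The selections from the last two steps correspond to properties in the connector's JSON configuration (the same format used in the CLI flow below). A sketch of the resulting fragment, with property names as documented for Confluent's HTTP Sink connector; the `tasks.max` value of `"1"` is only an example, size it to your throughput:

```json
{
  "input.data.format": "JSON",
  "request.method": "POST",
  "request.body.format": "json",
  "batch.json.as.array": "false",
  "tasks.max": "1"
}
```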
Review and launch
- Name your connector in the Connector name field.
- Click Continue to launch the connector.
Check Logz.io for your logs
Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.
If you still don't see your logs, see log shipping troubleshooting.
Removing the connector (if required)
- Navigate to the connector's interface.
- Go to the Settings tab.
- Choose Delete connector.
Using Confluent CLI
Log in to Confluent account
Run `confluent login` to log in to your Confluent account via the CLI.
Connect to Kafka cluster
Run `confluent kafka cluster use <<CLUSTER-ID>>` to connect to your desired Kafka cluster. Replace `<<CLUSTER-ID>>` with the ID of your Kafka cluster.
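A quick sketch of the login-and-select flow (the cluster ID `lkc-abc123` is a placeholder; `confluent kafka cluster list` shows the real IDs available to you):

```shell
# Log in and save credentials locally for reuse.
confluent login --save
# List accessible clusters to find the target cluster ID.
confluent kafka cluster list
# Make that cluster the default for subsequent commands.
confluent kafka cluster use lkc-abc123
```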
Create configuration file
Open your preferred text editor on your local machine.
Paste the following configuration:
```json
{
  "topics": "<<TOPICS>>",
  "input.data.format": "JSON",
  "connector.class": "HttpSink",
  "name": "<<NAME>>",
  "kafka.auth.mode": "KAFKA_API_KEY",
  "kafka.api.key": "<<KAFKA-API-KEY>>",
  "kafka.api.secret": "<<KAFKA-API-SECRET>>",
  "http.api.url": "https://<<LOGZIO-LISTENER-HOST>>:8071/?token=<<LOGZIO-SHIPPING-TOKEN>>&type=<<YOUR-TYPE>>",
  "request.method": "POST",
  ...
}
```
Replace the placeholders (`<<...>>`) with your specific values:
- `<<TOPICS>>`: Comma-delimited list of Kafka topics.
- `<<NAME>>`: Name for the connector.
- `<<KAFKA-API-KEY>>`: Your Kafka API key.
- `<<KAFKA-API-SECRET>>`: Kafka API secret matching the provided key.
- `<<YOUR-TYPE>>`: Desired log type.
- `<<LOGZIO-SHIPPING-TOKEN>>`: Token of the account you want to ship to.
- `<<LOGZIO-LISTENER-HOST>>`: Listener host for your region. The required port depends on whether HTTP or HTTPS is used: HTTP = 8070, HTTPS = 8071.
Save this file as `confluent-logzio.json` or a similar JSON file.
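Before deploying, it's worth checking that the finished file parses as valid JSON (a quick sketch assuming `jq` is installed; any JSON validator will do):

```shell
# Prints the config if it parses; fails with a parse error otherwise.
# The "..." placeholder above must be replaced with real properties first.
jq . confluent-logzio.json
```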
Deploy the connector
Run `confluent connect cluster create --config-file <<PATH-TO-YOUR-CONFIG-FILE>>`. Replace `<<PATH-TO-YOUR-CONFIG-FILE>>` with the path to your JSON file. Expect output similar to:
```
+------+-------------+
| ID   | lcc-p12345  |
| Name | logzio-sink |
+------+-------------+
```
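To confirm the connector provisioned and is running, you can inspect it from the CLI (`lcc-p12345` is the example ID from the output above; use the ID your create command returned):

```shell
# List connectors in the active cluster, with their current status.
confluent connect cluster list
# Show the configuration and status of one connector by ID.
confluent connect cluster describe lcc-p12345
```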
Check Logz.io for your logs
Give your logs some time to get from your system to ours, and then open OpenSearch Dashboards.
If you still don't see your logs, see log shipping troubleshooting.
Deleting the connector
Run `confluent connect cluster delete <<CONNECTOR-ID>>`. You'll be prompted to enter the connector name to confirm deletion.
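For example, with the connector from the deployment output above:

```shell
# The CLI asks you to type the connector name (logzio-sink here) to confirm.
confluent connect cluster delete lcc-p12345
```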