Export logs from DoubleCloud services
In addition to accessing service logs in the console, you can configure DoubleCloud to export them to external log management providers, such as Datadog or an S3-compatible storage solution.
With log export, you can:
- Centrally collect and process logs from DoubleCloud and other services you use.
- Integrate logs from DoubleCloud with the rest of your observability infrastructure.
- Provide standardized logs to your teams.
- Store logs for longer than the retention period specified in your cluster or transfer configuration.
Configure log export
- Go to the project's Log export page.
- Click Add destination.
- In the dialog window, enter a name for your log export.
- Select the destination and enter the configuration details:
Datadog
- API key: Datadog API key.
- Datadog site: URL of the Datadog site. Make sure to select the correct URL, as Datadog sites are independent and data isn't shared across them by default.
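If you manage your infrastructure with Terraform, these two fields map onto the datadog block of the doublecloud_log_export resource shown in full later on this page. A minimal sketch, with placeholder IDs and the EU Datadog site used as an example value:

# Sketch: Datadog destination declared with the DoubleCloud Terraform provider.
# Placeholders in capitals must be replaced with your own values.
resource "doublecloud_log_export" "datadog-example" {
  project_id = "DOUBLECLOUD_PROJECT_ID"
  name       = "Datadog log export"

  sources {
    id   = "CLICKHOUSE_CLUSTER_ID"
    type = "clickhouse"
  }

  datadog {
    api_key      = "DATADOG_API_KEY" # Maps to the API key field
    datadog_host = "datadoghq.eu"    # Maps to the Datadog site field; shown here for the EU site
  }
}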
Amazon S3 or S3-compatible endpoint
DoubleCloud can export logs to both Amazon S3 and other S3-compatible storage solutions.
- Bucket: Name of the S3 bucket to export logs to.
- Folder name: Folder where logs will be exported. Can include the date as a template variable in the Go date format, such as 2006/01/02/some_folder. For example, on 15 July 2025 that template resolves to 2025/07/15/some_folder.
- Access key ID.
- Secret access key.
- Region: Region where the bucket is located.
- Endpoint: Endpoint of the S3-compatible service. Leave blank if you're using AWS.
- Allow connections without SSL: Select if you're connecting to an S3-compatible service that doesn't use SSL/TLS.
- Skip verifying SSL certificate: Select if the bucket allows self-signed certificates.
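The Terraform provider can declare S3 destinations as well. The sketch below is illustrative only: the s3 block and its attribute names are assumptions that mirror the console fields above, so check the doublecloud_log_export resource documentation for the exact schema before using it.

# Illustrative sketch only: the "s3" block and attribute names below are assumptions
# mirroring the console fields; verify them against the provider's documentation.
resource "doublecloud_log_export" "s3-example" {
  project_id = "DOUBLECLOUD_PROJECT_ID"
  name       = "S3 log export"

  sources {
    id   = "CLICKHOUSE_CLUSTER_ID"
    type = "clickhouse"
  }

  s3 {
    bucket            = "my-log-bucket"
    folder            = "2006/01/02/doublecloud-logs" # Go date template, as described above
    access_key_id     = "ACCESS_KEY_ID"
    secret_access_key = "SECRET_ACCESS_KEY"
    region            = "eu-central-1"
    endpoint          = ""                            # Leave empty for AWS
  }
}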
Coralogix
- Domain: Coralogix domain. Make sure to specify the correct site.
- Token: Coralogix Send-Your-Data API key.
Datadog-compatible endpoint
- API key: API key of the Datadog-compatible service.
- Endpoint: Endpoint of the Datadog-compatible service.
- Click Save.
- In the new dialog window, select sources: the clusters and transfers whose logs you want to export.
  You can skip this step and select sources later.
Tip
If you need to export logs from one DoubleCloud project to several destinations, configure export to Datadog or S3 and then forward the logs to the other destinations from there.
You can create a log export using the DoubleCloud Terraform provider.
Tip
If you haven't used Terraform before, refer to Create DoubleCloud resources with Terraform for more detailed instructions.
Example provider and resource configuration:
# main.tf
terraform {
  required_providers {
    doublecloud = {
      source = "registry.terraform.io/doublecloud/doublecloud"
    }
  }
}

provider "doublecloud" {
  authorized_key = file("authorized_key.json")
}

resource "doublecloud_log_export" "example-log-export" {
  project_id = "DOUBLECLOUD_PROJECT_ID" # Replace with your project ID
  name       = "Example log export"

  sources {
    id   = "CLICKHOUSE_CLUSTER_ID" # Replace with your source cluster ID
    type = "clickhouse"
  }

  datadog {
    api_key      = "DATADOG_API_KEY" # Replace with your Datadog API key
    datadog_host = "datadoghq.com"
  }
}
To learn how to get the authorized_key.json file, refer to Create an API key.
You can find the DoubleCloud project ID on the project settings page.
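To keep secrets such as the Datadog API key out of version control, you can pass them to the configuration as Terraform input variables instead of hard-coding them. A minimal sketch based on the example above (the variable names are arbitrary):

# Declare the values as input variables; marking the API key as sensitive
# keeps it out of plan and apply output.
variable "project_id" {
  type = string
}

variable "datadog_api_key" {
  type      = string
  sensitive = true
}

# Reference the variables in the resource instead of hard-coding the values.
resource "doublecloud_log_export" "example-log-export" {
  project_id = var.project_id
  name       = "Example log export"

  sources {
    id   = "CLICKHOUSE_CLUSTER_ID" # Replace with your source cluster ID
    type = "clickhouse"
  }

  datadog {
    api_key      = var.datadog_api_key
    datadog_host = "datadoghq.com"
  }
}

Supply the values at run time, for example with terraform apply -var-file=secrets.tfvars or through environment variables such as TF_VAR_datadog_api_key.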
Edit log sources
To select what sources to export logs from, take the following steps:
- Go to the project's Log export page.
- Next to the destination name, click the edit icon.
- Select the clusters and transfers whose logs you want to export.
- Click Save.
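If the export is managed with Terraform, you can edit the sources by changing the sources blocks in the doublecloud_log_export resource and reapplying the configuration. A sketch, assuming the provider accepts one sources block per cluster or transfer:

# Sketch: adding a second source to a log export managed by Terraform.
# Assumes the "sources" block can be repeated, one block per source.
resource "doublecloud_log_export" "example-log-export" {
  project_id = "DOUBLECLOUD_PROJECT_ID"
  name       = "Example log export"

  sources {
    id   = "CLICKHOUSE_CLUSTER_ID"         # Existing source
    type = "clickhouse"
  }

  sources {
    id   = "ANOTHER_CLICKHOUSE_CLUSTER_ID" # Newly added source
    type = "clickhouse"
  }

  datadog {
    api_key      = "DATADOG_API_KEY"
    datadog_host = "datadoghq.com"
  }
}

Run terraform plan to review the change, then terraform apply to add the source.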
Remove a source
To remove a log source, complete the following steps:
- Go to the project's Log export page.
- Find the source you want to remove and click the delete icon next to it.
- In the dialog window, confirm removing the source by clicking Yes, remove.
Tip
If you want to add this source again at a later time, take the steps described in Edit log sources.
Remove a destination
To remove a destination, take the following steps:
- Go to the project's Log export page.
- Next to the destination name, click the delete icon.
- In the dialog window, confirm removing the destination by clicking Yes, remove.