Manage source endpoints in Transfer

Create a source endpoint

Note

Transfer uses custom-built connectors for the following sources: ClickHouse®, PostgreSQL, MySQL, MongoDB, and Amazon S3.

Other connectors are based on Airbyte.

  1. Go to the Transfer service page.

  2. Select the Endpoints tab, click Create endpoint and select Source.

  3. Select the Source type from which you want to transfer the data.

  4. Under Basic settings:

    • Enter the Name of the endpoint.

    • (optional) Enter the Description of the endpoint.

  5. Specify endpoint parameters under Endpoint settings.

  6. Click Submit.

You can create a source endpoint using the Transfer endpoint resource of the DoubleCloud Terraform provider.

Example provider and resource configuration:

# main.tf

terraform {
  required_providers {
    doublecloud = {
      source    = "registry.terraform.io/doublecloud/doublecloud"
    }
  }
}

provider "doublecloud" {
  authorized_key = file("authorized_key.json")
}

resource "doublecloud_transfer_endpoint" "example-source-endpoint" {
  name       = "example-source-endpoint"
  project_id = DOUBLECLOUD_PROJECT_ID     # Replace with your project ID
  settings {
    postgres_source {
      connection {
        on_premise {
          hosts = [
            "mypostgresql.host"
          ]
          port = 5432
        }
      }
      database = "postgres"
      user     = "postgres"
      password = POSTGRESQL_PASSWORD     # Replace with the PostgreSQL password
    }
  }
}

To learn how to get the authorized_key.json file, refer to Create an API key. You can find the DoubleCloud project ID on the project settings page.

Tip

This example shows a PostgreSQL source endpoint configuration. For the configuration parameters of other source types, refer to the Transfer endpoint resource schema.

To create a source endpoint, use the EndpointService create method and pass the following parameters:

  • project_id - the ID of the project in which you want to create your endpoint. You can get this value on your project's information page.

  • name - the new endpoint name. Must be unique within the project.

  • settings - the endpoint settings for your source type (the setting whose name ends in _source), as reflected in the doublecloud.transfer.v1.EndpointSettings model.

This example shows how to create an Amazon S3 source endpoint.

import doublecloud

# EndpointSettings wraps the source-specific settings (s3_source here).
from doublecloud.transfer.v1.endpoint_pb2 import EndpointSettings
from doublecloud.transfer.v1.endpoint.airbyte.s3_source_pb2 import S3Source
from doublecloud.transfer.v1.endpoint_service_pb2 import CreateEndpointRequest
from doublecloud.transfer.v1.endpoint_service_pb2_grpc import EndpointServiceStub


def create_s3_src_endpoint(sdk, project_id, name):
    """Create an Airbyte-based Amazon S3 source endpoint and return the operation."""
    svc = sdk.client(EndpointServiceStub)
    operation = svc.Create(
        CreateEndpointRequest(
            project_id=project_id,
            name=f"s3-src-{name}",
            settings=EndpointSettings(
                s3_source=S3Source(
                    dataset="test",
                    path_pattern="test",
                    schema="test",
                    format=S3Source.Format(csv=S3Source.Csv()),
                    provider=S3Source.Provider(bucket="test"),
                )
            ),
        )
    )
    return operation
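
The following usage sketch shows how the function above might be called. The SDK initialization is an assumption modeled on the examples in the DoubleCloud API Python SDK repository; the project ID and endpoint name are placeholders.

import json

# Assumption: the SDK object is built from the service account key file used
# elsewhere in this article; check the SDK repository for the exact
# authentication flow supported by your SDK version.
with open("authorized_key.json") as f:
    sa_key = json.load(f)
sdk = doublecloud.SDK(service_account_key=sa_key)

operation = create_s3_src_endpoint(sdk, "YOUR_PROJECT_ID", "example")
print(f"Endpoint creation requested, operation ID: {operation.id}")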

For more in-depth examples, check out the DoubleCloud API Python SDK repository.

func createS3SourceEndpoint(ctx context.Context, dc *dc.SDK, flags *cmdFlags) (*operation.Operation, error) {
   op, err := dc.WrapOperation(dc.Transfer().Endpoint().Create(ctx, &transfer.CreateEndpointRequest{
      ProjectId: *flags.projectID,
      Name:      fmt.Sprint("s3-src-", *flags.name),
      Settings: &transfer.EndpointSettings{
         Settings: &transfer.EndpointSettings_S3Source{
            S3Source: &endpoint_airbyte.S3Source{
               Dataset:     "test",
               PathPattern: "test",
               Schema:      "test",
               Format:      &endpoint_airbyte.S3Source_Format{Format: &endpoint_airbyte.S3Source_Format_Csv{}},
               Provider:    &endpoint_airbyte.S3Source_Provider{Bucket: "test"},
            },
         },
      },
   }))
   if err != nil {
      return op, err
   }
   err = op.Wait(ctx)
   return op, err
}

For more in-depth examples, check out the DoubleCloud API Go SDK repository.

Test a source endpoint

After configuring your endpoint, click Test. You'll see an endpoint test dialog.

You can use two runtime types for connection testing - Dedicated and Serverless.

Runtime compatibility warning

Don't use endpoints with different runtime types in the same transfer - this will cause it to fail.

Dedicated

The Transfer service uses this runtime to connect to your data source via an internal or external network.

This runtime is useful when you need to use a specific network - it may be an external network or an internal one with a peer connection.

To run the test with this runtime:

  1. Under Runtime, select Dedicated.

  2. From the drop-down list, select the network to use to connect to your data source.

  3. Click Test connection.

    After completing the test procedure, you'll see a list of the endpoint's data sources. For Apache Kafka® endpoints, you'll also see data samples for each data source.

Serverless

The Transfer service uses this runtime to connect to your data sources available from the internet via an automatically defined network.

Use this runtime to test an endpoint whose data source is located outside isolated networks.

To run the test with this runtime:

  1. Under Runtime, select Serverless.

  2. Click Test.

    After completing the test procedure, you'll see a list of the endpoint's data sources. For Apache Kafka® endpoints, you'll also see data samples for each data source.

Warning

Please be patient - testing may take several minutes.

Update a source endpoint

  1. Go to the Transfer page in the console and open the Endpoints tab.

  2. Select the endpoint to update.

  3. Click Edit at the top-right of the page.

  4. Edit the Basic settings and the Endpoint settings. You can't change the Source type at this point.

  5. Click Submit.

Warning

You can't change the source type. If you want to connect to a different database type, create a new source endpoint.

resource "doublecloud_transfer_endpoint" "example-source-endpoint" {
  name = "example-source-endpoint"
  project_id = DOUBLECLOUD_PROJECT_ID     # Replace with your project ID
  settings {
    postgres_source {
      ...                                 # Update the desired parameters
    }
  }
}

After you finish editing the configuration, update the endpoint resource using the terraform apply command.

To update a source endpoint, use the EndpointService update method and pass the following parameters (see the sketch after this list):

  • endpoint_id - the ID of the endpoint you want to update. To find the endpoint ID, get a list of endpoints in the project.

  • name - the new endpoint name. Must be unique within the project.

  • settings - the relevant endpoint settings according to your endpoint type as reflected in the doublecloud.transfer.v1.EndpointSettings model.
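
For illustration, here is a minimal Python sketch of such an update call, following the pattern of the create example above. The UpdateEndpointRequest message name is an assumption based on the naming of the create and delete requests shown in this article; verify it against the API reference.

from doublecloud.transfer.v1.endpoint_service_pb2 import UpdateEndpointRequest
from doublecloud.transfer.v1.endpoint_service_pb2_grpc import EndpointServiceStub


def rename_endpoint(sdk, endpoint_id: str, new_name: str):
    # `sdk` is an initialized DoubleCloud SDK object, as in the create example.
    svc = sdk.client(EndpointServiceStub)
    return svc.Update(
        UpdateEndpointRequest(
            endpoint_id=endpoint_id,
            name=new_name,
            # Pass `settings` here as well to change the endpoint configuration.
        )
    )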

Delete a source endpoint

Warning

Before you delete an endpoint, delete all the transfers that use it.

To delete an endpoint:

  1. Go to the Transfer page in the console and open the Endpoints tab.

  2. Select the endpoint to delete.

  3. At the top-right of the page, click Delete.

  4. In the opened window, confirm your action and click Delete.

To delete an endpoint:

  1. Comment out or delete the endpoint resource from the Terraform configuration files.

  2. Apply the configuration:

    terraform apply
    

Use the EndpointService delete method and pass the endpoint ID in the endpoint_id request parameter.

To find the endpoint ID, get a list of endpoints in the project.

View this example on GitHub

from doublecloud.transfer.v1.endpoint_service_pb2 import DeleteEndpointRequest


def delete_endpoint(svc, endpoint_id: str):
    return svc.Delete(DeleteEndpointRequest(endpoint_id=endpoint_id))
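
A brief usage sketch, assuming an initialized SDK object (sdk, as in the create example) and a placeholder endpoint ID:

# Build the EndpointService client from the SDK (EndpointServiceStub is
# imported in the create example above) and delete the endpoint by its ID.
svc = sdk.client(EndpointServiceStub)
operation = delete_endpoint(svc, "YOUR_ENDPOINT_ID")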

For more in-depth examples, check out the DoubleCloud API Python SDK repository.

func deleteEndpoint(ctx context.Context, dc *dc.SDK, endpointID string) (*operation.Operation, error) {
   // Delete the endpoint via the EndpointService delete method described above.
   // The request type and field follow the same pattern as the Python example.
   op, err := dc.WrapOperation(dc.Transfer().Endpoint().Delete(ctx, &transfer.DeleteEndpointRequest{
      EndpointId: endpointID,
   }))
   if err != nil {
      return op, err
   }
   err = op.Wait(ctx)
   return op, err
}

For more in-depth examples, check out the DoubleCloud API Go SDK repository.