Create and manage source endpoints in Transfer

Create a source endpoint

Note

Transfer uses custom-built connectors for the following sources: ClickHouse®, PostgreSQL, MySQL, MongoDB, and Amazon S3.

Other connectors are based on Airbyte.

  1. Go to the Transfer page in the console.

  2. Click Create → Source endpoint.

  3. Under Source type, select the type of your data source.

  4. Under Basic settings, enter an endpoint name and (optionally) a description.

  5. Under Endpoint settings, specify the endpoint parameters that correspond to the endpoint type.

  6. Click Submit.

You can create a source endpoint using the Transfer endpoint resource of the DoubleCloud Terraform provider.

Example provider and resource configuration:

# main.tf

terraform {
  required_providers {
    doublecloud = {
      source    = "registry.terraform.io/doublecloud/doublecloud"
    }
  }
}

provider "doublecloud" {
  authorized_key = file("authorized_key.json")
}

resource "doublecloud_transfer_endpoint" "example-source-endpoint" {
  name       = "example-source-endpoint"
  project_id = DOUBLECLOUD_PROJECT_ID     # Replace with your project ID
  settings {
    postgres_source {
      connection {
        on_premise {
          hosts = [
            "mypostgresql.host"
          ]
          port = 5432
        }
      }
      database = "postgres"
      user     = "postgres"
      password = POSTGRESQL_PASSWORD     # Replace with the PostgreSQL password
    }
  }
}

To learn how to get the authorized_key.json file, refer to Create an API key. You can find the DoubleCloud project ID on the project settings page.

Tip

The example above shows a PostgreSQL source endpoint configuration. For configuration parameters for other source types, refer to the Transfer endpoint resource schema.

To create a source endpoint, use the EndpointService create method and pass the following parameters:

  • project_id: ID of the project where you want to create an endpoint. You can get the project ID on the project's information page.

  • name: Endpoint name. Must be unique within the project.

  • settings: Settings corresponding to the endpoint type. The settings key name ends in _source (for example, s3_source), as reflected in the doublecloud.transfer.v1.EndpointSettings model.

This example shows how to create an Amazon S3 source endpoint.

from doublecloud.transfer.v1.endpoint_pb2 import EndpointSettings
from doublecloud.transfer.v1.endpoint.airbyte.s3_source_pb2 import S3Source
from doublecloud.transfer.v1.endpoint_service_pb2 import CreateEndpointRequest
from doublecloud.transfer.v1.endpoint_service_pb2_grpc import EndpointServiceStub


def create_s3_src_endpoint(sdk, project_id, name):
    svc = sdk.client(EndpointServiceStub)
    operation = svc.Create(
        CreateEndpointRequest(
            project_id=project_id,
            name=f"s3-src-{name}",
            settings=EndpointSettings(
                s3_source=S3Source(
                    dataset="test",
                    path_pattern="test",
                    schema="test",
                    format=S3Source.Format(csv=S3Source.Csv()),
                    provider=S3Source.Provider(bucket="test"),
                )
            ),
        )
    )
    return operation

For more in-depth examples, check out the DoubleCloud API Python SDK repository.
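For orientation, the nesting of the create request above can be sketched with plain dictionaries. These stand in for the generated protobuf classes purely to illustrate how the settings nest; they are not real SDK types:

```python
# Plain-dict sketch of the CreateEndpointRequest structure shown above.
# Ordinary dictionaries stand in for the protobuf messages; the real SDK
# uses the generated classes (CreateEndpointRequest, EndpointSettings, S3Source).
def build_s3_create_request(project_id: str, name: str) -> dict:
    return {
        "project_id": project_id,
        "name": f"s3-src-{name}",
        "settings": {
            "s3_source": {
                "dataset": "test",
                "path_pattern": "test",
                "schema": "test",
                "format": {"csv": {}},
                "provider": {"bucket": "test"},
            }
        },
    }

request = build_s3_create_request("my-project", "demo")
```

Note how the s3_source key under settings selects the endpoint type, matching the `postgres_source` block in the Terraform example earlier.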

func createS3SourceEndpoint(ctx context.Context, dc *dc.SDK, flags *cmdFlags) (*operation.Operation, error) {
   op, err := dc.WrapOperation(dc.Transfer().Endpoint().Create(ctx, &transfer.CreateEndpointRequest{
      ProjectId: *flags.projectID,
      Name:      fmt.Sprint("s3-src-", *flags.name),
      Settings: &transfer.EndpointSettings{
         Settings: &transfer.EndpointSettings_S3Source{
            S3Source: &endpoint_airbyte.S3Source{
               Dataset:     "test",
               PathPattern: "test",
               Schema:      "test",
               Format:      &endpoint_airbyte.S3Source_Format{Format: &endpoint_airbyte.S3Source_Format_Csv{}},
               Provider:    &endpoint_airbyte.S3Source_Provider{Bucket: "test"},
            },
         },
      },
   }))
   if err != nil {
      return op, err
   }
   err = op.Wait(ctx)
   return op, err
}

For more in-depth examples, check out the DoubleCloud API Go SDK repository.

Test a source endpoint

After configuring your endpoint, click Test connection. You'll see the following dialog:

Screenshot of the endpoint testing dialog

You can use two runtime types for connection testing: Dedicated and Serverless.

Runtime compatibility warning

Don't use endpoints with different runtime types in the same transfer — this will cause the transfer to fail.

Dedicated

Transfer uses this runtime to connect to your data source via an internal or external network.

This runtime is useful when you need to connect over a specific network, such as an external network or an internal network with a peering connection.

To run the test with this runtime:

  1. Under Runtime, select Dedicated.

  2. In the dropdown, select the network you want to use to connect to your data source.

  3. Click Test connection.

    After completing the test procedure, you'll see a list of the endpoint's data sources. For Apache Kafka® endpoints, you'll also see data samples for each data source.

Serverless

Transfer uses this runtime to connect to your data sources available from the internet via an automatically defined network.

Use this runtime to test an endpoint that connects to a data source located outside isolated networks.

To run the test with this runtime:

  1. Under Runtime, select Serverless.

  2. Click Test connection.

    After completing the test procedure, you'll see a list of the endpoint's data sources. For Apache Kafka® endpoints, you'll also see data samples for each data source.

Warning

Testing a connection may take up to several minutes.

Update a source endpoint

Warning

You can't change the source type in an existing endpoint. If you want to transfer data from a source of a different type, create a new source endpoint of the desired type.

  1. Go to the Transfer page in the console and switch to the Endpoints tab.

  2. Select the endpoint you want to update.

  3. Click Edit at the top right of the page.

  4. Edit settings under Basic parameters and Endpoint parameters.

  5. Click Submit.

  1. Edit settings in the Terraform resource configuration file:

    resource "doublecloud_transfer_endpoint" "example-source-endpoint" {
      name = "example-source-endpoint"
      project_id = DOUBLECLOUD_PROJECT_ID     # Replace with your project ID
      settings {
        postgres_source {
          ...                                 # Update the desired settings
        }
      }
    }
    
  2. Apply the configuration:

    terraform apply
    

To update a source endpoint, use the EndpointService update method and pass the following parameters:

  • endpoint_id: ID of the endpoint you want to update. To find the endpoint ID, get a list of endpoints in the project.

  • name: New endpoint name. Must be unique within the project.

  • settings: Settings corresponding to the endpoint type, as reflected in the doublecloud.transfer.v1.EndpointSettings model.
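Following the same pattern as the create call, the update parameters listed above can be sketched like this (plain dictionaries standing in for the protobuf messages; the field names are taken from the parameter list, while the real SDK uses an UpdateEndpointRequest message):

```python
# Dict stand-in mirroring the update parameters described above.
# The real SDK wraps these in an UpdateEndpointRequest protobuf message.
def build_update_request(endpoint_id: str, name: str, settings: dict) -> dict:
    return {
        "endpoint_id": endpoint_id,
        "name": name,
        "settings": settings,
    }

req = build_update_request(
    "example-endpoint-id",                       # hypothetical endpoint ID
    "example-source-endpoint-renamed",
    {"postgres_source": {"database": "postgres"}},
)
```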

Delete a source endpoint

Warning

Before you delete an endpoint, delete all the transfers that use it.

To delete an endpoint:

  1. Go to the Transfer page in the console and switch to the Endpoints tab.

  2. Select the endpoint you want to delete.

  3. At the top right of the page, click Delete.

  4. In the dialog that opens, confirm your action and click Delete.

To delete an endpoint:

  1. Comment out or delete the endpoint resource from the Terraform configuration files.

  2. Apply the configuration:

    terraform apply
    

Use the EndpointService delete method and pass the endpoint ID in the endpoint_id request parameter.

To find the endpoint ID, get a list of endpoints in the project.
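Once you have the list of endpoints, picking out an ID by name is straightforward. A sketch, assuming each list item exposes id and name fields (the field names are assumptions about the list response shape, not the documented SDK types):

```python
from typing import Optional


def find_endpoint_id(endpoints: list, name: str) -> Optional[str]:
    # endpoints: items from a list-endpoints call, each assumed to carry
    # "id" and "name" fields; returns the ID of the first match, if any.
    for endpoint in endpoints:
        if endpoint["name"] == name:
            return endpoint["id"]
    return None
```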


View this example on GitHub

from doublecloud.transfer.v1.endpoint_service_pb2 import DeleteEndpointRequest


def delete_endpoint(svc, endpoint_id: str):
   return svc.Delete(DeleteEndpointRequest(endpoint_id=endpoint_id))

For more in-depth examples, check out the DoubleCloud API Python SDK repository.

func deleteEndpoint(ctx context.Context, dc *dc.SDK, endpointID string) (*operation.Operation, error) {
   op, err := dc.WrapOperation(dc.Transfer().Endpoint().Delete(ctx, &transfer.DeleteEndpointRequest{
      EndpointId: endpointID,
   }))
   if err != nil {
      return op, err
   }
   err = op.Wait(ctx)
   return op, err
}

For more in-depth examples, check out the DoubleCloud API Go SDK repository.