Data Migration from Digital Ocean Space to AWS S3

Samuel Ajisafe - Jul 13 - Dev Community

This guide illustrates the migration process for moving objects from DigitalOcean (DO) Spaces to AWS S3.

DigitalOcean Spaces provides Amazon S3-compatible object storage with a simplified pricing model. However, you may at some point find that you need to move your storage off of Spaces and onto Amazon S3. Many tools can achieve this, e.g. AWS DataSync, rclone, and other third-party tools. However, AWS DataSync requires a more complex setup and incurs charges both for the data transfer and for the use of DataSync itself. In this guide we will use rclone to automate the migration of data from Spaces to S3 quickly and easily.

Follow the steps below to set up and migrate data between a DO Space and AWS S3:
The first step is to install rclone on the computer (Linux/Mac) from which the sync will run, using the command below:

curl https://rclone.org/install.sh | sudo bash

Or, you can install on Mac using Homebrew:

brew install rclone

On Windows systems, download and install the appropriate executable from the Rclone downloads page at https://rclone.org/downloads/. Make sure to add rclone to your system's PATH afterward so that the subsequent commands in this tutorial work.
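Whichever install method you used, a quick check confirms the binary is reachable on your PATH before continuing (a small sanity-check sketch, not part of the original guide):

```shell
# Print the installed rclone version if the binary is on PATH,
# otherwise report that it is missing.
if command -v rclone >/dev/null 2>&1; then
  rclone version | head -n 1
else
  echo "rclone not found on PATH"
fi
```

If you see "rclone not found on PATH", revisit the install step before moving on.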

Obtaining Your Spaces Connection Information
To use Rclone to perform the copy, you'll need to create an rclone.conf file that enables Rclone to connect to both your AWS S3 bucket and your DigitalOcean Space.

You will need two pieces of information from Spaces, plus your AWS credentials:

The URL of the endpoint for your Space

An access key and secret key from DigitalOcean that grant access to your Space

An AWS access key and secret key

Obtaining your Spaces endpoint is easy: just navigate to your Space in the DigitalOcean control panel, where you'll see the URL for your Space. The endpoint you'll use is the regional endpoint without the name of your Space, for example nyc3.digitaloceanspaces.com for a Space in the NYC3 region.


Also generate an access key and secret key in the AWS IAM console. Once you have all the required information, create the rclone configuration file:

mkdir -p ~/.config/rclone
vim ~/.config/rclone/rclone.conf

In the configuration file we define two remotes: one for the DigitalOcean Space and one for the AWS S3 bucket.

[bucket-DO]
type = s3
env_auth = false
access_key_id = access_key_id
secret_access_key = secret_access_key
endpoint = endpoint.digitaloceanspaces.com
acl = private

[bucket-aws]
type = s3
env_auth = false
access_key_id = access_key_id
secret_access_key = secret_access_key
region = us-east-1
acl = private

Next, restrict the permissions on the configuration file so that only your user can read it:

chmod 600 ~/.config/rclone/rclone.conf
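The configuration and permission steps above can also be combined into a single script. A sketch using placeholder credentials and an example NYC3 endpoint (replace all of these with your own values):

```shell
# Write rclone.conf with both remotes in one step, then lock down permissions.
# All credential values below are placeholders, not real keys.
CONF_DIR="$HOME/.config/rclone"
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/rclone.conf" <<'EOF'
[bucket-DO]
type = s3
env_auth = false
access_key_id = DO_ACCESS_KEY_ID
secret_access_key = DO_SECRET_ACCESS_KEY
endpoint = nyc3.digitaloceanspaces.com
acl = private

[bucket-aws]
type = s3
env_auth = false
access_key_id = AWS_ACCESS_KEY_ID
secret_access_key = AWS_SECRET_ACCESS_KEY
region = us-east-1
acl = private
EOF
chmod 600 "$CONF_DIR/rclone.conf"
```
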

Validate that rclone has picked up the two configured remotes; the command below should list both remote names (bucket-DO: and bucket-aws:):

rclone listremotes
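Note that listremotes only confirms that the configuration file parses; it does not contact either provider. To verify the credentials actually work, you can list the top-level buckets on each remote (this requires network access and valid keys, so it is shown here as a sketch):

```shell
# List the top-level buckets on each remote to confirm the credentials work.
rclone lsd bucket-DO:
rclone lsd bucket-aws:
```

If either command returns an authentication error, re-check the keys and endpoint in rclone.conf before starting the sync.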

Now run the synchronisation between the two remotes, in our case from DigitalOcean to AWS:

rclone sync bucket-DO: bucket-aws:
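The command above syncs everything under the remote's root. If you only want to migrate a single Space, name the source and destination buckets explicitly, and consider previewing the transfer first. A sketch using the hypothetical names my-space and my-bucket:

```shell
# Preview what would be transferred without copying anything.
rclone sync bucket-DO:my-space bucket-aws:my-bucket --dry-run
# Run the real sync with live progress reporting.
rclone sync bucket-DO:my-space bucket-aws:my-bucket --progress
```

The --dry-run pass is cheap insurance: rclone sync also deletes destination objects that are missing from the source, so it is worth confirming the plan before running it for real.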

Congratulations, the file synchronisation has now started. You can go to your AWS S3 bucket to confirm that the objects are being copied.
