Migrating logs from Self-Hosted Elasticsearch to Elastic Cloud

Note: This guide is a WIP and I may extend it as I gain more experience using Elastic Cloud.

Due to an increased reliance on logs, I was tasked with migrating an on-premises ELK cluster (https://www.elastic.co/) from a self-hosted open edition of the stack to a managed solution provided by Elastic.

I will not cover the configuration of the ELK cluster on Elastic Cloud in this guide, as the setup wizard is pretty self-explanatory. I will skip straight to creating a user account in Kibana which will be used for the data import.

Configuring Elastic Cloud for import:

Firstly, make a note of your Elasticsearch endpoint URL. To find this, open up your Deployment from https://cloud.elastic.co/deployments and click “Copy Endpoint URL” next to Elasticsearch.
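If you want to confirm the endpoint is reachable before going any further, a quick curl against it should return the cluster details. This is just a sketch with a hypothetical endpoint and the elastic superuser password shown when the deployment was created; substitute your own values:

# Hypothetical endpoint and password - replace with the URL you copied above
curl -u elastic:SuperSecret https://my-deployment.eu-west-1.aws.found.io:9243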

Next, open up Kibana by clicking the Launch button from your Deployment. We’ll now create a user with access to write to the Elasticsearch cluster. To do this, we’ll first create a Role with limited permissions for the upload. Click the cog in the bottom left-hand corner and select “Roles” under Security.

Set an applicable name for the new Role (e.g. elasticdump_uploader). Next, under “Index Privileges”, enter * in the Indices field and all in the Privileges field (adjust both as appropriate for your setup).

We will now create a user account with this Role attached. Select “Users” under Security on the left-hand side and create a new user account with the new Role assigned (make a note of the credentials).
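If you would rather script this step, the same role and user can be created through the Elasticsearch security API. The sketch below assumes a hypothetical endpoint, the built-in elastic superuser, and example passwords; substitute your own values:

# Create the upload role (hypothetical endpoint and passwords)
curl -u elastic:SuperSecret -X POST "https://my-deployment.eu-west-1.aws.found.io:9243/_security/role/elasticdump_uploader" \
-H 'Content-Type: application/json' \
-d '{"indices":[{"names":["*"],"privileges":["all"]}]}'

# Create the upload user with that role attached
curl -u elastic:SuperSecret -X POST "https://my-deployment.eu-west-1.aws.found.io:9243/_security/user/elasticdump" \
-H 'Content-Type: application/json' \
-d '{"password":"MyPassword","roles":["elasticdump_uploader"]}'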

You are now ready to transfer some logs from your Elasticsearch Server!

Uploading Logs:

To do the upload to Elastic Cloud, we will be using the elasticdump npm package. If you do not have this installed, see the quick install sketch below (based on an Ubuntu installation).
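A minimal install, assuming Ubuntu’s packaged Node.js and npm are recent enough for the current elasticdump release:

# Install Node.js/npm, then elasticdump globally
sudo apt-get update
sudo apt-get install -y nodejs npm
sudo npm install -g elasticdump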

Now, there is a for-loop script below which will work through each of the indexes on your Elasticsearch server and upload them one at a time to Elastic Cloud, creating the indexes as it goes. Before proceeding with this, I would suggest selecting a small index to use as a test before kicking off the big upload.
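To pick a test candidate, you can list the indexes on your source server sorted by size on disk (smallest first):

# List indexes sorted by store size to find a small one to test with
curl 'localhost:9200/_cat/indices?v&s=store.size'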

Firstly, take your Cloud Elasticsearch endpoint URL and the username / password for the user account created in the previous steps, and combine them as follows:

[username]:[password]@[endpoint url]

e.g. (with a placeholder password and endpoint):

elasticdump:MyPassword@my-deployment.eu-west-1.aws.found.io:9243

You can now use this as the output in elasticdump to upload your test index:

elasticdump \
--input=http://localhost:9200/test-index \
--output=https://elasticdump:MyPassword@my-deployment.eu-west-1.aws.found.io:9243/test-index \
--limit=5000

Note: You should be careful with the --limit flag as it is possible to overload your Elasticsearch instance (which will end badly if it is currently in use by your team!)
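As a quick sanity check on the test upload, you can compare document counts between the source and destination (again using the hypothetical credentials and endpoint from the examples above):

# Source count
curl 'localhost:9200/_cat/count/test-index?v'
# Destination count
curl -u elasticdump:MyPassword 'https://my-deployment.eu-west-1.aws.found.io:9243/_cat/count/test-index?v'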

Provided the above completed successfully and you are happy with the results, you can now proceed to upload the rest of your indexes. Depending on their size, I would suggest running the below in a screen session.
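If you have not used screen before, a named session makes it easy to detach from a long-running upload and come back to it later:

screen -S es-migration     # start a named session and run the loop inside it
# detach with Ctrl-A then D; reattach later with:
screen -r es-migration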

for index in $(curl -s 'localhost:9200/_cat/indices/*?s=index' | grep -v kibana | awk '{print $3}'); do
  echo "elasticdump --input=http://localhost:9200/$index --output=https://elasticdump:MyPassword@my-deployment.eu-west-1.aws.found.io:9243/$index"
done

To avoid any accidental uploads, the above will initially “echo” the upload command for each index in your Elasticsearch database. If you are happy with the output, remove the “echo” from the command and the upload will begin.
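Once the uploads have finished, a loop along the same lines can compare document counts per index between the two clusters (again assuming the hypothetical credentials and endpoint used above):

# Compare source and destination document counts for each non-kibana index
for index in $(curl -s 'localhost:9200/_cat/indices/*?s=index' | grep -v kibana | awk '{print $3}'); do
  src=$(curl -s "localhost:9200/_cat/count/$index" | awk '{print $3}')
  dst=$(curl -s -u elasticdump:MyPassword "https://my-deployment.eu-west-1.aws.found.io:9243/_cat/count/$index" | awk '{print $3}')
  echo "$index: source=$src destination=$dst"
done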

Any comments or questions? Get in touch here or Email me at [email protected]