I have an Elasticsearch environment configured on GCE (Google Compute Engine) with two nodes, i.e. two virtual machines, and I need a backup strategy for it. My first thought was to let Elasticsearch back up all my data to a given storage location, since the snapshot API supports several repository types:
- Shared file system (such as NAS)
- Amazon S3
- HDFS (Hadoop Distributed File System)
- Azure Cloud
I tried using the shared file system option, but this requires the storage location to be shared between nodes. Is there a way I can do this on GCE?
curl -XPUT http://xxxx:9200/_snapshot/backup -d '{
  "type": "fs",
  "settings": {
    "compress": true,
    "location": "/elasticsearch/backup"
  }
}'
nested: RepositoryVerificationException[[backup] store location [/elasticsearch/backup] is not shared between node
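For context on that error: a shared-filesystem (`fs`) repository requires the same path to resolve to the same shared storage on every node, and the path must be whitelisted via `path.repo` in `elasticsearch.yml`. A minimal sketch of what that setup might look like with an NFS mount (the hostname `nfs-server` and export path below are hypothetical, not from my environment):

```shell
# On EVERY Elasticsearch node (hostname and export path are hypothetical):
sudo mkdir -p /elasticsearch/backup
sudo mount -t nfs nfs-server:/exports/es-backup /elasticsearch/backup

# Then whitelist the path in elasticsearch.yml on every node and restart:
#   path.repo: ["/elasticsearch/backup"]
```

Without a shared mount like this, each node writes to its own local disk and repository verification fails with the error above.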
I know there is an AWS plugin for Elasticsearch that stores backups in S3. Is there an equivalent plugin for Google Cloud Storage? Is this possible at all?
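If such a Google Cloud Storage plugin exists for my Elasticsearch version, I would expect registering a repository to look roughly like the `fs` example above, just with a different repository type. This is only a sketch; the plugin name, repository type, bucket name, and settings are assumptions on my part:

```shell
# Hypothetical sketch; plugin name, type, and bucket are assumptions.
sudo bin/elasticsearch-plugin install repository-gcs

curl -XPUT http://xxxx:9200/_snapshot/gcs_backup -d '{
  "type": "gcs",
  "settings": {
    "bucket": "my-es-backups",
    "compress": true
  }
}'
```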
If any of the above options is not possible, is there any other recommended strategy for backing up my data?
elasticsearch google-compute-engine google-cloud-platform
Edmar Miyake