elasticsearch-dump

Import and export tools for elasticsearch & opensearch

Apache-2.0 License

Downloads: 77.5K · Stars: 7.4K · Committers: 124


elasticsearch-dump - Custom Certs for MultiElasticdump

Published by ferronrsmith over 5 years ago

Special thanks to @admlko

elasticsearch-dump - Elasticsearch 7

Published by ferronrsmith over 5 years ago

  • Added support for Elasticsearch 7.

NB: Please remember that types have been deprecated in Elasticsearch 7.
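
Since Elasticsearch 7 indices use a single _doc type, a typical dump no longer references a type in the URL. A minimal sketch (host and index names are placeholders; --input, --output and --type are standard elasticdump flags):

# Export an index from an Elasticsearch 7 cluster to a file
elasticdump \
  --input=http://localhost:9200/my_index \
  --output=my_index_data.json \
  --type=data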

elasticsearch-dump - MultiElasticdump Improvements

Published by ferronrsmith over 5 years ago

  • Added dump/load index settings for multielasticdump
  • Added dump/load index template for multielasticdump

Thanks @ilyaTT
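
A hedged sketch of dumping settings and templates alongside data with multielasticdump (the --includeType flag is an assumption based on the project README; host and paths are placeholders):

# Dump data, mappings, settings and templates for all matching indices
multielasticdump \
  --direction=dump \
  --match='^.*$' \
  --input=http://production.es.com:9200 \
  --output=/tmp/es_backup \
  --includeType='data,mapping,settings,template'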

elasticsearch-dump - s3 Transport Goodies

Published by ferronrsmith over 5 years ago

Breaking Change!

This release contains a breaking change for the s3 transport.
The s3Bucket and s3RecordKey params are no longer supported; please use s3urls instead.

# Import data from S3 into ES (using s3urls)
elasticdump \
  --s3AccessKeyId "${access_key_id}" \
  --s3SecretAccessKey "${access_key_secret}" \
  --input "s3://${bucket_name}/${file_name}.json" \
  --output=http://production.es.com:9200/my_index

# Export ES data to S3 (using s3urls)
elasticdump \
  --s3AccessKeyId "${access_key_id}" \
  --s3SecretAccessKey "${access_key_secret}" \
  --input=http://production.es.com:9200/my_index \
  --output "s3://${bucket_name}/${file_name}.json"

Thanks @suppenkelch for your contribution

elasticsearch-dump - Features

Published by ferronrsmith over 5 years ago

  • ✨ added transform support for multielasticdump dumps
  • ✨ added prefix/suffix/big-int support for multielasticdump dumps (see the sketch below)
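
A hedged sketch combining these options (index pattern, prefix and transform body are illustrative; the exact flag syntax may differ between versions):

# Dump all indices matching a pattern, prefix the output names and tag each document
multielasticdump \
  --direction=dump \
  --match='^logs-.*$' \
  --input=http://localhost:9200 \
  --output=/tmp/es_backup \
  --prefix='backup-' \
  --transform="doc._source.exported=true"
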
elasticsearch-dump - Big Integer Fixes

Published by ferronrsmith over 5 years ago

  • better decimal/floating-point number detection for big-int
  • scroll uses of jsonParser now pass the parent object
  • strip the leading + from positive numbers
  • more tests for big integers

elasticsearch-dump - s3 Compression

Published by ferronrsmith over 5 years ago

  • added a new s3Compress flag that gzips the stream being sent to s3
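
A sketch of a compressed export to S3 using the new flag (credentials, bucket and file names are placeholders; the .gz extension is only a convention):

# Export ES data to S3 as a gzipped stream
elasticdump \
  --s3AccessKeyId "${access_key_id}" \
  --s3SecretAccessKey "${access_key_secret}" \
  --input=http://production.es.com:9200/my_index \
  --output "s3://${bucket_name}/${file_name}.json.gz" \
  --s3Compress
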
elasticsearch-dump - Socks Docs

Published by ferronrsmith over 5 years ago

  • added socks docs
elasticsearch-dump - Adding some SOCKS !

Published by ferronrsmith over 5 years ago

  • Add socks5 support. Thanks @fabre-thibaud
elasticsearch-dump - Feature

Published by ferronrsmith over 5 years ago

  • added support for the support-big-int flag in multielasticdump
elasticsearch-dump - Features

Published by ferronrsmith over 5 years ago

  • Added support for transform to multielasticdump
  • Converted elasticdump + multielasticdump to ES6

elasticsearch-dump - Features

Published by ferronrsmith almost 6 years ago

  • Allow TLS X509 client authentication @morningspace
  • Add region support to fix China Region S3 upload issue
  • Fixed an issue with detecting custom transports.
  • README edits/typos
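
A hedged sketch of TLS client authentication (the --tlsAuth, --input-cert, --input-key and --input-ca option names are assumptions based on the project README; check your installed version for the exact names):

# Dump over HTTPS using an X509 client certificate
elasticdump \
  --input=https://secured.es.com:9200/my_index \
  --output=my_index.json \
  --tlsAuth \
  --input-cert=client.crt \
  --input-key=client.key \
  --input-ca=ca.crt
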
elasticsearch-dump - Features & Fixes

Published by ferronrsmith almost 6 years ago

  • allow parseExtraFields + default metafields to be processed at the same time
  • remove empty values from parseExtraFields
  • added support for exporting aliases via multi-elastic dump
  • proper handling of errors when the err object is null but the status is not 200

elasticsearch-dump - s3 Transport Fixes

Published by ferronrsmith almost 6 years ago

Fixed #487 - s3 doesn't write newlines between events

elasticsearch-dump - v5.0.1: `--limit` fix

Published by evantahler almost 6 years ago

https://github.com/taskrabbit/elasticsearch-dump/pull/28

Now that we are using the scan/scroll API to load data from Elasticsearch, we need to modify how the flag --limit is treated in reads.

In most Elasticsearch APIs, limit is literal, in that if you say {size: 100}, you will get 100 results. However, the scan/scroll API is special, in that it tries to minimize load on each shard and does not pre-collect results before transmitting. The size in this API is actually results per shard. So if you have 5 shards and say {size: 100}, you will actually get ~500 results back (assuming each shard has unsent data to return).

This PR attempts to look up how many shards an index has and modifies the effective {size} to be limit / shards.
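
For example, with --limit=100 against an index that has 5 shards, the effective scroll size becomes 100 / 5 = 20 per shard, so each scroll round trip returns roughly 100 documents in total rather than ~500. A sketch (host, index and the shard count are illustrative):

# Read with an effective per-shard size of limit / shards (here 100 / 5 = 20)
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=my_index.json \
  --limit=100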

elasticsearch-dump - s3 Transport

Published by ferronrsmith almost 6 years ago

Added s3 transport support.

NB: Only the output (set) side has been implemented.

Thanks to @hilt86 for testing and providing an s3 bucket implementation!

elasticsearch-dump -

Published by ferronrsmith about 6 years ago

elasticsearch-dump - Quick Fix

Published by ferronrsmith about 6 years ago

  • 4.0.1 was a quick fix released to address a critical npmignore issue
  • 4.0.2 converts files to ES6. Not all files have been converted, but the work has been started!

The package size has been reduced more than threefold.

elasticsearch-dump - File Splitting

Published by ferronrsmith about 6 years ago

The work done on the stream splitter should drastically improve the efficiency of dumping to files while helping to mitigate the pesky out-of-memory exceptions.

That alone is worth a MAJOR bump.

A new --fileSize flag was added that allows users to specify the size of each output chunk.
Under the covers, elasticsearch-dump uses the bytes package to convert the abbreviated string representation into a byte count that can be used by the new splitter class.

Usage Examples

--fileSize=10mb // split the file every 10 megabytes
--fileSize=1gb  // split the file every 1 gigabyte
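
A minimal sketch of a chunked dump (host, index and output path are placeholders):

# Dump an index to files of at most 10mb each
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=/data/my_index.json \
  --fileSize=10mb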

Remember: the higher the fileSize, the higher the risk of hitting an out-of-memory issue. Perform your own testing to see what your system is able to handle.

Thanks for using elasticsearch-dump

elasticsearch-dump - Features

Published by ferronrsmith about 6 years ago

  • added request retry support. elasticdump now supports retrying requests: use the retryAttempts flag to control the number of retry attempts and the retryDelay flag to set the back-off time (see the example below).
  • added support for passing extra meta fields using the parseExtraFields flag
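
A hedged sketch of the retry flags (the retryDelay value is assumed to be in milliseconds; host and index are placeholders):

# Retry failed requests up to 5 times with a 5 second back-off
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=my_index.json \
  --retryAttempts=5 \
  --retryDelay=5000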