How can I extract data from Elasticsearch?

I need to extract/export all Winlogbeat data from one ELK instance and reload/replay it into another for testing. What is the best and easiest way to get this done?

You could use the Reindex from Remote option explained in this document.
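For reference, reindex-from-remote is a single POST to `_reindex` on the *destination* cluster. A minimal sketch of the request body is below; the source host `http://oldhost:9200` and the index names are placeholders, and the source host must also be whitelisted via `reindex.remote.whitelist` in the destination's `elasticsearch.yml`:

```python
import json

# Body for POST /_reindex on the destination cluster.
# "oldhost" is a placeholder for the source cluster's address.
reindex_body = {
    "source": {
        "remote": {"host": "http://oldhost:9200"},
        "index": "winlogbeat-*",
        "query": {"match_all": {}},
    },
    "dest": {"index": "winlogbeat-copy"},
}

print(json.dumps(reindex_body, indent=2))
```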

This is nice, I guess, though what I am trying to do is pull data from ELK for a set of computer names, and from Winlogbeat only, then output that into Excel or just a text file so I can feed it into an event server for testing. I have been trying to use the reindex, and even the query language, but I can not get it to see the systems, and I know they are there. I know SQL, but this is not SQL. Any ideas or hints? I can script it up in bash or Python if needed.

The initial question sounded like you were trying to index from one cluster
to another; this seems a little different. Yes, the queries can be a bit
confusing if you are coming from SQL. When I first started with Elastic, I
had been doing only SQL for a long time.

Kibana is a good reference for building a query and then seeing how
that query was built. Below is a very simple query to pull documents for
one host from a logstash index.

curl -XGET "http://localhost:9200/logstash-2018*/_search" -H 'Content-Type: application/json' -d'
{
    "query": {
        "match" : {
            "host" : "yourhostname"
        }
    }
}'
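To restrict the pull to a set of computer names (as in the question above), a `terms` query works in place of the `match`. A sketch of the request body follows; the field name `computer_name` and the machine names are assumptions, since the actual field depends on your Winlogbeat version and mapping (it may be `host.name` or require a `.keyword` suffix):

```python
import json

# Hypothetical list of machines to export; check your index mapping
# for the real field name (e.g. "computer_name" or "host.name").
hosts = ["PC-001", "PC-002", "PC-003"]

query_body = {
    "size": 1000,
    "query": {
        "terms": {"computer_name": hosts},
    },
}

# This body would be sent to http://localhost:9200/winlogbeat-*/_search
print(json.dumps(query_body, indent=2))
```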
To get every document you will have to use scroll mode in your client;
otherwise it will return a maximum of 10000 documents, even if you set
"size": 10000 in the request. Once you pull the data with your client, it is
just a JSON object that you can manipulate however you want. Most clients
support a similar query syntax.
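The scroll flow itself is just: an initial search with `?scroll=1m`, then repeated calls to `_search/scroll` with the returned `scroll_id` until a page comes back empty. A sketch of that loop, with the HTTP transport stubbed out so the control flow is testable without a live cluster (with the official Python client you would normally use its `helpers.scan` helper instead):

```python
def scroll_all(search, scroll_next):
    """Drain every hit from a scrolled search.

    `search()` performs the initial request (with ?scroll=1m) and
    `scroll_next(scroll_id)` fetches the next page; both return the
    raw Elasticsearch response dict.
    """
    resp = search()
    docs = []
    while resp["hits"]["hits"]:
        docs.extend(hit["_source"] for hit in resp["hits"]["hits"])
        resp = scroll_next(resp["_scroll_id"])
    return docs

# Stubbed responses standing in for the HTTP calls, to show the flow.
pages = [
    {"_scroll_id": "s1", "hits": {"hits": [{"_source": {"n": 1}}, {"_source": {"n": 2}}]}},
    {"_scroll_id": "s2", "hits": {"hits": [{"_source": {"n": 3}}]}},
    {"_scroll_id": "s3", "hits": {"hits": []}},
]
it = iter(pages)
docs = scroll_all(lambda: next(it), lambda sid: next(it))
print(docs)  # → [{'n': 1}, {'n': 2}, {'n': 3}]
```

Once `docs` is in hand, writing it out as a text file or CSV for the event server is ordinary Python.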

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.