How to export logs from Kibana 4?

(Dnyaneshwar Sonawane) #1

I have tried the export button in Kibana 4, but it only exports the table with timestamp and count columns; there are no logs.

How can I export logs from Kibana 4?


(Tyler Langlois) #2

When you look at the export for a table in Kibana, you're seeing what Elasticsearch returned to Kibana for a specific query. Kibana can ask for a histogram, for example, so the export is just a dump of the summarized data Kibana got back from that query.
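To make that concrete: a Discover-style histogram comes from an aggregation request roughly like the sketch below (the URL, index pattern, and interval are illustrative, not the exact request Kibana sends), which is why its export can only ever contain buckets — timestamps and counts — rather than the log documents themselves:

```shell
# Illustrative only: Kibana's real request differs, but a date_histogram
# aggregation like this returns bucketed counts, not the log documents.
cat > histogram-query.json <<'EOF'
{
  "size": 0,
  "aggs": {
    "over_time": {
      "date_histogram": { "field": "@timestamp", "interval": "30s" }
    }
  }
}
EOF

# You would POST it to your cluster (URL and index pattern are assumptions):
# curl -s 'http://localhost:9200/logstash-*/_search' --data @histogram-query.json

# Sanity-check that the body parses and uses a date_histogram:
jq -r '.aggs.over_time | keys[0]' histogram-query.json
```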

If you're looking to actually export logs from Elasticsearch, you probably want to save them somewhere, and the browser isn't the best place to view hundreds or thousands of logs. There are a couple of options here:

  • In the "Discover" tab, you can click on the arrow tab near the bottom to see the raw request and response. You could click "Request" and use that query with curl (or something similar) to fetch the logs you want from Elasticsearch.
  • You could use Logstash or stream2es to dump out the contents of an index (with query parameters, if needed, to get the specific documents you want).
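As a sketch of the first option: copy the body shown under "Request", save it to a file, and replay it with curl. The query body, URL, and index pattern below are placeholders for whatever Discover actually shows you:

```shell
# Placeholder request body -- substitute the JSON copied from the "Request" tab.
cat > query.json <<'EOF'
{
  "size": 500,
  "query": {
    "query_string": { "query": "*" }
  }
}
EOF

# Replay it against Elasticsearch (adjust the URL and index for your cluster):
# curl -s -XPOST 'http://localhost:9200/logstash-*/_search' --data @query.json

# Verify the body parses as JSON before sending it:
jq -r '.size' query.json
```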

Download Logs from ELK
(Diego Gullo) #3

Does this button still exist in the latest Kibana? I cannot see it. Is there any way to enable it rather than using Logstash plugins?

(Tyler Langlois) #4

It's still there. To view the underlying REST request and response that Kibana uses to fetch the raw data, use this popup arrow:

Which will expose the following panel:

Clicking on "Request" will show you the REST API request used to retrieve the search results, and clicking on "Response" will show the raw response from Elasticsearch itself.

(Eykilles) #5

How can I export the raw response from Elasticsearch?

(Tyler Langlois) #6

The simplest way would be to find the query you need from Kibana (or just query for all documents in the index you're interested in) and run it in a small script to export documents.
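For instance, if you simply want every document in an index, the query half of such a script is just a match_all body (the index name and URL below are placeholders):

```shell
# A match_all body fetches every document; pair it with the scroll API
# so large indices can be paged through safely.
cat > export-query.json <<'EOF'
{
  "query": { "match_all": {} }
}
EOF

# Hypothetical invocation -- substitute your cluster URL and index name:
# curl -s 'http://localhost:9200/my-index/_search?scroll=1m&size=500' --data @export-query.json

jq -r '.query | keys[0]' export-query.json
```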


(LoriD) #7

Hi Tyler,

Could you add some detail about how to run a small script to export documents? Currently, the only way I can download the search results that appear in the sample data section in Discover is to manually highlight them all, copy them, and then paste everything into Excel.

Thank you!

(Tyler Langlois) #8

@LoriD Everybody's needs are different. To export logs out of Elasticsearch generally, you need to retrieve all documents from an index, which you can then load into whatever system you ultimately need them in. Tools like Logstash can transform data into the forms you need in a powerful way, but to simply dump all documents from an index, you open a scroll and retrieve the documents page by page. At the end of the day, you end up with many JSON documents.

The best detail I can give is an example script that does what I'm describing. The following bash script dumps all documents in an index for the last 15 minutes, creating a dump.json file that contains all the documents you'd normally find in a default Kibana dashboard (I use the httpie and jq utilities in this script):

#!/usr/bin/env bash

usage="Usage: ${0} <elasticsearch url> <index>"
es=${1:?$usage}
index=${2:?$usage}

# Open a scroll over the last 15 minutes of documents, 500 per page.
initial=$(http -b "${es}/${index}/_search" scroll==1m size==500 q=="@timestamp:>=now-15m")
scroll_id=$(echo "$initial" | jq -r '._scroll_id')
echo "Scroll id: ${scroll_id}"
hits=$(echo "$initial" | jq -r '.hits.hits | length')
echo "$initial" | jq '.hits.hits[]' > dump.json

# Keep requesting pages until the scroll comes back empty.
until [[ $hits -eq 0 ]]; do
    results=$(http -b "${es}/_search/scroll" scroll=1m scroll_id="${scroll_id}")
    echo "$results" | jq '.hits.hits[]' >> dump.json
    hits=$(echo "$results" | jq -r '.hits.hits | length')
done

# Delete the scroll once the export is complete.
http -b DELETE "${es}/_search/scroll" scroll_id="${scroll_id}"

That's a very simple example of how to use the scroll API to retrieve all documents for a given index.
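Assuming the script above is saved as dump-logs.sh (the filename is mine), you'd run it as ./dump-logs.sh http://localhost:9200 logstash-2016.01.01 and then post-process dump.json with jq. The snippet below fabricates a tiny dump.json just to show that post-processing:

```shell
# Fabricated two-document dump.json, standing in for the script's real output:
printf '%s\n' \
  '{"_source":{"message":"first"}}' \
  '{"_source":{"message":"second"}}' > dump.json

# Count the exported documents:
jq -s 'length' dump.json

# Pull one field out of each document:
jq -r '._source.message' dump.json
```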