I have tried the export button in Kibana 4, but it exports a table with only timestamp and count columns; there are no logs in it.
How do I export logs from Kibana 4?
Thanks
When you look at the export for a table in Kibana, you're seeing what Elasticsearch returned to Kibana in answer to a specific query. Kibana can ask for a histogram, for example, so the export is just a dump of the summarized data that Kibana got back from that query.
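To make that concrete, here's a hypothetical sketch of the kind of date-histogram aggregation behind such a table (the field name @timestamp, the interval, and the cluster URL in the comment are assumptions; Kibana's actual request may differ):

```shell
# Hypothetical sketch: an aggregation like the one behind a Kibana histogram.
# The response contains only bucketed counts per time interval -- no log
# documents -- which is why the exported table has just timestamp and count.
cat > query.json <<'EOF'
{
  "size": 0,
  "aggs": {
    "per_minute": {
      "date_histogram": { "field": "@timestamp", "interval": "1m" }
    }
  }
}
EOF

# Against a real cluster you would send it with something like:
#   curl -s 'http://localhost:9200/logstash-*/_search' -d @query.json
```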
If you're looking to actually export logs from Elasticsearch, you probably want to save them somewhere, so viewing them in the browser isn't the best way to handle hundreds or thousands of logs. There are a couple of options here; one is to use curl (or something similar) to query Elasticsearch for the logs you want.

Does this button still exist in the latest Kibana? I cannot see it. Is there any way to enable it rather than using Logstash plugins?
It's still there. To view the underlying REST request and response that Kibana uses to fetch the raw data, use this popup arrow:
Which will expose the following panel:
Clicking on "Request" will show you the REST API request used to retrieve the search results, and clicking on "Response" will show the raw response from Elasticsearch itself.
How can I export the raw response from Elasticsearch?
The simplest way would be to find the query you need from Kibana (or just query for all documents in the index you're interested in) and run it in a small script to export documents.
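For example, once you've saved that raw response to a file, jq can strip it down to just the log documents (response.json here holds a small fabricated sample; a real response has many more fields):

```shell
# response.json stands in for a raw Elasticsearch search response
# (a small fabricated sample; a real one has many more fields).
cat > response.json <<'EOF'
{"hits":{"hits":[
  {"_source":{"@timestamp":"2020-01-01T00:00:00Z","message":"hello"}},
  {"_source":{"@timestamp":"2020-01-01T00:00:01Z","message":"world"}}
]}}
EOF

# Keep only the document bodies (_source), one JSON object per line.
jq -c '.hits.hits[]._source' response.json > logs.json
cat logs.json
```

From there you can load logs.json into whatever tool you need instead of copy-pasting rows from Discover.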
Hi Tyler,
Could you add some detail about how to run a small script to export documents? Currently the only way I can download search results that appear in the sample data section in Discover is to manually highlight them all, copy and then paste everything into Excel.
Thank you!
@LoriD Everybody's needs are different. To export logs out of Elasticsearch in general, you would need to retrieve all documents from an index, which you could then put into whatever system you ultimately need them in. Tools like Logstash can transform data into the forms you need in a powerful way, but to simply dump all documents from an index, you would open a scroll and retrieve every document from it. At the end of the day, you would end up with many JSON documents.
The best detail I can give is an example script that does what I'm describing. The following bash script dumps all documents in an index for the last 15 minutes, creating a dump.json
file that contains all the documents you'd normally find in a default Kibana dashboard (it uses the httpie and jq utilities):
#!/usr/bin/env bash
# Dump all documents from the last 15 minutes of an index into dump.json.
# Requires the httpie (http) and jq utilities.
usage="Usage: ${0} <elasticsearch url> <index>"
: "${1:?$usage}"
: "${2:?$usage}"
es=${1}
index=${2}

# Open a scroll and fetch the first page of results.
initial=$(http -b "$es/$index/_search" scroll==1m size==500 q=="@timestamp:>=now-15m")
scroll_id=$(echo "$initial" | jq -r '._scroll_id')
echo "Scroll id: ${scroll_id}"
hits=$(echo "$initial" | jq -r '.hits.hits | length')
echo "$initial" | jq '.hits.hits[]' > dump.json

# Keep pulling pages until Elasticsearch returns an empty one.
until [[ $hits -eq 0 ]]
do
  results=$(http -b "$es/_search/scroll" scroll=1m scroll_id="${scroll_id}")
  echo "$results" | jq '.hits.hits[]' >> dump.json
  hits=$(echo "$results" | jq -r '.hits.hits | length')
done

# Clean up the scroll context when done.
http -b DELETE "$es/_search/scroll" scroll_id="$scroll_id"
That's a very simple example of how to use the scroll API to retrieve all documents for a given index. You'd save it as, say, dump.sh, make it executable, and invoke it with the cluster URL and index name, e.g. ./dump.sh http://localhost:9200 logstash-2020.06.01.
Hi Tyler,
Is it possible to pull out only the error logs from Kibana?
Please open a new thread with your question.
© 2020. All Rights Reserved - Elasticsearch