See full log file in ELK?

I set up ELK and it works great. You can see all the logs of the containers that are running in the k8s cluster.

But how can I see, in Kibana or with a Logstash query, all the records related to a specific container? And I want to see them as a whole file.


Do you mean that you want to extract all the data that has been indexed in Elasticsearch to a local file?
What for?

Depending on how you ingested the logs, you can query in Kibana using `path`. This should be possible with both the file and S3 inputs; both supply a path by default (although S3's is in `[@metadata]`, from my recollection).

For example, just use a query such as `path: "my-files/some-log-dir/some-file.log.gz"`
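If you want to pull all of those records back out as one file, a sketch of the equivalent Query DSL body might help. This is only an illustration: the index pattern, the `path` field name, and the `@timestamp`/`message` fields are assumptions about a typical Logstash setup, not something confirmed in this thread.

```python
import json

# Hypothetical values -- adjust to your own index pattern and mapping.
INDEX = "logstash-*"      # assumed index pattern
PATH_FIELD = "path"       # field holding the source file path (assumed)
TARGET = "my-files/some-log-dir/some-file.log.gz"

# Query DSL equivalent of the Kibana search `path: "..."`:
# match every record from one file, sorted by timestamp so the
# result reads like the original log file, top to bottom.
query = {
    "query": {"match_phrase": {PATH_FIELD: TARGET}},
    "sort": [{"@timestamp": {"order": "asc"}}],
    "size": 10000,  # raise or paginate (search_after) for bigger files
}

print(json.dumps(query, indent=2))
# POST this body to http://<es-host>:9200/<index>/_search, then
# concatenate the `message` field of each hit to rebuild the file.
```

The sort on `@timestamp` is what makes the exported hits read as a contiguous file rather than in index order.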

However, I agree with @dadoonet: unless you are using this for the fairly narrow purpose of manually verifying that specific files are being ingested the way you expect, this is an unusual use case.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.