Need Logstash logs like filebeat


(Nithin Venugopal) #1

Is there any way to get Logstash to log the way Filebeat does, as below?

{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":0,"time":{"ms":9}},"total":{"ticks":10,"time":{"ms":23},"value":10},"user":{"ticks":10,"time":{"ms":14}}},"info":{"ephemeral_id":"2csdf7c-9825-4c32-a01f-0sdfsdfb9","uptime":{"ms":2869}},"memstats":{"gc_next":4194304,"memory_alloc":1644472,"memory_total":3801848,"rss":21393408}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"type":"elasticsearch"},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0},"writes":{"success":1,"total":1}},"system":{"cpu":{"cores":1},"load":{"1":0,"15":0,"5":0,"norm":{"1":0,"15":0,"5":0}}}}}}

In particular, I am looking at "harvester":{"open_files":0,"running":0}, which Filebeat emits frequently. Can Logstash do the same? And no, I don't want to run both Filebeat and Logstash.


(Magnus Bäck) #2

I don't believe this is possible. What are you trying to accomplish?


(Nithin Venugopal) #3

While I was using Filebeat, I used to monitor the open and running file counts before stopping the service. With Logstash, I would instead have to read each file path from the sincedb, check whether the recorded byte offset matches the actual file size, and only then stop the service (which is a little complex).
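As a sketch of that sincedb check: the snippet below assumes the newer file-input sincedb row layout (inode, device major, device minor, byte offset, timestamp, path); older plugin versions omit the path column, so verify against your own sincedb before relying on this. The sincedb path itself is hypothetical and depends on your pipeline configuration.

```python
import os

# Hypothetical location; the real path depends on your file input's
# sincedb_path setting and Logstash's data directory.
SINCEDB = "/usr/share/logstash/data/plugins/inputs/file/.sincedb_example"

def all_files_shipped(sincedb_path):
    """Return True if every file tracked in the sincedb has been read
    up to its current end (recorded offset == file size on disk)."""
    with open(sincedb_path) as f:
        for line in f:
            fields = line.split()
            # Expected row: inode, dev_major, dev_minor, offset,
            # timestamp, path. Rows without a path can't be verified.
            if len(fields) < 6:
                continue
            offset, path = int(fields[3]), fields[5]
            if os.path.exists(path) and os.path.getsize(path) > offset:
                return False  # bytes still pending for this file
    return True
```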

My scenario: I have a lot of Docker-based batch jobs. Each job runs in one container and produces many log files that have to be streamed to Elasticsearch. Docker kills the container once the job finishes, which deletes the log files as well. So I need a loop that keeps checking that all the files have been transferred before the container is killed.
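That pre-shutdown loop could look like the sketch below: poll the sincedb until no tracked file has unread bytes, then let the container exit. This again assumes the newer sincedb row format with the path in the last column, and the sincedb path is a placeholder to adjust for your deployment.

```python
import os
import time

# Placeholder; set to your pipeline's actual sincedb file.
SINCEDB = "/usr/share/logstash/data/plugins/inputs/file/.sincedb_example"

def pending_bytes(sincedb_path):
    """Total bytes across all sincedb-tracked files not yet read."""
    remaining = 0
    with open(sincedb_path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 6:  # no path column; skip
                continue
            offset, path = int(fields[3]), fields[5]
            if os.path.exists(path):
                remaining += max(0, os.path.getsize(path) - offset)
    return remaining

def wait_until_shipped(sincedb_path, poll_seconds=5):
    """Block until Logstash has read every tracked file to its end."""
    while pending_bytes(sincedb_path) > 0:
        time.sleep(poll_seconds)
```

Note that "read to end of file" only means the file input has consumed the bytes, not that Elasticsearch has acknowledged them, so you may still want a short grace period after the loop exits.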


(Nithin Venugopal) #4

Any help on this?


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.