Logstash logs do not appear in the index created by Filebeat


(Peter Szemesy) #1

Hi All,

I have installed the Filebeat component to log/monitor Logstash activity, as described under the Logs menu item in Kibana.
Maybe I have misconfigured something, but only the file-opening and file-closing activities are recorded in the filebeat-* indices (no pipeline-related activity, and nothing about starts or stops).

I use the standard Filebeat configuration (filebeat modules enable logstash, obviously).
I did not modify anything in fields.yml; I have only configured the Elasticsearch and Kibana connections in filebeat.yml.
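For reference, the module setup steps I followed were roughly the following (the service name assumes a systemd install; adjust for your platform):

```shell
# Enable the Logstash module (drops logstash.yml into modules.d/)
filebeat modules enable logstash

# Load the index template and Kibana dashboards for enabled modules
filebeat setup

# Restart Filebeat so the module configuration is picked up
sudo systemctl restart filebeat
```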


(Pier-Hugues Pellerin) #2

Hello @pszemesy, if you look at the Logstash log, are these messages present? By default the Logstash Filebeat module should read anything that is added to the log.


(Peter Szemesy) #3

Hi,
Yes, everything appears in the Logstash log as it should, so what I do not understand is why these log entries do not end up in the index.


(Pier-Hugues Pellerin) #4

Can you share your configuration?


(Peter Szemesy) #5

Please find it below (I have deleted the commented-out lines):
filebeat.yml:

filebeat.inputs:
- type: log
  enabled: false
  paths:
    - /var/log/*.log
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
  host: "http://100.78.196.213:5601"
  username: "elastic"
  password: "********"
  protocol: "http"
output.elasticsearch:
  hosts: ["100.78.196.210:9200","100.78.196.211:9200","100.78.196.212:9200"]

  protocol: "http"
  username: "elastic"
  password: "********"

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

modules.d/logstash.yml:

  - module: logstash
    log:
     enabled: true
    slowlog:
     enabled: false

The others (fields.yml and filebeat.reference.yml) are the standard ones from the installation.


(Pier-Hugues Pellerin) #6

Looking at your configuration, everything appears normal on my side. By default the Logstash module should pick up logs from /var/log/logstash/logstash-plain*.log. Are there any errors in the Filebeat log?
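A quick way to confirm what Filebeat is actually loading is with its built-in subcommands (these exist in Filebeat 6.x; run them as the user that owns the Filebeat config):

```shell
# Show which modules are enabled/disabled
filebeat modules list

# Validate filebeat.yml and the modules.d/*.yml files
filebeat test config

# Verify connectivity to the configured Elasticsearch output
filebeat test output
```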


(Peter Szemesy) #7

Dear Pier-Hugues,
Sorry for the late answer.
My modules.d/logstash.yml:
- module: logstash
# logs
log:
enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: [/data/logstash/log/logstash-plain.log]

  # Slow logs
  slowlog:
   enabled: false
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:

Meanwhile, in the Filebeat log I found the following:

2019-01-28T10:41:13.130+0100    INFO    log/input.go:138        Configured paths: [/data/logstash/log/logstash-plain.log]
2019-01-28T10:41:13.130+0100    INFO    crawler/crawler.go:106  Loading and starting Inputs completed. Enabled inputs: 0
2019-01-28T10:41:13.130+0100    INFO    cfgfile/reload.go:150   Config reloader started
2019-01-28T10:41:13.132+0100    INFO    log/input.go:138        Configured paths: [/data/logstash/log/logstash-plain.log]
2019-01-28T10:41:13.132+0100    INFO    elasticsearch/client.go:163     Elasticsearch url: http://xxx.xxx.xxx.xxx:9200
2019-01-28T10:41:13.133+0100    INFO    elasticsearch/client.go:163     Elasticsearch url: http://xxx.xxx.xxx.xxx:9200
2019-01-28T10:41:13.133+0100    INFO    elasticsearch/client.go:163     Elasticsearch url: http://xxx.xxx.xxx.xxx:9200
2019-01-28T10:41:13.134+0100    INFO    elasticsearch/client.go:712     Connected to Elasticsearch version 6.5.3
2019-01-28T10:41:13.136+0100    INFO    input/input.go:114      Starting input of type: log; ID: 5215442098622615622
2019-01-28T10:41:13.136+0100    INFO    cfgfile/reload.go:205   Loading of config files completed.
2019-01-28T10:41:43.130+0100    INFO    [monitoring]    log/log.go:144  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":10,"time":{"ms":16}},"total":{"ticks":30,"time":{"ms":36},"value":30},"user":{"ticks":20,"time":{"ms":20}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":7},"info":{"ephemeral_id":"18897326-61a8-411f-8ee1-922298ec8897","uptime":{"ms":33017}},"memstats":{"gc_next":4194304,"memory_alloc":3426200,"memory_total":5065968,"rss":15376384}},"filebeat":{"events":{"added":2,"done":2},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"type":"elasticsearch"},"pipeline":{"clients":2,"events":{"active":0,"filtered":2,"total":2}}},"registrar":{"states":{"current":1,"update":2},"writes":{"success":2,"total":2}},"system":{"cpu":{"cores":2},"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.025,"5":0.005}}}}}}
2019-01-28T10:41:53.138+0100    INFO    log/harvester.go:254    Harvester started for file: /data/logstash/log/logstash-plain.log
2019-01-28T10:41:54.138+0100    INFO    pipeline/output.go:95   Connecting to backoff(elasticsearch(http://100.78.196.211:9200))
2019-01-28T10:41:54.139+0100    INFO    pipeline/output.go:95   Connecting to backoff(elasticsearch(http://100.78.196.212:9200))
2019-01-28T10:41:54.139+0100    INFO    pipeline/output.go:95   Connecting to backoff(elasticsearch(http://100.78.196.210:9200))
2019-01-28T10:41:54.140+0100    INFO    elasticsearch/client.go:712     Connected to Elasticsearch version 6.5.3
2019-01-28T10:41:54.141+0100    INFO    elasticsearch/client.go:712     Connected to Elasticsearch version 6.5.3
2019-01-28T10:41:54.141+0100    INFO    elasticsearch/client.go:712     Connected to Elasticsearch version 6.5.3
2019-01-28T10:41:54.153+0100    INFO    template/load.go:129    Template already exists and will not be overwritten.
2019-01-28T10:41:54.153+0100    INFO    pipeline/output.go:105  Connection to backoff(elasticsearch(http://100.78.196.211:9200)) established
2019-01-28T10:41:54.156+0100    INFO    template/load.go:129    Template already exists and will not be overwritten.
2019-01-28T10:41:54.157+0100    INFO    pipeline/output.go:105  Connection to backoff(elasticsearch(http://100.78.196.210:9200)) established
2019-01-28T10:41:54.159+0100    INFO    template/load.go:129    Template already exists and will not be overwritten.
2019-01-28T10:41:54.160+0100    INFO    pipeline/output.go:105  Connection to backoff(elasticsearch(http://100.78.196.212:9200)) established
2019-01-28T10:42:13.130+0100    INFO    [monitoring]    log/log.go:144  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":20,"time":{"ms":5}},"total":{"ticks":50,"time":{"ms":18},"value":50},"user":{"ticks":30,"time":{"ms":13}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":11},"info":{"ephemeral_id":"18897326-61a8-411f-8ee1-922298ec8897","uptime":{"ms":63017}},"memstats":{"gc_next":4269696,"memory_alloc":3092864,"memory_total":6596712,"rss":602112}},"filebeat":{"events":{"added":4,"done":4},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":3,"batches":3,"total":3},"read":{"bytes":4178},"write":{"bytes":4690}},"pipeline":{"clients":2,"events":{"active":0,"filtered":1,"published":3,"retry":3,"total":4},"queue":{"acked":3}}},"registrar":{"states":{"current":1,"update":4},"writes":{"success":4,"total":4}},"system":{"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.025,"5":0.005}}}}}}

The Logstash log does contain lines from the last few minutes:

[2019-01-28T10:48:43,732][INFO ][logstash.outputs.file    ] Closing file /data/logdebug/ado_28/mark_mark
[2019-01-28T10:48:45,575][INFO ][logstash.outputs.file    ] Closing file /data/logdebug/keret_28/logstashfail
[2019-01-28T10:51:07,863][INFO ][logstash.outputs.file    ] Opening file {:path=>"/data/logdebug/gazd_28/alert_1"}
[2019-01-28T10:51:44,275][INFO ][logstash.outputs.file    ] Closing file /data/logdebug/gazd_28/alert_1
[2019-01-28T10:51:51,070][INFO ][logstash.outputs.file    ] Closing file /data/logdebug/ado_28/tmp_notjson
[2019-01-28T10:51:51,072][INFO ][logstash.outputs.file    ] Opening file {:path=>"/data/logdebug/ado_28/mark_mark"}
[2019-01-28T10:52:32,254][INFO ][logstash.outputs.file    ] Opening file {:path=>"/data/logdebug/gazd_28/alert_1"}
[2019-01-28T10:52:45,229][INFO ][logstash.outputs.file    ] Closing file /data/logdebug/gazd_28/alert_1
[2019-01-28T10:56:50,453][ERROR][logstash.inputs.tcp      ] Error in Netty pipeline: java.io.IOException: Connection reset by peer
[2019-01-28T10:57:54,349][INFO ][logstash.outputs.file    ] Opening file {:path=>"/data/logdebug/keret_28/logstashfail"}

But in Kibana I can see no log entries.


(Peter Szemesy) #8

It is solved!
It was a YAML formatting mistake.
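For anyone hitting the same thing: the module config pasted in post #7 lost its nesting, so the settings under log: were not indented beneath it. A correctly nested modules.d/logstash.yml would look roughly like this (a sketch; the path is the one from the earlier posts):

```yaml
- module: logstash
  log:
    enabled: true
    # Custom log path; everything belonging to `log:` must be
    # indented one level beneath it, including var.paths
    var.paths: ["/data/logstash/log/logstash-plain.log"]
  slowlog:
    enabled: false
```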


(system) closed #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.