Kibana output port not accepted in filebeat.yml

I am trying to use Filebeat to ingest logs and send them to Kibana. Here is my filebeat.yml file:

filebeat:
  inputs:
    -
      paths:
        - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log
        - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/log2.log
      input_type: log
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
      multiline.negate: true
      multiline.match: after

output:
  kibana:
    hosts: ["localhost:5601"]

And the error is:
2019-11-06T13:55:33.350-0800 ERROR instance/beat.go:878 Exiting: error initializing publisher: output type kibana undefined
Exiting: error initializing publisher: output type kibana undefined

Hey!

Filebeat pushes data to Elasticsearch, not to Kibana. Please have a look at the documentation at https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-configuration.html and let us know if you run into any problems. :slightly_smiling_face:
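Roughly, the output and setup sections would look something like this (a minimal sketch based on that documentation page; adjust the hosts to your environment):

output.elasticsearch:
  hosts: ["localhost:9200"]

setup.kibana:
  host: "localhost:5601"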

Thank you!

Hi Chris,

I went through the documentation and made two changes:

  1. Added setup.kibana:

     filebeat:
       inputs:
         -
           paths:
             - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log
             - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/log2.log
           input_type: log
           multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
           multiline.negate: true
           multiline.match: after

     output:
       elasticsearch:
         hosts: ["localhost:9200"]
     setup:
       kibana:
         hosts: ["localhost:5601"]

  2. Ran ./filebeat test config -e and got Config OK as the result.
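(I also saw a ./filebeat test output command in the docs, which is supposed to check connectivity to the configured output, so I can try that as well:)

./filebeat test config -e    # validates the configuration
./filebeat test output       # checks the connection to the Elasticsearch output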

Still, when I run the GET _cat/indices?v command in Kibana's Dev Tools, I don't see any new index created today!

These are the warnings in the Kibana terminal:
log [18:52:23.752] [info][status][plugin:file_upload@7.4.0] Status changed from yellow to green - Ready
log [18:52:23.752] [info][status][plugin:snapshot_restore@7.4.0] Status changed from yellow to green - Ready
log [18:52:23.752] [info][kibana-monitoring][monitoring] Monitoring status upload endpoint is not enabled in Elasticsearch:Monitoring stats collection is stopped
log [18:52:23.767] [info][status][plugin:maps@7.4.0] Status changed from yellow to green - Ready
log [18:52:24.369] [warning][reporting] Generating a random key for xpack.reporting.encryptionKey. To prevent pending reports from failing on restart, please set xpack.reporting.encryptionKey in kibana.yml
log [18:52:24.373] [info][status][plugin:reporting@7.4.0] Status changed from uninitialized to green - Ready
log [18:52:24.561] [info][listening] Server running at http://localhost:5601
log [18:52:24.575] [info][server][Kibana][http] http server running at http://localhost:5601
log [18:52:24.694] [info][status][plugin:spaces@7.4.0] Status changed from yellow to green - Ready
log [18:52:24.745] [warning][maps] Error scheduling telemetry task, received [task:Maps-maps_telemetry]: version conflict, document already exists (current version [31]): [version_conflict_engine_exception] [task:Maps-maps_telemetry]: version conflict, document already exists (current version [31]), with { index_uuid="vpLXqEc_QlOXhqnPG1bOeQ" & shard="0" & index=".kibana_task_manager_1" }
log [18:52:24.746] [warning][telemetry] Error scheduling task, received [task:oss_telemetry-vis_telemetry]: version conflict, document already exists (current version [7]): [version_conflict_engine_exception] [task:oss_telemetry-vis_telemetry]: version conflict, document already exists (current version [7]), with { index_uuid="vpLXqEc_QlOXhqnPG1bOeQ" & shard="0" & index=".kibana_task_manager_1" }

Hi!

Can you execute filebeat in debug mode and look for errors there?

Like ./filebeat -e -d "*".

Try to see if something goes wrong there and if events can successfully be pushed to Elasticsearch.

Also, could you provide your full configuration, filebeat.yml? (Try to format it by surrounding it with triple backticks.)

Hi,

This is the command I am running: ./filebeat -e -v -d "*" -c filebeat.yml.

The filebeat.yml file looks like this (PS: thanks for the triple-backtick suggestion, I didn't know about it):

filebeat:
  inputs:
    -
      paths:
        - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log
        - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/log2.log
      input_type: log
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
      multiline.negate: true
      multiline.match: after

output:
  elasticsearch:
    hosts: ["localhost:9200"]
setup:
  kibana:
    hosts: ["localhost:5601"]

The output doesn't show any errors, but the last 10 lines repeat in a loop. I can see that Filebeat has started, but I am not sure why the registrar/migrate lines show false.

2019-11-07T11:21:19.518-0800 INFO instance/beat.go:422 filebeat start running.
2019-11-07T11:21:19.518-0800 INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
2019-11-07T11:21:19.523-0800 DEBUG [test] registrar/migrate.go:159 isFile(/home/mehak/Documents/filebeat-7.4.0-linux-x86_64/data/registry) -> false
2019-11-07T11:21:19.523-0800 DEBUG [test] registrar/migrate.go:159 isFile() -> false
2019-11-07T11:21:19.524-0800 INFO log/input.go:152 Configured paths: [/home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/log2.log]
2019-11-07T11:21:19.524-0800 INFO input/input.go:114 Starting input of type: log; ID: 4880385400357078809
2019-11-07T11:21:19.524-0800 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2019-11-07T11:21:19.524-0800 DEBUG [input] log/input.go:191 Start next scan
2019-11-07T11:21:19.524-0800 DEBUG [input] log/input.go:212 input states cleaned up. Before: 0, After: 0, Pending: 0
2019-11-07T11:21:29.526-0800 DEBUG [input] input/input.go:152 Run input
2019-11-07T11:21:29.526-0800 DEBUG [input] log/input.go:191 Start next scan
2019-11-07T11:21:29.526-0800 DEBUG [input] log/input.go:212 input states cleaned up. Before: 0, After: 0, Pending: 0
2019-11-07T11:21:39.527-0800 DEBUG [input] input/input.go:152 Run input
2019-11-07T11:21:39.527-0800 DEBUG [input] log/input.go:191 Start next scan
2019-11-07T11:21:39.527-0800 DEBUG [input] log/input.go:212 input states cleaned up. Before: 0, After: 0, Pending: 0
2019-11-07T11:21:49.525-0800 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":20,"time":{"ms":22}},"total":{"ticks":30,"time":{"ms":38},"value":0},"user":{"ticks":10,"time":{"ms":16}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":5},"info":{"ephemeral_id":"30dfdaa0-8582-42c4-b43a-11311914f54a","uptime":{"ms":30028}},"memstats":

Thank you!

If I know Filebeat is running, how do I verify that it is picking up the right log files, as specified in the path?

Thanks!

It seems that the paths are correctly configured:

Seen at:

2019-11-07T11:21:19.524-0800 INFO log/input.go:152 Configured paths: [/home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/log2.log]

One thing you can do is to try a minimal configuration like the one at https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log. This way you will be able to check whether you can collect logs at all, and then incrementally add the multiline settings to identify what might cause the problem. :slightly_smiling_face:

While I look into multiline, I also think there might be an issue with the log file structure, because I know the pipelines are created but data isn't going through, and hence no index is created on the Kibana console. A sample line from the log file looks like this:

08/10/2019 12:14:48 599   (null)                 DEBUG   27   GetIncidentTime for Incident Id xxxxxxxx on thread 04fd1833-8275-46ff-816f-9acf0c1f7724:80759 on Thread 27
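Looking at that line, each entry starts with a date like 08/10/2019 rather than 2019-10-08, so maybe my multiline pattern needs to match the slash format instead. Something like this might be closer (just a guess on my part, assuming every new entry starts with that date format):

multiline.pattern: '^[0-9]{2}/[0-9]{2}/[0-9]{4}'
multiline.negate: true
multiline.match: after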

And the location of these log files doesn't have to be inside the Filebeat folder, right?

No need for a specific location. Log files can be anywhere, for instance /var/log/system.log.

@ChrsMark I ran this command suggested in the Filebeat documentation:

./filebeat -e --modules system,nginx,mysql

An index is created in Kibana, but I am afraid it doesn't contain the logs from the paths I have defined. Below is my filebeat.yml:

filebeat:
  inputs: 
    -
      paths:
        - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log
        #- /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/log2.log
      input_type: log
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
      multiline.negate: true
      multiline.match: after
  
output:
  elasticsearch:
    hosts: ["localhost:9200"]
    index: "filebeat-%{[beat.version]}-%{+yyyy.MM.dd}"
setup:
  kibana:
    hosts: ["localhost:5601"]
  template:
    name: "filebeat"
    pattern: "filebeat-*"

The terminal output shows the configured paths:

2019-11-08T12:26:41.810-0800 INFO log/input.go:152 Configured paths: [/home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log]
2019-11-08T12:26:41.810-0800 INFO input/input.go:114 Starting input of type: log; ID: 4538948564200774801
2019-11-08T12:26:41.810-0800 INFO log/input.go:152 Configured paths: [/var/log/auth.log* /var/log/secure*]
2019-11-08T12:26:41.810-0800 INFO input/input.go:114 Starting input of type: log; ID: 15011157445397900703
2019-11-08T12:26:41.810-0800 INFO log/input.go:152 Configured paths: [/var/log/messages* /var/log/syslog*]
2019-11-08T12:26:41.810-0800 INFO input/input.go:114 Starting input of type: log; ID: 6299065798408291361
2019-11-08T12:26:41.811-0800 INFO log/input.go:152 Configured paths: [/var/log/nginx/access.log*]
2019-11-08T12:26:41.811-0800 INFO input/input.go:114 Starting input of type: log; ID: 5161657321392416363
2019-11-08T12:26:41.811-0800 INFO log/input.go:152 Configured paths: [/var/log/nginx/error.log*]
2019-11-08T12:26:41.811-0800 INFO input/input.go:114 Starting input of type: log; ID: 13602457330663296174
2019-11-08T12:26:41.812-0800 INFO log/input.go:152 Configured paths: [/var/log/mysql/mysql-slow.log* /var/lib/mysql/mehak-VirtualBox-slow.log]
2019-11-08T12:26:41.812-0800 INFO input/input.go:114 Starting input of type: log; ID: 8127777535050556082
2019-11-08T12:26:41.812-0800 INFO log/input.go:152 Configured paths: [/var/log/mysql/error.log* /var/log/mysqld.log*]

This command enables the system, nginx, and mysql modules. Since you want to collect logs only from your desired path, you should run Filebeat like ./filebeat -e -d "*" instead.
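If those modules got enabled earlier, they can be switched off again before re-running, for example (using the module names from your command):

./filebeat modules list
./filebeat modules disable system nginx mysql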

Thanks for addressing this. I ran the above command, but no new index is being created. I changed the index name from filebeat (which was created earlier by the modules command), and my updated output section in filebeat.yml looks like this:

output:
  elasticsearch:
    hosts: ["localhost:9200"]
    index: "file-%{[beat.version]}-%{+yyyy.MM.dd}"
setup:
  kibana:
    hosts: ["localhost:5601"]
  template:
    name: "file"
    pattern: "file-*"

The previously created filebeat index:

yellow filebeat-7.4.0-2019.11.08-000001 open lMcCHhWuT9ecTfsI4OyGEA 1 1   3982      0   676kb 2019-11-08T20:06:41.949Z

Unless an index is created, I cannot check whether the logs from the right path are being ingested.

Just try with a simple configuration:

filebeat.inputs:
- type: log
  paths:
    - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log

output:
  elasticsearch:
    hosts: ["localhost:9200"]
setup:
  kibana:
    hosts: ["localhost:5601"]

If the file you set is being populated with logs (you can check this with a tail command while Filebeat is running), then you should see the events logged in the console where you run Filebeat.

Thanks.

Do you mean assigning true to a tail option in filebeat.inputs, like this?

filebeat.inputs:
- type: log
  paths:
    - /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log
  tail: true

Events have been logging in the Filebeat console like this:

2019-11-08T15:00:07.628-0800	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":40,"time":{"ms":1}},"total":{"ticks":90,"time":{"ms":6},"value":90},"user":{"ticks":50,"time":{"ms":5}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":5},"info":{"ephemeral_id":"4a57a047-626e-4b9f-b358-3d40113a83c1","uptime":{"ms":450017}},"memstats":{"gc_next":7135376,"memory_alloc":4195752,"memory_total":14031184},"runtime":{"goroutines":20}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":0}},"system":{"load":{"1":0.23,"15":0.45,"5":0.32,"norm":{"1":0.0575,"15":0.1125,"5":0.08}}}}}}
2019-11-08T15:00:07.689-0800	DEBUG	[input]	input/input.go:152	Run input
2019-11-08T15:00:07.689-0800	DEBUG	[input]	log/input.go:191	Start next scan
2019-11-08T15:00:07.689-0800	DEBUG	[input]	log/input.go:212	input states cleaned up. Before: 0, After: 0, Pe

Hi,

I mean to run tail -f /home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log (see http://man7.org/linux/man-pages/man1/tail.1.html) and check whether new logs are being written to the file while Filebeat is running. The output you provided shows that nothing goes wrong and that there is simply nothing to parse.
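Note that because the directory name contains parentheses, the path will most likely need to be quoted in the shell, for example:

tail -f '/home/mehak/Downloads/Dispatcher-ExtTicketId-24748775/Dispstcher-ExtTicketId-24748775/log.txt(2)/logz.log'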

Thank you!
