No data being discovered in Kibana front end

I am installing the entire ELK stack and have been following your getting started guide. My issue is that I am not seeing logs in Kibana, so I tried combing through the logs to find out why. I seem to be getting errors between Logstash and Filebeat. I have been looking around your site for others with similar issues; I have some of the problems mentioned, but when I try the fixes posted they do not work for me. I found the following error in /var/log/logstash/logstash-plain.log (nothing in logstash.log or logstash.err):
```
[2017-02-06T06:25:14,537][ERROR][logstash.inputs.beats ] Invalid setting for beats input plugin:
```
So I found a write-up you had (Configure Filebeat for All Logs) where you mentioned starting up Filebeat with the -v -d "*" parameters. I did that and I get the following:

```
[ ok ] Starting filebeat (via systemctl): filebeat.service.
```
So Filebeat seems to be starting up with no issues.
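For reference, the same debug output can also be had in the foreground (assuming the packaged config path; this is just how I double-check it by hand, not something from the guide):

```
# Run Filebeat in the foreground with verbose logging and all debug selectors,
# writing to stderr instead of /var/log/filebeat
sudo filebeat -e -v -d "*" -c /etc/filebeat/filebeat.yml
```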
When I look at the log in /var/log/filebeat/filebeat I see the following error:
```
2017-02-06T20:50:13Z ERR Connecting error publishing events (retrying): dial tcp 127.0.0.1:5044: getsockopt: connection refused
```
That brought me to this page: https://discuss.elastic.co/t/filebeat-cant-talk-to-logstash-with-connection-refused-error/56353/3. I went over my filebeat.yml and logstash.conf to check whether they were configured according to the documentation, and they look correct to me.
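Since the error is a plain connection refused on 5044, I have also been sanity-checking whether anything is listening on that port at all (assuming ss and nc are available; these are my own checks, not from the guide):

```
# See whether Logstash has bound the Beats port
sudo ss -tlnp | grep 5044

# Probe the port directly; "connection refused" here matches the Filebeat error
nc -zv 127.0.0.1 5044
```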
/usr/share/logstash/bin/logstash.conf (https://www.elastic.co/guide/en/beats/libbeat/1.2/logstash-installation.html#logstash-setup)
```
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
```
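For what it is worth, a config test along these lines should tell me whether that file parses on its own (assuming the packaged settings directory is /etc/logstash; -t is shorthand for --config.test_and_exit):

```
# Parse the config and exit without starting the pipeline
sudo /usr/share/logstash/bin/logstash -t -f /usr/share/logstash/bin/logstash.conf --path.settings /etc/logstash
```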
__________________________________________________

/etc/filebeat/filebeat.yml 
(https://www.elastic.co/guide/en/beats/filebeat/5.2/filebeat-configuration.html) and (https://www.elastic.co/guide/en/beats/filebeat/5.2/config-filebeat-logstash.html)

```
#=========================== Filebeat prospectors =============================
filebeat.prospectors:
- input_type: log
  # Paths that should be crawled and fetched
  paths:
    - /var/log/*.log
    - /var/log/*/*.log
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["127.0.0.1:5044"]
```
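To double-check that this file at least parses, I can run Filebeat's config test (the -configtest flag in the 5.x beats; the config path is assumed to be the packaged default):

```
# Parse /etc/filebeat/filebeat.yml and exit without shipping anything
sudo filebeat -configtest -e -c /etc/filebeat/filebeat.yml
```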

From this link (https://discuss.elastic.co/t/info-error-publishing-events-retrying-read-tcp/68742/22) I realize I need to provide more information, such as software versions and how I installed everything. I used the download-and-install-myself option.

Software versions:
filebeat version 5.1.2 (amd64), libbeat 5.1.2
logstash version 5.2.0
Elasticsearch version 2.4.3, Build: d38a34e/2016-12-07T16:28:56Z, JVM: 1.8.0_111

Following information from that same link above, it was suggested that I set my Java to prefer IPv4 (https://www.elastic.co/guide/en/logstash/current/config-setting-files.html#_settings_files), so I did that and restarted Logstash, Filebeat, Kibana, and Elasticsearch.
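I assume the right place for that was /etc/logstash/jvm.options with -Djava.net.preferIPv4Stack=true, so this is how I have been verifying the flag is actually there (please correct me if that is the wrong file):

```
# Assumption: the IPv4 preference lives in the packaged jvm.options file
# as -Djava.net.preferIPv4Stack=true
grep -n 'preferIPv4' /etc/logstash/jvm.options
```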

After the restart I still see this in logstash-plain.log. It is complaining about my Logstash config file, but I have what is in the directions, so I am not sure what is wrong.

```
[2017-02-07T15:45:43,997][ERROR][logstash.agent           ] fetched an invalid config {:config=>"input {\n  beats {\n    port => 5044\n    ssl => true\n    ssl_certificate => \"/etc/pki/tls/certs/logstash-forwarder.crt\"\n    ssl_key => \"/etc/pki/tls/private/logstash-forwarder.key\"\n  }\n}\n\nfilter {\n  if [type] == \"syslog\" {\n    grok {\n      match => { \"message\" => \"%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\\[%{POSINT:syslog_pid}\\])?: %{GREEDYDATA:syslog_message}\" }\n      add_field => [ \"received_at\", \"%{@timestamp}\" ]\n      add_field => [ \"received_from\", \"%{host}\" ]\n    }\n    syslog_pri { }\n    date {\n      match => [ \"syslog_timestamp\", \"MMM  d HH:mm:ss\", \"MMM dd HH:mm:ss\" ]\n    }\n  }\n}\n\noutput {\n  elasticsearch {\n    hosts => [\"localhost:9200\"]\n    sniffing => true\n    manage_template => false\n    index => \"%{[@metadata][beat]}-%{+YYYY.MM.dd}\"\n    document_type => \"%{[@metadata][type]}\"\n  }\n}\n\ninput {\n  beats {\n    port => 5044\n    ssl_certificate => \"/etc/ssl/logstash-forwarder.crt\"\n    ssl_key => \"/etc/ssl/logstash-forwarder.key\"\n    congestion_threshold => \"40\"\n  }\n}\n\nfilter {\nif [type] == \"syslog\" {\n    grok {\n      match => { \"message\" => \"%{SYSLOGLINE}\" }\n    }\n\n    date {\nmatch => [ \"timestamp\", \"MMM  d HH:mm:ss\", \"MMM dd HH:mm:ss\" ]\n}\n  }\n\n}\n\noutput {\n  elasticsearch {\n    hosts => \"localhost:9200\"\n    manage_template => false\n    index => \"%{[@metadata][beat]}-%{+YYYY.MM.dd}\"\n    document_type => \"%{[@metadata][type]}\"\n  }\n}\n\n\n\n\n\n\n\n\n", :reason=>"Something is wrong with your configuration."}
[2017-02-07T15:47:11,723][WARN ][logstash.inputs.beats    ] You are using a deprecated config setting "congestion_threshold" set in beats. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. This option is now deprecated since congestion control is done automatically If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"congestion_threshold", :plugin=><LogStash::Inputs::Beats port=>5044, ssl_certificate=>"/etc/ssl/logstash-forwarder.crt", ssl_key=>"/etc/ssl/logstash-forwarder.key", congestion_threshold=>"40", id=>"63fe4a8575981e0aae988853b865f2af1e7f3ab3-6">}
[2017-02-07T15:47:11,726][ERROR][logstash.inputs.beats    ] Invalid setting for beats input plugin:
```
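Looking at that dumped config, it seems to be the concatenation of everything Logstash loads, and it contains two beats inputs on port 5044 (one of them with ssl => true pointing at /etc/pki/tls certs), not just the single file I posted above. So I have been listing and re-testing whatever is in the pipeline config directory (paths assume the .deb/.rpm layout with configs in /etc/logstash/conf.d):

```
# List every file Logstash will concatenate into the pipeline config
ls -l /etc/logstash/conf.d/

# Test the combined configuration the same way Logstash loads it
sudo /usr/share/logstash/bin/logstash -t -f /etc/logstash/conf.d/ --path.settings /etc/logstash
```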

And in the Filebeat log I see this:
```
2017-02-07T15:43:50Z INFO Metrics logging every 30s
```
That is the last line in /var/log/filebeat/filebeat.

Sorry for the long book; I am trying to provide as much information as possible while I keep searching for a fix. Any help or guidance is appreciated. Thank you.

I moved this post from Kibana to Logstash since that's where the issue seems to be (or possibly beats).
