How to configure Filebeat to collect logs from different servers and send them to Logstash

Hi All,

I'm trying to collect logs (not syslog) from several different servers and ship them through Filebeat to Logstash, then to Elasticsearch and Kibana. But in Kibana I'm only getting system logs, not the other logs I'm expecting.
Kindly help asap.

Thanks in Advance

My filebeat.yml file is:

```yaml
#=========================== Filebeat inputs =============================

filebeat.inputs:

- type: log

  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - /home/admin/Documents/Infrrd/wildfly/standalone/log/*.log
    #- c:\programdata\elasticsearch\logs\*

  #exclude_lines: ['^DBG']
  #include_lines: ['^ERR', '^WARN']
  #exclude_files: ['.gz$']

  #fields:
  #  level: debug
  #  review: 1

  #multiline.pattern: ^\[
  #multiline.negate: false
  #multiline.match: after

#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: true

  # Period on which files under path should be checked for changes
  reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
```

> **Configuration file logstash-conf.conf is**

```
input {
  beats {
    port => 5044
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "index-%{+YYYY.MM.dd}"
  }
}
```

Hi @Vinit_Kumar

Maybe I'm missing something, but you have `enabled: false` in your config file.
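For example, a minimal input section with `enabled: true` might look like this (reusing the path from your own config):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/admin/Documents/Infrrd/wildfly/standalone/log/*.log
```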

Also, maybe you haven't activated any modules. Use `filebeat modules list` to see enabled and disabled modules. If you want to activate one, just run `filebeat modules enable [module]`.

I hope it helps

Hi @Mario_Castro

First of all thank you so much for replying and appreciate your efforts.

I tried using `enabled: true` and I enabled the system, logstash, and redis modules, but it's still not working. Is parsing in Logstash mandatory if we are using server logs instead of syslog?

The main thing is that I'm new to ELK, so I don't know where to set the paths for the different servers' logs in Filebeat.

Logstash is not mandatory. What I'd try is to set console output with the paths you need, just to see that everything is working. You should see JSON events being printed to the console.
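For example, you can temporarily switch to a console output (only one output can be enabled at a time, so comment out `output.logstash` while testing):

```yaml
output.console:
  pretty: true
```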

Input config should be as simple as this. I mean that you shouldn't need more to make it work in its simplest way.

Check permissions too. Also note that Filebeat won't re-process files that it has already read.

I tried, but it's not working.

Just now I installed ELK again, following the official ELK website documentation.
Do you know how to change the default input path for logs in filebeat.yml?

The default is /var/log/*.log; I want to use a different directory.

It's in the link above.

```yaml
- type: log
  paths:
    - /var/log/messages
    - /var/log/*.log
    - /my/custom/path
```

If this is not working, then Filebeat is not using the correct configuration file. Use `-c [path-to-configuration-file]` when launching Filebeat to point it at the right one.
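For example (the path below is just an illustration; adjust it to wherever your config actually lives):

```shell
# -e logs to stderr so you can watch what Filebeat is doing,
# -c points it at an explicit configuration file
filebeat -e -c /etc/filebeat/filebeat.yml
```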

@Mario_Castro Thanks for replying again.

```
admin@inflap167-HP-EliteBook-840-G2: sudo netstat -ntlp | grep LISTEN

tcp 0 0* LISTEN 1914/java
tcp 0 0* LISTEN 2653/dnsmasq
tcp 0 0* LISTEN 1117/cupsd
tcp 0 0* LISTEN 1914/java
tcp 0 0* LISTEN 6643/node
tcp 0 0* LISTEN 1914/java
tcp 0 0* LISTEN 1788/mongod
tcp 0 0* LISTEN 1548/python
tcp 0 0* LISTEN 1551/mysqld
tcp 0 0* LISTEN 1565/redis-server 1
tcp 0 0* LISTEN 1914/java
tcp6 0 0 :::5044 :::* LISTEN 16234/java
tcp6 0 0 :::* LISTEN 16122/java
tcp6 0 0 :::8983 :::* LISTEN 1918/java
tcp6 0 0 ::1:631 :::* LISTEN 1117/cupsd
tcp6 0 0 :::45657 :::* LISTEN 1287/java
tcp6 0 0 :::* LISTEN 16234/java
tcp6 0 0 :::6627 :::* LISTEN 1760/java
tcp6 0 0 :::9092 :::* LISTEN 1761/java
tcp6 0 0 :::32965 :::* LISTEN 1761/java
tcp6 0 0 :::2181 :::* LISTEN 1287/java
tcp6 0 0 :::9898 :::* LISTEN 1764/java
tcp6 0 0 :::* LISTEN 16122/java
```

Do you have any idea why 5044 doesn't show an IP address? I'm passing hosts as localhost:5044 in filebeat.yml, and the Logstash configuration file's input port is 5044.

But in Kibana I'm still not getting my custom logs.


But in filebeat.yml I'm passing the input as below:


```yaml
- type: log
  enabled: true
  paths:
    - /var/log/custom-log.log
    #- /home/admin/ELK/logs/*.log
```

I was getting a connection refused error. It was because I was using OpenJDK. I installed Oracle JDK, and now it's working fine with Oracle JDK and less configuration.

Special thanks to @Mario_Castro for replying and supporting.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.