Filebeat not sending logs to Logstash

elk.home: /tmp/ELK

filebeat.prospectors:
- input_type: log
  paths:
    - ${elk.home}/LogStashOutputFormatted/SES_VG1/*.csv
  fields:
    log_type: sesvg1
- input_type: log
  paths:
    - ${elk.home}/LogStashOutputFormatted/SES_VG2/*.csv
  fields:
    log_type: sesvg2
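For reference, each CSV row has to line up with the five columns that the Logstash filter below maps: nodeName, APIName, Value, Share, Total. A hypothetical sample line (values invented purely for illustration):

node01,getSessionAPI,OK,12.5,4800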

#================================ Outputs =====================================

# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.

#-------------------------- Logstash output ------------------------------
output.logstash:
  # Array of hosts to connect to.
  hosts: ["10.169.95.54:5044"]

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

Logstash.conf

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "sesvg1" {
    csv {
      separator => ","
      columns => ["nodeName","APIName","Value","Share","Total"]
    }
  }
  else if [fields][log_type] == "sesvg2" {
    csv {
      separator => ","
      columns => ["nodeName","APIName","Value","Share","Total"]
    }
  }

  mutate {
    convert => { "nodeName" => "string" }
    convert => { "APIName" => "string" }
    convert => { "Value" => "string" }
    convert => { "Share" => "float" }
    convert => { "Total" => "integer" }
  }
}

output {
  if [fields][log_type] == "sesvg1" {
    elasticsearch {
      action => "index"
      hosts => "http://localhost:9200"
      index => "sesvg1newbeat1"
    }
  } else if [fields][log_type] == "sesvg2" {
    elasticsearch {
      action => "index"
      hosts => "http://localhost:9200"
      index => "sesvg2newbeat2"
    }
  }
}
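To rule out pipeline syntax problems, the Logstash config can be checked without running it. A hedged example, assuming Logstash 5.x under its standard install layout:

bin/logstash -f logstash.conf --config.test_and_exit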

Here are the configurations. I don't see any indexes getting created in Kibana. When I tried with a single log file, I was able to process it and see the data getting populated.

Additional fields are commented out.

Can you please share the Filebeat logs?

Hi,

You should use the format below for log file paths (each - entry is a separate path):

paths:
  - /var/log/*.log
  - /var/log/elasticsearch/*
  - ${LOG}/filebeat
  - c:\programdata\elasticsearch\logs\*
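Putting that format into a complete prospector block for this case (a sketch only; the path and field name mirror the original post):

filebeat.prospectors:
- input_type: log
  paths:
    - /tmp/ELK/LogStashOutputFormatted/SES_VG1/*.csv
  fields:
    log_type: sesvg1

Note that fields: sits at the same indent level as paths:, with the custom key nested one level below it.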

Thanks,
Harsh Bajaj

@sandeepnarla22322 Hello, it would also help to see the log from Filebeat. If you start Filebeat with ./filebeat -v -e -d "*" -c yourconfig.yml, the log should tell us what Filebeat sees and whether it can connect to Logstash.

Thanks Pier,

I have a scenario where I have to process logs from 100 production nodes at the same time, and the logs are rotated every five minutes.

I am trying to test this scenario by dumping a new file into the folder every five minutes to see if it processes the logs, but I don't see my logs being processed.
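For five-minute rotation, the prospector timing options are the ones to watch. A minimal sketch, assuming Filebeat 5.x (option names from its reference config; the values here are illustrative):

filebeat.prospectors:
- input_type: log
  paths:
    - /tmp/ELK/LogStashOutput/SES_VG1/*.csv
  scan_frequency: 10s   # how often the glob is re-scanned for new files
  ignore_older: 10m     # skip files not modified within the last 10 minutes
  close_inactive: 5m    # release the file handle once a file stops changing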

###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.full.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

#=========================== Filebeat prospectors =============================

elk.home: /tmp/ELK

filebeat.prospectors:
- input_type: log
  ignore_older: 10m
  scan_frequency: 10s

  # kafka log files prospector
  paths:
    - ${elk.home}/LogStashOutput/SES_VG1/*.csv
  fields:
    logtype: sesvg1

#================================ Outputs =====================================

# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.

#-------------------------- Logstash output ------------------------------
output.logstash:
  # Array of hosts to connect to.
  hosts: ["10.169.95.54:5044"]

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

Here are the config files. My sample log is node1.csv, and I dumped node2.csv into the same folder after the fifth minute.
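To reproduce that test loop by hand, something like the following would do (a hypothetical sketch; the path matches the config above):

cp node1.csv /tmp/ELK/LogStashOutput/SES_VG1/
sleep 300   # wait five minutes, then drop the next file
cp node2.csv /tmp/ELK/LogStashOutput/SES_VG1/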

Thanks for including your configuration, but I want to see the output of the command:

./filebeat -v -e -d "*" -c yourconfig.yml

The command above should dump to STDOUT what Filebeat is doing; this will help us diagnose the problem.

Thanks, it is fixed now.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.