Shipping a CSV file using Filebeat to Logstash and then from Logstash to Elasticsearch

Hi,

I'm trying to ship a CSV file using Filebeat to Logstash and then on to Elasticsearch, but nothing is being fed into Logstash or Elasticsearch.

This is how my filebeat.yml looks:

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /opt/logstash/logs/date_wise2.csv
        #- c:\programdata\elasticsearch\logs*
      input_type: csv
      document_type: date_wise
  registry_file: /var/lib/filebeat/registry

output:
  logstash:
    # The Logstash hosts
    hosts: ["localhost:5044"]

    # Number of workers per Logstash host.
    worker: 1

    # Set gzip compression level.
    compression_level: 3
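
For reference, running Filebeat in the foreground with debug logging shows whether any events are actually being published (the config path below assumes a standard package install, so adjust it to your layout):

# Run Filebeat in the foreground, logging to stderr, with the
# "publish" debug selector enabled to see each event as it is sent
sudo filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"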

And this is how my elkconf.conf looks:

input {
  beats {
    host => "localhost"
    port => "5044"
  }
}

filter {
  if [type] == "date_wise" {
    csv {
      columns => ["Date","Critical","Major","Warning","Minor","Normal","Grand Total"]
      separator => ","
      skip_empty_columns => "true"
    }
    mutate {
      convert => [ "Critical", "integer" ]
      convert => [ "Major", "integer" ]
      convert => [ "Warning", "integer" ]
      convert => [ "Minor", "integer" ]
      convert => [ "Normal", "integer" ]
      convert => [ "Grand Total", "integer" ]
    }

    if [Date] =~ "Date" {
      drop {}
    }
    if [Date] =~ "CONCATENATE Message" {
      drop {}
    }
    if [Date] =~ "Grand Total" {
      drop {}
    }

    date {
      match => ["Date", "MM/dd/YYYY"]
    }
  }
}

output {
  if [type] == "date_wise" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "date-wise%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}
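
As a sanity check, the pipeline file can be syntax-tested before starting the service (the logstash binary location matches a 2.x install under /opt/logstash; the config file path is an assumption):

# Validate the pipeline configuration and exit without running it
/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/elkconf.conf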

And this is how my Logstash logs look:

cat /var/log/logstash/logstash.log

{:timestamp=>"2016-05-02T10:25:51.641000+0530", :message=>"SIGTERM received. Shutting down the agent.", :level=>:warn}
{:timestamp=>"2016-05-02T10:25:51.643000+0530", :message=>"stopping pipeline", :id=>"main"}
{:timestamp=>"2016-05-02T10:25:52.534000+0530", :message=>"Pipeline main has been shutdown"}
{:timestamp=>"2016-05-02T10:26:10.156000+0530", :message=>"Pipeline main started"}

I'm not sure where the problem is, but I'm wondering whether the fact that I have not specified
start_position => "beginning" could be causing the issue.
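
A quick way to confirm whether anything reached Elasticsearch at all is to list the indices (this assumes Elasticsearch is listening on its default port):

# List all indices with their document counts; an index matching
# date-wise* should appear once data starts flowing
curl 'http://localhost:9200/_cat/indices?v'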

Thanks

Hi Gaurav,

Have you resolved this issue? If so, kindly share the solution, because I am having the same problem.

The pipeline has started fine, but the data is not being pushed to Elasticsearch.

Thanks,
Raj

Hi Gaurav,

First, your filebeat.yml file is not correct:
filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /opt/logstash/logs/date_wise2.csv
      # log: reads every line of the log file (default)
      # stdin: reads standard input
      # a "csv" input_type doesn't exist
      input_type: log
      document_type: date_wise
  registry_file: /var/lib/filebeat/registry

output:
  logstash:
    hosts: ["localhost:5044"]
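
To check that the corrected file parses before restarting, recent Filebeat 1.x releases accept a -configtest flag (the config path below assumes a package install):

# Parse the configuration and exit; reports an error if the YAML is invalid
filebeat -configtest -c /etc/filebeat/filebeat.yml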

The rest of your configuration seems to be correct.
If you want to run the same test over and over again, you have to delete the registry file (by default /var/lib/filebeat/registry), because Filebeat records there how far it has read each file and will not re-send lines it has already shipped.
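
For example, assuming a service-managed install and the default registry path:

# Stop Filebeat, remove its registry so the CSV is re-read from the
# beginning, then start it again
sudo service filebeat stop
sudo rm /var/lib/filebeat/registry
sudo service filebeat start

Also note that start_position only applies to Logstash's file input; with the beats input it is Filebeat's registry that decides where reading resumes, so removing the registry is the way to re-ship the whole file.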

Best regards,
Tom