Hi,
I'm trying to ship a CSV file with Filebeat to Logstash and then on to Elasticsearch, but nothing is feeding into Logstash or Elasticsearch.
This is how my filebeat.yml looks:
filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /opt/logstash/logs/date_wise2.csv
        #- c:\programdata\elasticsearch\logs*
      # Filebeat only supports the "log" and "stdin" input types;
      # the CSV parsing itself happens in the Logstash csv filter.
      input_type: log
      document_type: date_wise
  registry_file: /var/lib/filebeat/registry
output:
  logstash:
    # The Logstash hosts
    hosts: ["localhost:5044"]
    # Number of workers per Logstash host.
    worker: 1
    # Set gzip compression level.
    compression_level: 3
and this is how my elkconf.conf looks:
input {
  beats {
    host => "localhost"
    port => 5044
  }
}
filter {
  if [type] == "date_wise" {
    csv {
      columns => ["Date","Critical","Major","Warning","Minor","Normal","Grand Total"]
      separator => ","
      skip_empty_columns => true
    }
    mutate {
      convert => [ "Critical", "integer" ]
      convert => [ "Major", "integer" ]
      convert => [ "Warning", "integer" ]
      convert => [ "Minor", "integer" ]
      convert => [ "Normal", "integer" ]
      convert => [ "Grand Total", "integer" ]
    }
    # Drop the header row and the summary rows.
    if [Date] =~ "Date" {
      drop {}
    }
    if [Date] =~ "CONCATENATE Message" {
      drop {}
    }
    if [Date] =~ "Grand Total" {
      drop {}
    }
    date {
      match => ["Date", "MM/dd/YYYY"]
    }
  }
}
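To sanity-check the filter chain offline, here is what I expect the csv parse, the integer conversion, the row drops, and the date match to do to a sample line, sketched in plain Python (a rough approximation of the Logstash filters above, not the real plugins):

```python
import csv
import io
from datetime import datetime

COLUMNS = ["Date", "Critical", "Major", "Warning", "Minor", "Normal", "Grand Total"]

def apply_filter(line):
    # csv filter: split one line into named fields per the columns setting.
    values = next(csv.reader(io.StringIO(line)))
    event = dict(zip(COLUMNS, values))
    # The three `=~` conditionals drop header/summary rows whose Date field
    # contains one of these substrings.
    for marker in ("Date", "CONCATENATE Message", "Grand Total"):
        if marker in event.get("Date", ""):
            return None
    # mutate/convert: cast the count columns to integers.
    for field in COLUMNS[1:]:
        if field in event:
            event[field] = int(event[field])
    # date filter: the Joda pattern "MM/dd/YYYY" corresponds to %m/%d/%Y.
    event["@timestamp"] = datetime.strptime(event["Date"], "%m/%d/%Y")
    return event

print(apply_filter("05/02/2016,3,7,12,4,40,66"))
print(apply_filter("Date,Critical,Major,Warning,Minor,Normal,Grand Total"))  # None: header row dropped
```

Running data rows through this gives me the typed event I expect, so I believe the filter logic itself is sound and the problem is upstream of Logstash.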
output {
  if [type] == "date_wise" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "date-wise%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}
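One thing I noticed while writing this up: the index pattern has no separator between the prefix and the date, so the daily index names should come out as below (a quick sketch of how the %{+YYYY.MM.dd} sprintf pattern expands, assuming it formats the event's @timestamp):

```python
from datetime import datetime

def index_name(ts):
    # Mirror the Logstash sprintf date pattern %{+YYYY.MM.dd}, appended
    # directly after the "date-wise" prefix with no separator.
    return "date-wise" + ts.strftime("%Y.%m.%d")

print(index_name(datetime(2016, 5, 2)))  # date-wise2016.05.02
```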
and this is how my Logstash log looks:
cat /var/log/logstash/logstash.log
{:timestamp=>"2016-05-02T10:25:51.641000+0530", :message=>"SIGTERM received. Shutting down the agent.", :level=>:warn}
{:timestamp=>"2016-05-02T10:25:51.643000+0530", :message=>"stopping pipeline", :id=>"main"}
{:timestamp=>"2016-05-02T10:25:52.534000+0530", :message=>"Pipeline main has been shutdown"}
{:timestamp=>"2016-05-02T10:26:10.156000+0530", :message=>"Pipeline main started"}
I'm not sure where the problem is, but I'm wondering whether not specifying
start_position => "beginning" could be the issue.
Thanks