Hi,
I'm having a peculiar problem where Logstash is not importing data. I'm running Logstash in the following mode:

sudo /opt/logstash/bin/logstash agent --debug -f conf.d/
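As a first sanity check, the same binary can validate the pipeline config before running it, with something like:

$ sudo /opt/logstash/bin/logstash agent --configtest -f conf.d/

which should report that the configuration is OK if all the files parse.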
It discovers the files fine:
each: file grew: /tmp/callhome/flattend1.json: old size 0, new size 1299 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"235", :method=>"each"}
each: file grew: /tmp/callhome/flattend_2016-03-11_07-18-04-550.json: old size 0, new size 1897 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"235", :method=>"each"}
each: file grew: /tmp/callhome/flattend_2016-03-11_07-27-19-533.json: old size 0, new size 1896 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"235", :method=>"each"}
each: file grew: /tmp/callhome/flattend_2016-03-11_07-09-32-300.json: old size 0, new size 1886 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"235", :method=>"each"}
each: file grew: /tmp/callhome/flattend_2016-03-11_07-18-03-671.json: old size 0, new size 1897 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"235", :method=>"each"}
each: file grew: /tmp/callhome/flattend_2016-03-11_06-02-21-672.json: old size 0, new size 1897 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"235", :method=>"each"}
But other than the logs above, all I see is:
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x47626b24 @operations_mutex=#, @max_size=500, @operations_lock=#, @submit_proc=#, @logger=#<Cabin::Channel:0x395af884 @metrics=#<Cabin::Metrics:0x24dfc75b @metrics_lock=#, @metrics={}, @channel=#>, @subscriber_lock=#, @level=:debug, @subscribers={12858=>#<Cabin::Outputs::IO:0x3fdb8afd @io=#, @lock=#>}, @data={}>, @last_flush=2016-04-05 01:08:43 -0700, @flush_interval=1, @stopping=#, @buffer=[], @flush_thread=#>", :interval=>1, :level=>:debug, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x20b66579 @operations_mutex=#, @max_size=500, @operations_lock=#, @submit_proc=#, @logger=#<Cabin::Channel:0x395af884 @metrics=#<Cabin::Metrics:0x24dfc75b @metrics_lock=#, @metrics={}, @channel=#>, @subscriber_lock=#, @level=:debug, @subscribers={12858=>#<Cabin::Outputs::IO:0x3fdb8afd @io=#, @lock=#>}, @data={}>, @last_flush=2016-04-05 01:08:43 -0700, @flush_interval=1, @stopping=#, @buffer=[], @flush_thread=#>", :interval=>1, :level=>:debug, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
I have confirmed Elasticsearch is up and running, and I have been able to import historical data for the types syslog, glog, and node_json_file (although node_json_file seems finicky as well).
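For reference, the index side can also be checked directly (assuming the logcollect-%{+YYYY.MM.dd} index naming from my output config below; the second query counts only the callhome documents):

$ curl -s 'http://127.0.0.1:9200/_cat/indices/logcollect-*?v'
$ curl -s 'http://127.0.0.1:9200/logcollect-*/_count?q=type:callhome_notification_json'

A zero count there while syslog/glog counts grow would confirm the callhome events never reach Elasticsearch, rather than being indexed somewhere unexpected.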
I have tried removing the sincedb files, restarting Logstash, and restarting the VM running the service, with no luck.
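For testing, instead of deleting the sincedb files by hand, the callhome input can also be pointed at /dev/null so it never remembers its read position. A throwaway variant of that input (not what is in my conf.d):

input {
  file {
    path => "/tmp/callhome/flattend*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # testing only: forget read position between runs
    codec => json
    type => "callhome_notification_json"
  }
}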
My config looks like this:
$ cat 02-beats-input.conf
input {
  file {
    ignore_older => 0
    exclude => "/tmp/var/log/*.gz"
    path => "/tmp/var/log/messages-*"
    path => "/tmp/var/log/messages_latest"
    path => "/tmp/var/log/messages"
    type => "syslog"
    start_position => "beginning"
  }
}
input {
  file {
    ignore_older => 0
    exclude => "/tmp/var/log/datera/*.gz"
    path => "/tmp/var/log/datera/*"
    type => "glog"
    start_position => "beginning"
  }
}
input {
  file {
    ignore_older => 0
    start_position => "beginning"
    type => "node_json_file"
    codec => json
    path => "/tmp/var/*.json"
  }
}
input {
  file {
    ignore_older => 0
    start_position => "beginning"
    type => "callhome_notification_json"
    codec => json
    path => "/tmp/callhome/flattend*.json"
  }
}
$ cat 13-callhome-notification-filter.conf
filter {
  if [type] == "callhome_notification_json" {
    date {
      match => [ "notification_info_fault_timeStamp", "UNIX" ]
    }
  }
}
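One thing I am unsure of: if notification_info_fault_timeStamp holds epoch milliseconds rather than seconds, the date filter would need the UNIX_MS pattern. The filter accepts multiple patterns, so a more tolerant variant would be:

filter {
  if [type] == "callhome_notification_json" {
    date {
      # try epoch seconds first, then epoch milliseconds
      match => [ "notification_info_fault_timeStamp", "UNIX", "UNIX_MS" ]
    }
  }
}

(A failed date parse would only tag the event with _dateparsefailure and it should still be indexed, so this alone would not explain the missing documents.)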
$ cat 30-elasticsearch-output.conf
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logcollect-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
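A further isolation step would be to bypass the file input entirely and feed one of the callhome files through stdin with the same codec (an ad-hoc test, run outside conf.d):

$ cat /tmp/callhome/flattend1.json | sudo /opt/logstash/bin/logstash -e '
    input { stdin { codec => json type => "callhome_notification_json" } }
    output { stdout { codec => rubydebug } }'

If events print here but not via the file input, the problem is in the file input settings (e.g. its file-age or sincedb handling) rather than in the JSON codec or the filter.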
Would appreciate any help!
Thanks!