I just set up Elasticsearch, Kibana, and Logstash, and I am trying to load historical Apache logs into Elasticsearch via Logstash. I have the Logstash configuration below; however, it seems to stop after printing the messages shown below. Elasticsearch itself is working, since I can PUT, GET, and DELETE data from Sense.
My questions are:

(1) How can I confirm that Logstash is loading data into Elasticsearch? (There is no apparent entry in the Elasticsearch log right now.)
(2) What is wrong with my configuration that could be preventing the data from loading?
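For (1), this is how I have been trying to check so far (assuming Elasticsearch is on the default `localhost:9200` and Logstash writes to the default `logstash-*` indices):

```shell
# List all indices; a logstash-YYYY.MM.dd index should appear once events are flushed
curl 'localhost:9200/_cat/indices?v'

# Count documents in the Logstash indices; returns "count" : 0 if nothing was loaded
curl 'localhost:9200/logstash-*/_count?pretty'
```

Both commands come back empty / zero for me, which is why I suspect nothing is being indexed.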
Last messages in the Logstash log:

```
{:timestamp=>"2016-08-09T18:53:08.436000+0900", :message=>"Grok compile", :field=>"message", :patterns=>[], :level=>:info}
{:timestamp=>"2016-08-09T18:53:08.627000+0900", :message=>"Starting pipeline", :id=>"main", :pipeline_workers=>16, :batch_size=>125, :batch_delay=>5, :max_inflight=>8000, :level=>:info}
{:timestamp=>"2016-08-09T18:53:08.648000+0900", :message=>"Pipeline main started"}
```

Startup command:

```
logstash hisaotsu$ bin/logstash -f ./ls-apache.conf --verbose --log ls.log -w 16
Sending logstash logs to ls.log.
```

Configuration file (`ls-apache.conf`):

```
input {
  file {
    path => "/mydir/es/log_dropbox/access_log_*"
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "/mydir/mydir/es/"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
    break_on_match => false
    tag_on_failure => ["_message_parse_failure"]
  }
  date {
    match => ["timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
    locale => en
  }
  geoip {
    source => ["clientip"]
  }
  grok {
    match => { "request" => "^/%{WORD:first_path}/%{GREEDYDATA}$" }
    tag_on_failure => ["_request_parse_failure"]
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}
output {
  elasticsearch {}
  stdout { codec => rubydebug }
}
```
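For reference, I also tried spelling out the `elasticsearch` output explicitly instead of leaving it empty, in case the defaults were the problem (the host and index pattern below are assumptions, not values I know to be required for my setup):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]          # assumption: Elasticsearch on the default port
    index => "logstash-%{+YYYY.MM.dd}"   # the default index pattern, stated explicitly
  }
  stdout { codec => rubydebug }
}
```

The behavior was the same, so I am not sure whether the issue is in the output block or the input block.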