Hi Team,
I'm having an issue where I'm shipping data from Filebeat to Logstash,
but no index is being created in Elasticsearch.
Here is my Logstash config:
input {
  beats {
    port => "5044"    # receive events from Filebeat
  }
}

filter {
  grok {
    # one pattern per field; grok uses the first pattern that matches (break_on_match defaults to true)
    match => {
      "message" => '%{IP:clientip},\[(?<timestamp>[\w\d/:]+),%{WORD:method}\s%{URIPATHPARAM:request}?%{WORD:type}=(?<value>[\d.]+)\s%{WORD:Protocol}/%{NUMBER:Decimal},%{NUMBER:Response_time}'
      "msg"     => '%{IP:clientip},\[(?<timestamp>[\w\d/:]+),%{WORD:method}\s%{URIPATHPARAM:request}\s%{WORD:type}/%{NUMBER:decimal},%{NUMBER:Response_time}'
      "mssg"    => '%{IP:clientip},\[(?<timestamp>[\w\d/:]+),%{WORD:method}\s%{URIPATHPARAM:request}?%{WORD:null}=%{WORD:User}\s%{WORD:type}/%{NUMBER:decimal},%{NUMBER:Response_time}'
    }
  }
}

output {
  file {
    path => "/var/log/logstash/logstest.txt"
  }
  elasticsearch {
    index => "test"
    hosts => ["192.168.0.102:9200"]
  }
}
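To check whether events are actually reaching the outputs, I was thinking of temporarily adding a stdout output next to the existing ones and watching the console. This is just a debugging idea, not part of my real config:

output {
  # temporary: print every event to the console so I can see whether anything arrives at the output stage
  stdout { codec => rubydebug }

  file {
    path => "/var/log/logstash/logstest.txt"
  }
  elasticsearch {
    index => "test"
    hosts => ["192.168.0.102:9200"]
  }
}

If events show up on stdout but still not in Elasticsearch, at least I would know the input and filter stages are fine.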
Still, the index is not being created in Elasticsearch.
I have created indices from the same dataset before (under a different name), but at that time I was not using a grok filter.
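Since the grok filter is the only thing that changed, I was also wondering whether events are failing to parse. My understanding is that grok tags such events with _grokparsefailure, so a conditional output like the sketch below could show me the failures (the file path here is just an example I made up):

output {
  # send only events that failed grok parsing to a separate file
  if "_grokparsefailure" in [tags] {
    file {
      path => "/var/log/logstash/grok_failures.txt"   # example path, not in my real config
    }
  }
}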
Also my ES status is yellow.
If there were a configuration error in Logstash it would have shown up in the logs, and my Logstash logs look fine:
[2020-02-05T12:23:56,070][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.5.0"}
[2020-02-05T12:24:03,544][INFO ][org.reflections.Reflections] Reflections took 518 ms to scan 1 urls, producing 20 keys and 40 values
[2020-02-05T12:24:06,737][INFO ][logstash.outputs.elasticsearch][grok] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.0.102:9200/]}}
[2020-02-05T12:24:07,609][WARN ][logstash.outputs.elasticsearch][grok] Restored connection to ES instance {:url=>"http://192.168.0.102:9200/"}
[2020-02-05T12:24:08,047][INFO ][logstash.outputs.elasticsearch][grok] ES Output version determined {:es_version=>7}
[2020-02-05T12:24:08,061][WARN ][logstash.outputs.elasticsearch][grok] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-02-05T12:24:08,313][INFO ][logstash.outputs.elasticsearch][grok] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.0.102:9200"]}
[2020-02-05T12:24:08,668][INFO ][logstash.outputs.elasticsearch][grok] Using default mapping template
[2020-02-05T12:24:09,066][INFO ][logstash.outputs.elasticsearch][grok] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-02-05T12:24:11,091][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][grok] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-02-05T12:24:11,121][INFO ][logstash.javapipeline ][grok] Starting pipeline {:pipeline_id=>"grok", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/etc/logstash/conf.d/grok.conf"], :thread=>"#<Thread:0x81705fa run>"}
[2020-02-05T12:24:13,579][INFO ][logstash.inputs.beats ][grok] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-02-05T12:24:13,784][INFO ][logstash.javapipeline ][grok] Pipeline started {"pipeline.id"=>"grok"}
[2020-02-05T12:24:14,585][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:grok], :non_running_pipelines=>[]}
[2020-02-05T12:24:15,001][INFO ][org.logstash.beats.Server][grok] Starting server on port: 5044
[2020-02-05T12:24:17,828][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Any possible help would be great; I don't know why it has suddenly stopped creating indices.
Thanks and Regards,
Sagar