Logstash not creating indices in ES, but everything looks fine

Hi Team,

I'm having an issue where Filebeat is shipping data to Logstash,
but no indices are being created in ES.

Here is my Logstash config:

input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => {
      "message" => '%{IP:clientip},[(?[\w\d/:]+),%{WORD:method}\s%{URIPATHPARAM:request}?%{WORD:type}=(?[\d.]+)\s%{WORD:Protocol}/%{NUMBER:Decimal},%{NUMBER:Response_time}'
    }
    match => {
      "msg" => '%{IP:clientip},[(?[\w\d/:]+),%{WORD:method}\s%{URIPATHPARAM:request}\s%{WORD:type}/%{NUMBER:decimal},%{NUMBER:Response_time}'
    }
    match => {
      "mssg" => '%{IP:clientip},[(?[\w\d/:]+),%{WORD:method}\s%{URIPATHPARAM:request}?%{WORD:null}=%{WORD:User}\s%{WORD:type}/%{NUMBER:decimal},%{NUMBER:Response_time}'
    }
  }
}
output {
  file {
    path => "/var/log/logstash/logstest.txt"
  }
  elasticsearch {
    index => "test"
    hosts => ["192.168.0.102:9200"]
  }
}
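To rule out a parse problem in the pipeline file itself, Logstash can validate the config without starting. A minimal sketch, assuming a DEB/RPM package install with the binary under /usr/share/logstash (adjust the paths if your install differs):

```shell
# Validate the pipeline configuration and exit; no data is processed.
# Paths assume a package install of Logstash - adjust if yours differs.
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit
```

If this prints "Configuration OK", the problem is upstream (no events arriving) rather than in the config syntax.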


I've created indices with this same dataset before, under a different name, but I wasn't using a grok filter then.

Also, my ES cluster status is yellow.

If I had a configuration error in Logstash, it would have shown up in the logs,
but my Logstash logs look fine:

[2020-02-05T12:23:56,070][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.5.0"}
[2020-02-05T12:24:03,544][INFO ][org.reflections.Reflections] Reflections took 518 ms to scan 1 urls, producing 20 keys and 40 values 
[2020-02-05T12:24:06,737][INFO ][logstash.outputs.elasticsearch][grok] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.0.102:9200/]}}
[2020-02-05T12:24:07,609][WARN ][logstash.outputs.elasticsearch][grok] Restored connection to ES instance {:url=>"http://192.168.0.102:9200/"}
[2020-02-05T12:24:08,047][INFO ][logstash.outputs.elasticsearch][grok] ES Output version determined {:es_version=>7}
[2020-02-05T12:24:08,061][WARN ][logstash.outputs.elasticsearch][grok] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-02-05T12:24:08,313][INFO ][logstash.outputs.elasticsearch][grok] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.0.102:9200"]}
[2020-02-05T12:24:08,668][INFO ][logstash.outputs.elasticsearch][grok] Using default mapping template
[2020-02-05T12:24:09,066][INFO ][logstash.outputs.elasticsearch][grok] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-02-05T12:24:11,091][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][grok] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-02-05T12:24:11,121][INFO ][logstash.javapipeline    ][grok] Starting pipeline {:pipeline_id=>"grok", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/etc/logstash/conf.d/grok.conf"], :thread=>"#<Thread:0x81705fa run>"}
[2020-02-05T12:24:13,579][INFO ][logstash.inputs.beats    ][grok] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-02-05T12:24:13,784][INFO ][logstash.javapipeline    ][grok] Pipeline started {"pipeline.id"=>"grok"}
[2020-02-05T12:24:14,585][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:grok], :non_running_pipelines=>[]}
[2020-02-05T12:24:15,001][INFO ][org.logstash.beats.Server][grok] Starting server on port: 5044
[2020-02-05T12:24:17,828][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
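The startup logs above only show the pipeline coming up; they don't confirm that events are actually flowing. A quick way to check, assuming the Logstash monitoring API on its default port 9600 and ES at 192.168.0.102:9200 as in the config above:

```shell
# Check whether any events have passed through the pipeline
# (the "events" counters stay at 0 if Filebeat is sending nothing).
curl -s 'http://localhost:9600/_node/stats/pipelines?pretty'

# Check whether the "test" index was created at all.
curl -s 'http://192.168.0.102:9200/_cat/indices?v'

# Cluster health: on a single-node cluster, yellow usually just means
# unassigned replica shards, not a blocked index.
curl -s 'http://192.168.0.102:9200/_cluster/health?pretty'
```

If the pipeline's event counters stay at zero, the problem is on the Filebeat side, not in Logstash or ES.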

Any possible help would be greatly appreciated; I don't know why it suddenly stopped creating indices.

Thanks and Regards,
Sagar

Hi @Sagar_Mandal - From what you've shared so far, it's unclear whether the Filebeat data is actually reaching your Logstash instance. Did you check Filebeat for any errors?

If you look at the attached image of the ES indices, you'll see the weblog indices at the end; earlier I was able to fetch data from Filebeat.

I have since deleted the Filebeat logs. When I restart Filebeat, it should create its logs again, but it isn't creating any.

Earlier, the Filebeat logs weren't showing me any errors.

I really don't know why it isn't creating any indices.

@Sagar_Mandal

  1. This could be because of the state stored in the registry. If you want Filebeat to (re)process the files, you may want to delete the registry files and restart Filebeat.

  2. If this does not work, we will need to look into your Filebeat configuration files as well as the debug logs.
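For step 1, a minimal sketch of clearing the registry, assuming a default DEB/RPM install where Filebeat 7.x keeps its registry under /var/lib/filebeat (the path may differ on your system):

```shell
# Stop Filebeat so it does not rewrite the registry while we delete it.
sudo systemctl stop filebeat

# Remove the stored file offsets; Filebeat will re-read its inputs
# from the beginning on the next start. (Path is the default for
# DEB/RPM installs of Filebeat 7.x - adjust if yours differs.)
sudo rm -rf /var/lib/filebeat/registry

# Start Filebeat again and watch its log for harvester activity.
sudo systemctl start filebeat
sudo journalctl -u filebeat -f
```

Note that this causes every configured input file to be shipped again from offset zero, so expect duplicate events for data that was already indexed.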

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.