Elasticsearch stops recording data after reaching a certain count

Data is added to Elasticsearch up to a certain count, and then nothing is processed after that point.

I suspect you will need to provide more details in order for anyone to be able to help you with this.

I'm launching Logstash from Java to write data into Elasticsearch.
This is my conf file:

input {
  file {
    path => "D:\logstash\logstash-6.1.0\bin\repoprolog.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "%{DATE}_%{TIME} - "
      negate => true
      what => "previous"
      auto_flush_interval => 1
    }
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}

filter {
  grok {
    match => { "message" => "%{DATE}_%{TIME} - %{POSINT:value} %{DATA:status} %{WORD:result} %{GREEDYDATA:final}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}

This is my Logstash log:

[2018-03-14T13:10:58,562][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/logstash/logstash-6.1.0/modules/fb_apache/configuration"}
[2018-03-14T13:10:58,577][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/logstash/logstash-6.1.0/modules/netflow/configuration"}
[2018-03-14T13:10:58,749][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-14T13:10:59,171][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.0"}
[2018-03-14T13:10:59,619][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-14T13:11:03,103][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-03-14T13:11:03,103][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-03-14T13:11:03,259][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-03-14T13:11:03,306][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-03-14T13:11:03,322][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-14T13:11:03,322][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"*", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"*", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"*", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"*", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"*", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"*", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"*", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"*", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, "latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", "doc_values"=>true}}}}}}}}
[2018-03-14T13:11:03,353][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-03-14T13:11:03,447][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x5329b4bf run>"}

The pipeline is not started, I guess, and yet some data is stored in Elasticsearch. In my case the log file contains 1096 records, but only 16 of them end up in Elasticsearch, as the output of GET _cat/indices?v shows:

health status index               pri rep docs.count docs.deleted store.size pri.store.size
yellow open   logstash-2018.03.14   1   1         16            0     33.4kb         33.4kb

Please help.

I believe this path should be set to "NUL" on Windows.
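For example, a sketch of just the relevant change (the other options stay as they are):

file {
  path => "D:\logstash\logstash-6.1.0\bin\repoprolog.log"
  start_position => "beginning"
  # codec and the remaining options as before
  sincedb_path => "NUL"   # NUL is the Windows null device; /dev/null only exists on Unix-like systems
}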

I've changed it, but that didn't solve my problem.

Might this setting make Logstash ignore files? What happens if you remove this or set it to a larger value?
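For instance, something like this (a sketch; pick a value comfortably larger than the age of your log file):

file {
  path => "D:\logstash\logstash-6.1.0\bin\repoprolog.log"
  start_position => "beginning"
  sincedb_path => "NUL"
  # ignore_older is given in seconds; 86400 = 1 day, so files modified
  # within the last day are still picked up
  ignore_older => 86400
}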

I tried removing it, but that didn't solve the problem.
I also tried setting ignore_older => 10, but then I didn't get any data into Elasticsearch at all.

What is the age of the files you are trying to read? Would 10 seconds be an appropriate value to catch them?

The problem is with Logstash, I guess. As you can see in the Logstash log above, it stops at the "Starting pipeline" step, i.e. the pipeline is never reported as started.

The next step only continues after a delay of something like 5 minutes. I don't think the issue is with the sincedb.

Are there any errors reported in the Elasticsearch logs?

No errors in Elasticsearch.
I tried writing the data into a file instead of Elasticsearch; there, too, only partial data was written.
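For reference, that test replaced the elasticsearch output with a file output along these lines (a sketch; the output path here is made up):

output {
  file {
    # hypothetical destination for the test run
    path => "D:/logstash/out/events.log"
    codec => json_lines   # one JSON event per line, easy to count
  }
}

Since the same truncation appears in a plain file, I think the bottleneck is on the Logstash side rather than in Elasticsearch.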

Enable debug logging in Logstash and see if that provides any additional information.
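For example, you can set this in config/logstash.yml (or pass --log.level=debug on the command line):

log.level: debug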

I tried it with debug logging enabled, but that didn't help much.

Should I make any changes to the jvm.options file, considering that I'm only getting this issue when I launch Logstash from Java?
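For reference, the heap settings in Logstash's config/jvm.options default to:

# initial and maximum JVM heap (Logstash 6.x defaults)
-Xms1g
-Xmx1g

Or could the problem be in how I spawn the process itself? I've read that if the parent Java process never reads the child's stdout, the child can block once the OS pipe buffer fills up, and my config writes every event to stdout via the rubydebug codec.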
