Memory issue with logstash?


(sampasei) #1

Hi all,
I've configured Logstash to read some log files (these files are never updated because they are made only for tests) and write the record data to Elasticsearch and MongoDB.
Below is the conf file:
input {
  file {
    path => "/home/test/logs/**/aud*.log"
    type => "aud"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "aud" {
    csv {
      separator => " "
      columns => ["ts", "aud_data", "aud_app"]
    }
  }
}
output {
  stdout { }
  elasticsearch {
    protocol => "http"
    cluster => "audc01"
  }
  mongodb {
    collection => "%{type}"
    database => "adsData"
    uri => "mongodb://127.0.0.1/"
  }
}
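[Editor's note, not from the original thread: if the Logstash process itself is exhausting its heap, Logstash 1.x reads the LS_HEAP_SIZE environment variable in its stock startup scripts, so raising it is a cheap first experiment. A minimal sketch; the config path is a placeholder.]

```shell
# Raise the Logstash 1.x JVM heap via LS_HEAP_SIZE before launching.
# The path to the config file below is an assumption for illustration.
export LS_HEAP_SIZE=1g
echo "LS_HEAP_SIZE=$LS_HEAP_SIZE"    # sanity check before launching
# bin/logstash -f /path/to/aud.conf
```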
MongoDB and Elasticsearch crash after a short time. If the mongodb output is removed, I get these debug messages:

Shifting current elasticsearch client {:level=>:debug, :file=>"logstash/outputs/elasticsearch.rb", :line=>"572", :method=>"flush"}
Switched current elasticsearch client to #0 at localhost {:level=>:debug, :file=>"logstash/outputs/elasticsearch.rb", :line=>"623", :method=>"shift_client"}

After that, Elasticsearch seems blocked.
I'm using Logstash 1.5.4 and OpenJDK 8.
How can I avoid this problem? Any suggestions?
Thanks


(Mark Walkom) #2

How do you know that?

And what do you mean by this?


(system) #3