How to increase max message size?

I am using the logstash.conf below, with all default settings. The maximum size of HTTP requests that are processed correctly is around 130 KB. If the request is larger than that, Logstash neither writes anything to stdout nor returns an HTTP response.

How can I increase the message size that can be processed?

input {
  http {
    port => 5544
    codec => "json"
  }
}
filter {
  ruby {
    code => "event['@metadata']['computed_id'] = event['[LogMessage][Header][MessageId]']"
  }
  date {
    match => [ "[LogMessage][Header][CreatedDateTime]", "ISO8601" ]
  }
  mutate {
    remove_field => [ "host", "headers" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "<applog-{now/d}>"
    document_id => "%{[@metadata][computed_id]}"
  }
  stdout { codec => rubydebug }
}
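To make the threshold easier to pin down, here is a rough repro sketch: it generates JSON payloads on either side of the ~130 KB mark for POSTing to the http input on port 5544. The payload structure, field values, and file paths are my own placeholders, not taken from the real messages.

```shell
#!/bin/sh
# Sketch (assumed payload shape and paths): build JSON documents padded
# to roughly a target size, one below and one above the ~130 KB threshold.

make_payload() {
  size="$1"; out="$2"
  # Pad the Body field with $size 'A' characters so the document is
  # approximately $size bytes plus a small fixed wrapper.
  pad=$(head -c "$size" /dev/zero | tr '\0' 'A')
  printf '{"LogMessage":{"Header":{"MessageId":"test-1"},"Body":"%s"}}' "$pad" > "$out"
}

make_payload 120000 /tmp/small.json   # below the threshold
make_payload 200000 /tmp/big.json     # above the threshold
```

Each file can then be sent with something like `curl -s -H 'Content-Type: application/json' --data-binary @/tmp/big.json http://localhost:5544/` and the small one should echo to stdout while the big one should hang with no response.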

Can you try running with --debug to see what happens?

I sent a small message at 09:21:47 and see the entries below in the Logstash logs. I sent a big message at 09:23:04 and nothing appears in the logs except "Flushing buffer at interval".

{:timestamp=>"2016-03-02T09:21:47.740000-0600", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x48de003c @operations_mutex=#Mutex:0x4619e757, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::Re

{:timestamp=>"2016-03-02T09:21:47.795000-0600", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x225c556a @operations_mutex=#Mutex:0x2c9bbf51, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::Re

{:timestamp=>"2016-03-02T09:21:47.795000-0600", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0xc6c93c1 @operations_mutex=#Mutex:0x2ab6c3c, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::Reen

{:timestamp=>"2016-03-02T09:21:47.836000-0600", :message=>"filter received", :event=>{"LogMessage"=>{"Header"=>{"Source"=>"BPM", "CreatedDateTime"=>"2016-03-01T19:03:31.032Z", "MessageType"=>"Event", "MessageName"=>"CustomerCaseGraphDataService", "MessageVersio

{:timestamp=>"2016-03-02T09:21:47.841000-0600", :message=>"Date filter: received event", :type=>nil, :level=>:debug, :file=>"/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-filter-date-2.1.2/lib/logstash/filters/date.rb", :line=>"229", :method=>"filter"}

{:timestamp=>"2016-03-02T09:21:47.842000-0600", :message=>"Date filter looking for field", :type=>nil, :field=>"[LogMessage][Header][CreatedDateTime]", :level=>:debug, :file=>"/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-filter-date-2.1.2/lib/logstash/filters/date.rb", :line=>"232", :method=>"filter"}

{:timestamp=>"2016-03-01T19:03:31.032Z", :message=>"Date parsing done", :value=>"2016-03-01T19:03:31.032Z", :level=>:debug, :file=>"/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-filter-date-2.1.2/lib/logstash/filters/date.rb", :line=>"266", :method=>"filter"}

{:timestamp=>"2016-03-02T09:21:47.844000-0600", :message=>"filters/LogStash::Filters::Mutate: removing field", :field=>"host", :level=>:debug, :file=>"/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb", :line=>"175", :method=>"filter_matched"}

{:timestamp=>"2016-03-02T09:21:47.845000-0600", :message=>"filters/LogStash::Filters::Mutate: removing field", :field=>"headers", :level=>:debug, :file=>"/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/filters/base.rb", :line=>"175", :method=>"filter_matched"}

{:timestamp=>"2016-03-02T09:21:47.846000-0600", :message=>"output received", :event=>{"LogMessage"=>{"Header"=>{"Source"=>"BPM", "CreatedDateTime"=>"2016-03-01T19:03:31.032Z", "MessageType"=>"Event", "MessageName"=>"CustomerCaseGraphDataService", "MessageVersio

{:timestamp=>"2016-03-02T09:21:48.140000-0600", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x7f1e40ab @operations_mutex=#Mutex:0x71d9a138, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::Re

{:timestamp=>"2016-03-02T09:21:48.202000-0600", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x31332f @operations_mutex=#Mutex:0xef6dd59, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::Reent

{:timestamp=>"2016-03-02T09:21:48.278000-0600", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x4fde0221 @operations_mutex=#Mutex:0x4119e5aa, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::Re

Is that running with --debug?

Yes, it is. These entries only appear with --debug; note the :level=>:debug.

I figured this out. It is caused by a JRuby bug that fails to return the tmpdir. Somehow I never saw the exception in my DOS window.

The error was reported and resolved at this URL:

Replacing vendor/jruby with the release below solved the problem: https://s3.amazonaws.com/jruby.org/downloads/1.7.24/jruby-bin-1.7.24.zip
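For anyone else hitting this, the swap itself can be sketched roughly as below. This is not an official upgrade path, and the install path and unzipped directory name are assumptions; adjust them to your layout.

```shell
#!/bin/sh
# Sketch: back up the bundled JRuby and drop a standalone release in its
# place. Arguments: Logstash install dir, unzipped JRuby release dir.
set -e

replace_vendored_jruby() {
  ls_home="$1"      # Logstash install directory (contains vendor/jruby)
  new_jruby="$2"    # unzipped JRuby release directory

  mv "$ls_home/vendor/jruby" "$ls_home/vendor/jruby.bak"  # keep a backup
  cp -R "$new_jruby" "$ls_home/vendor/jruby"
}

# Typical use (paths assumed, not verified):
#   curl -LO https://s3.amazonaws.com/jruby.org/downloads/1.7.24/jruby-bin-1.7.24.zip
#   unzip jruby-bin-1.7.24.zip
#   replace_vendored_jruby /opt/logstash-2.2.2 ./jruby-1.7.24
```

Keeping the vendor/jruby.bak backup makes it easy to roll back if the replacement release misbehaves.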