Increase Size Limit of Logs

Hey there,

Is it possible to increase the size limit of incoming logs to Logstash? I am using Filebeat, but Logstash will break up large logs, which then makes the grok filter unable to parse them. I am trying to log some big XML response bodies, so being able to increase the limit for Logstash would be great! Thanks!

Hello @AlecBruns,

By default, Logstash should not break anything that was sent to it by Filebeat, but Filebeat will send one event per line of your log. Is your XML document made up of multiple lines?

Can you add a log sample and your Filebeat configuration to this thread?

Hi @pierhugues, I cannot share a log since it is sensitive info, but the log is on one line. Filebeat is picking up the log that is being made by nginx, and as far as I know nginx doesn't log to multiple lines. The log format for nginx is:

log_format main '$remote_addr $status $request_time $upstream_header_time $upstream_response_time'
    ' $request  $body_bytes_sent '
    '$http_user_agent  $soap_action  $request_body';

This is my filebeat conf:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/nginx/*.log
  document_type: nginx-access
output:
  logstash:
    enabled: true
    hosts:

Thank you

I cannot find anything that would make Logstash break up a large log; maybe you are seeing a newline in the SOAP XML object?

Instead of using grok to parse the content, did you try the logstash-filter-xml? You can use XPath to extract the data from the XML.
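
Something along these lines might work as a starting point — a minimal sketch, assuming the event (or a field you carve out of it) contains just the XML; the XPath expression and target field are only placeholders:

filter {
  xml {
    # assuming "message" (or a field extracted from it) holds just the XML document
    source => "message"
    store_xml => false
    remove_namespaces => true
    # placeholder XPath and target field; adjust to the real SOAP structure
    xpath => { "/Envelope/Body//Status/text()" => "soap_status" }
  }
}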

Hmm, I don't believe it is being split into multiple lines, but I can investigate further.

However, the logs have extra metadata from nginx, so the xml filter can't be used on its own to parse all the data.
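
If the event did arrive intact, I imagine it would have to be a combination: grok out the nginx fields first and hand only the request body to the xml filter. A rough sketch of that idea, where the pattern and field names are just guesses based on the log_format above:

filter {
  grok {
    # placeholder pattern: loosely capture the leading nginx fields, then grab
    # everything from the first "<" onward as the request body
    match => { "message" => "%{IPORHOST:remote_addr} %{NUMBER:status} %{DATA:nginx_meta}(?<request_body><.*)" }
  }
  xml {
    # parse only the extracted body
    source => "request_body"
    store_xml => false
    remove_namespaces => true
    # placeholder XPath; adjust to the actual SOAP structure
    xpath => { "/Envelope/Body//Status/text()" => "soap_status" }
  }
}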

Thank you for all the help thus far, though.

Maybe writing the events to a file using the file output will give us some idea of what is going on here.
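
A minimal sketch of that, with a placeholder path — just dump the raw events so we can see whether they arrive intact:

output {
  file {
    # placeholder path; writes each event in rubydebug form for inspection
    path => "/tmp/logstash-debug.log"
    codec => rubydebug
  }
}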
