Logstash UDP input: ingesting large payloads

Hi. I have a simple Logstash pipeline:

input {
	udp {
		port => 50001
		codec => "json"
	}
}

output {
	elasticsearch {
		# elasticsearch settings
	}
}

I noticed that larger payloads are not ingested. Is there a way to fix this? I have not been able to pin down the exact size limit, but once a message reaches a couple of KB, Logstash logs that it was not ingested.

I would like to send messages that are a couple of MB in size. Would that be feasible?

There is a buffer limit: the udp input reads each datagram into a fixed-size buffer controlled by its `buffer_size` setting, and anything beyond it is truncated (which then breaks the JSON codec). On top of that, the UDP protocol itself caps a single datagram at 65507 bytes of payload, so messages that are a couple of MB in size are not feasible over UDP no matter how the input is configured; for payloads that large, consider the tcp input (or the http input) instead.
To see what is actually arriving, add a rubydebug stdout output alongside Elasticsearch:
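A sketch of raising the input buffer, assuming a recent Logstash version (the `buffer_size` default has varied across versions, and 65536 bytes is as large as a UDP datagram can get anyway):

```
input {
	udp {
		port => 50001
		codec => "json"
		# read up to 64 KB per datagram; payloads larger than this are truncated
		buffer_size => 65536
	}
}
```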

output {
	elasticsearch {
		# elasticsearch settings
	}
	stdout { codec => rubydebug }
}
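To see the protocol-level ceiling for yourself, here is a small Python sketch that sends datagrams to the pipeline's port (50001, from the config above; the host is assumed to be localhost). A payload under 65507 bytes is accepted by the kernel, while a 70000-byte one is rejected before it ever reaches Logstash:

```python
import socket

# UDP payloads are capped at 65507 bytes (65535 minus IP and UDP headers),
# so a multi-MB message cannot fit in a single datagram regardless of
# Logstash's buffer_size setting.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
dest = ("127.0.0.1", 50001)  # port from the pipeline above

# Under the limit: the kernel accepts and sends the datagram.
sock.sendto(b"x" * 60000, dest)

# Over the limit: the kernel refuses with EMSGSIZE ("Message too long").
try:
    sock.sendto(b"x" * 70000, dest)
except OSError as e:
    print(f"70000-byte datagram rejected: {e}")

sock.close()
```

This is why splitting the data at the sender, or switching to a stream-based input such as tcp, is the usual answer for MB-sized events.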