Delimiter error when using Logstash File Input Plugin. Files not ingested fully

I'm using the Logstash File Input Plugin to ingest a file, but I am receiving this error:

[[main]<file] readfile - buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the file_chunk_size setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>2661945, "last_known_file_size"=>2661945, "file_path"=>"/Desktop/test.json"}

I'm not sure whether it is caused by the above error, but Logstash is unable to ingest all of the events in the file. I have tried many variations of the config, but to no avail. Below is my config:

input {
	file {
		path => [ "/Desktop/test.json" ]
		mode => "read"
		codec => "json"
		file_completed_action => "log"
		file_completed_log_path => "/Desktop/logstash_completed_files.txt"
		sincedb_path => "/dev/null"
		file_chunk_size => 104857600
		file_chunk_count => 100
	}
}
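
For reference, the plugin's delimiter setting defaults to "\n". One variation I tried sets it explicitly (it matches the default, so I didn't expect a change, but I wanted to rule it out):

	input {
		file {
			path => [ "/Desktop/test.json" ]
			mode => "read"
			codec => "json"
			delimiter => "\n"  # explicit, same as the plugin default
			file_completed_action => "log"
			file_completed_log_path => "/Desktop/logstash_completed_files.txt"
			sincedb_path => "/dev/null"
			file_chunk_size => 104857600
			file_chunk_count => 100
		}
	}

This made no difference in my tests.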

Is anyone able to explain what the delimiter error means and how to fix it? Thanks!
