Hi all,
I have some JSON files in a folder that I want to index into Elasticsearch using Logstash.
Some of the files are indexed, but I also get a lot of this error and I don't know why:
[2022-06-30T16:15:24,966][INFO ][filewatch.readmode.handlers.readfile][myname][aa94017582b122d2c03fec893a6160b466b92da8364dfebc8ac26e9192ea44cb] buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the
file_chunk_size setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>898, "last_known_file_size"=>898, "file_path"=>"/etc/logstash/conf.d/jsonfiles/myname-6294979330da9e2be4578b18.json"}
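One thing I notice in the log line itself: the delimiter is "\n", read_position is 0, and bytes_read_count equals last_known_file_size, which suggests the whole file is a single line with no trailing newline. A small sketch of a check for that (assuming the files live in the path from my config; the function name is my own):

```python
# Flag JSON files whose last byte is not "\n" -- a file like that has no
# delimiter for the reader to find, which matches the buffer_extract message.
from pathlib import Path

def files_missing_newline(directory):
    missing = []
    for f in sorted(Path(directory).glob("*.json")):
        data = f.read_bytes()
        if data and not data.endswith(b"\n"):
            missing.append(f.name)
    return missing

print(files_missing_newline("/etc/logstash/conf.d/jsonfiles"))
```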
My Logstash configuration:
input {
  file {
    path => "/etc/logstash/conf.d/jsonfiles/*.json"
    mode => "read"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      negate => true
      what => "previous"
      pattern => '^\{'
      max_lines => 10000000
    }
    type => "json"
    file_completed_action => "log_and_delete"
    file_completed_log_path => "/etc/logstash/conf.d/jsonfiles/files.log"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug {
      metadata => false
    }
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "eagle_%{[Event][date]}"
    document_id => "%{[Event][uuid]}"
  }
}
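Since the output section interpolates %{[Event][date]} and %{[Event][uuid]}, a document missing either field would end up in an index whose name contains the literal sprintf string. A sketch of a pre-flight check I could run over the files (the function name is my own; it assumes each file holds one top-level JSON object):

```python
# Flag files that fail to parse or whose top-level "Event" object lacks
# the "date" or "uuid" fields used in the elasticsearch output section.
import json
from pathlib import Path

def files_missing_event_fields(directory):
    bad = []
    for f in sorted(Path(directory).glob("*.json")):
        try:
            doc = json.loads(f.read_text())
        except json.JSONDecodeError:
            bad.append(f.name)
            continue
        event = doc.get("Event", {}) if isinstance(doc, dict) else {}
        if "date" not in event or "uuid" not in event:
            bad.append(f.name)
    return bad

print(files_missing_event_fields("/etc/logstash/conf.d/jsonfiles"))
```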
Also, some files are indexed but never deleted from the directory, even though file_completed_action is set to "log_and_delete".
Can anyone help? Thanks.