hello!!
I'm reading logs with the file input plugin. When reading older logs, Logstash performs very well, but while reading live logs it is missing around 2% of them.
The files arrive via FTP on a Solaris 10 server, and Logstash runs on the same server. I'm using an ingest pipeline to process the files.
Logstash config:
input {
  file {
    path => "/xpool/home/user/DUMP/*file1"
    sincedb_path => "/xpool/logstash-6.1.1/logs/sinceDB/sinceDB_file1_ext"
    id => "file1_input"
    add_field => { "request_type" => "file1" }
    exclude => ["*.gz"]
    max_open_files => 36000
    start_position => "beginning"
  }
  file {
    path => "/xpool/home/user/DUMP/*file2"
    sincedb_path => "/xpool/logstash-6.1.1/logs/sinceDB/sinceDB_file2_ext"
    id => "file2_input"
    add_field => { "request_type" => "file2" }
    exclude => ["*.gz"]
    max_open_files => 36000
    start_position => "beginning"
  }
}
output {
  if [request_type] == "file1" {
    elasticsearch {
      id => "output_file1"
      hosts => "http://10.10.23.180:9200"
      pipeline => "provisioning_file1"
    }
  } else if [request_type] == "file2" {
    elasticsearch {
      id => "output_file2"
      hosts => "http://10.10.23.180:9200"
      pipeline => "provisioning_file2"
    }
  }
}
There are around 10M records for file1 and 30M records for file2. Also, Logstash sometimes starts reading a file from the middle of a line.
Example:
Original message:
4,10000000,10000000,1777369722,6.120000,08:59:29,air82,Bus,somenode,20180306 08:59:29,,,,20180306,-1.700000,,53,request_data,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0,OUTPUTCDR_4007_4532_20180306-085922.file1,,,,,,,,,,,,,,,,,,,,,36529660980845502080,
Logstash message field:
"message": "_4532_20180306-085922.file1,,,,,,,,,,,,,,,,,,,,,36529660980845502080,"