Logstash output not working when using logstash-input-s3 plugin

A configuration with a file input from the local file system writes to the file and elasticsearch outputs as expected. When using the logstash-input-s3 plugin (input { s3 { ... } }), the elasticsearch and file outputs only write the output from one of the files in the S3 bucket. I have run the configuration in --debug mode and can see that all the files in the S3 bucket with the given prefix are in fact parsed and processed, but I only get one line in the debug trace indicating "received output", and only one line much later in the trace showing the output event being written to file (the same happens with the elasticsearch config).

It appears that whatever triggers the event write is not being called; I am not sure why the s3 plugin would have anything to do with this. I have tried Logstash 5.2.1 and 5.4.x with logstash-input-s3 3.1.2 and 3.1.4.

Any good reason that would happen?

Thanks!

Please show your config.

input {
  s3 {
    region => "myRegion"
    bucket => "myBucket"
    prefix => "my prefix"
    codec => multiline {
      pattern => "mypattern"
      what => "previous"
      negate => "true"
      auto_flush_interval => 1
    }
    sincedb_path => "/dev/null"
  }
}
filter {
  xml {
    store_xml => false
    source => "message"
    remove_namespace => true
    force_array => false
    xpath => [....works]
    suppress_empty => true
  }
  mutate {
    remove_field => ["fld1"]
    replace => {"myfld" => ....this works}
  }
}
output {
  file {
    path => "/path/my file.txt"
  }
}

Input from a file with the same filters produces the expected output.
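For comparison, the working local-file variant mentioned above would look something like this. This is a minimal sketch, not the poster's actual config: the path is hypothetical, start_position and sincedb_path are assumptions to force a full re-read, the codec mirrors the s3 input, and the filter and output blocks are unchanged.

input {
  file {
    # hypothetical local path; substitute the actual directory/pattern
    path => "/path/to/local/*.xml"
    # assumptions: read files from the start and ignore any saved sincedb state
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # same multiline codec as the s3 input above
    codec => multiline {
      pattern => "mypattern"
      what => "previous"
      negate => "true"
      auto_flush_interval => 1
    }
  }
}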

Withdrawn... the S3 content was not the same as the local filesystem content, and the files were parsed as specified.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.