Logstash S3 Input Plugin Error

Hello All,

Hoping someone could help please.

I have a simple ELK stack running and am using the S3 input plugin to ingest from an endpoint with the following config:

input {
  s3 {
    bucket => "mybucket"
    endpoint => "https://<object_storage_namespace>.compat.objectstorage.<region>.oraclecloud.com"
    region => "uk-london-1"
    access_key_id => "************"
    secret_access_key => "*************"
    delete => false
    interval => 300 # seconds
    add_field => { "service" => "oci" }
    codec => "json"
  }
}

output {
  if [service] == "oci" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "logstash-oci-%{+YYYY.MM}"
    }
  }
}

I am not getting any errors in the Logstash logs, but I am seeing this in the Kibana Discover JSON document:

{
  "_index": "logstash-oci-2020.08",
  "_type": "_doc",
  "_id": "db-j-HMB8Q9kUzB1NjQQ",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-08-16T18:58:32.170Z",
    "message": "}\n",
    "service": "oci",
    "@version": "1",
    "tags": [
      "_jsonparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      ...
    ]
  },
  "sort": [
    ...
  ]
}

I am using the latest ELK and S3 input plugin versions.
At the endpoint storage, I have uploaded a file with JSON logs called samplelog.log, but for whatever reason the S3 plugin is not reading the log file correctly. As you can see in the message field, I only see "}\n" and am getting "_jsonparsefailure" in tags.

Any ideas? Thanks.

Okay, with further testing I have made some changes, as described below:

  1. Uploaded another file called example2.json to the endpoint storage with the contents:

    {
      "fruit": "Apple",
      "size": "Large",
      "color": "Red"
    }

  2. Restarted Logstash and am now seeing these errors in the Logstash logs:

[ERROR][logstash.codecs.json ][main][cdeec7eab53d0d27aac4410853599e0063c9bb4baa4f5d84ff5faf279c170855] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected close marker '}': expected ']' (for root starting at [Source: (String)"}"; line: 1, column: 0])
at [Source: (String)"}"; line: 1, column: 2]>, :data=>"}"}

Though the format of the file is JSON, the Logstash plugin still complains that it is not JSON. :frowning:

It looks like your files are pretty-printed JSON. An s3 input, like a file input, consumes the file one line at a time. So that example will be five events: '{',
'"fruit": "Apple",', '"size": "Large",', '"color": "Red"', and '}', none of which is valid JSON on its own. You may be able to combine the parts of a single JSON object using a multiline codec.
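As a sketch (the pattern assumes each object ends with a closing brace at the start of a line; adjust it to match your files), a multiline codec could assemble the pretty-printed lines into one event, which you would then parse with a json filter instead of the json codec:

```
input {
  s3 {
    bucket => "mybucket"
    # ... other s3 settings as before ...
    # Lines that do NOT start with "}" belong with the next line,
    # so the event is flushed when the closing brace arrives.
    codec => multiline {
      pattern => "^}"
      negate => true
      what => "next"
    }
  }
}

filter {
  # The multiline codec emits plain text, so parse the
  # reassembled object here.
  json {
    source => "message"
  }
}
```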


Ahh thanks Badger.

So in this case, if I put the JSON content on one line, the S3 plugin should read it and display it correctly in Kibana?

I would expect so, yes.
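For example, with the file contents collapsed to a single line, so that each line is one complete JSON object (NDJSON-style):

```
{"fruit": "Apple", "size": "Large", "color": "Red"}
```

Each line is then valid JSON on its own, and the json codec can parse it directly.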

Perfect that worked. Thank you!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.