Logstash S3 plugin to push JSON files to Elasticsearch

Use case

Push S3 JSON objects to Elasticsearch (the S3 bucket contains JSON files); each object should become one document.

I have written this pipeline:

input {
    s3 {
        access_key_id => "MY_KEY"
        secret_access_key => "MY_SECRET"
        bucket => "sthreetoes"
        region => "ap-south-1"
        codec => "json"
    }
}
output {
    elasticsearch {
        hosts => "http://elasticsearch:9200"
        index => "candidate-data"
    }
}

When I run the pipeline, Logstash ships to the index treating each key-value pair as a separate document.

JSON object in S3:

{
    "id": 246,
    "first_name": "Lell",
    "last_name": "Bsel",
    "email": "lbros6t@sakura.jp",
    "gender": "Female",
    "ip_address": "12.12.12.8"
}

When we run the Logstash pipeline, the index is created with 6 documents, one per key-value pair in the JSON.

Is there any solution to index the whole JSON object as a single document?

It sounds like you need a multiline codec on the input.

Yes, reading one line at a time will not pick up the whole JSON.
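A sketch of what the multiline suggestion above could look like, assuming each S3 file holds one pretty-printed JSON object whose first line starts with `{`. The `pattern`, `auto_flush_interval` value, and the `json` filter step are illustrative, not a tested configuration:

```
input {
    s3 {
        access_key_id => "MY_KEY"
        secret_access_key => "MY_SECRET"
        bucket => "sthreetoes"
        region => "ap-south-1"
        # Join the lines of one pretty-printed object into a single event:
        # any line that does NOT start with "{" is appended to the previous line.
        codec => multiline {
            pattern => "^{"
            negate => true
            what => "previous"
            # Flush the last (still open) event after 2 seconds of no new lines.
            auto_flush_interval => 2
        }
    }
}
filter {
    # Parse the joined text into top-level fields on the event.
    json {
        source => "message"
        remove_field => ["message"]
    }
}
output {
    elasticsearch {
        hosts => "http://elasticsearch:9200"
        index => "candidate-data"
    }
}
```

With a layout like the sample object above, this should produce one event per file, with `id`, `first_name`, etc. as fields of a single Elasticsearch document. If the files instead contain one compact JSON object per line (NDJSON), the original `codec => "json"` (or `json_lines`) would be the better fit.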
