Urgent help for Elasticsearch and Logstash demo

Hi,

My curl -XPUT command works fine with its mapping; however, I'm having trouble loading the same file from S3 into AWS Elasticsearch. With the Logstash conf file below, the S3 JSON data ends up dumped into the "message" field in Elasticsearch. I need the data distributed across the mapping fields when I use the conf file to move the S3 data into Elasticsearch.
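
For reference, here is a minimal sketch of the kind of mapping call that works via curl (the index name and type are taken from the conf file below; the field names are placeholders only, not the actual mapping, and this assumes Elasticsearch 6.x since document_type is set):

curl -XPUT "https://vpc-xxxxxx.us-east-1.es.amazonaws.com/dldemo" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "doc": {
      "properties": {
        "field_one": { "type": "keyword" },
        "field_two": { "type": "long" }
      }
    }
  }
}'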

input {
  s3 {
    bucket => "my bucket"
    access_key_id => ""
    secret_access_key => ""
    region => "us-east-1"
  }
}
output {
  amazon_es {
    hosts => ["https://vpc-xxxxxx.us-east-1.es.amazonaws.com"]
    region => "us-east-1"
    aws_access_key_id => ''
    aws_secret_access_key => ''
    index => "dldemo"
    document_type => "doc"
  }
}

Please advise.

Never mind, I found the solution: I added a JSON codec to the s3 input, i.e. codec => json right after region => "us-east-1" in the input section.
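
For anyone hitting the same problem, a sketch of the working input section with that fix (bucket and credentials redacted as before; if each S3 file holds one JSON document per line, codec => json_lines would be needed instead):

input {
  s3 {
    bucket => "my bucket"
    access_key_id => ""
    secret_access_key => ""
    region => "us-east-1"
    # parse each object as JSON instead of dumping raw text into the message field
    codec => json
  }
}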

Are you missing a codec?

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-s3.html#plugins-inputs-s3-codec

And maybe this one?

https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-codec

Thanks for the help. It works.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.