Hello All,
Hoping someone can help, please.
I have a simple ELK stack running and am using the s3 input plugin to ingest from an endpoint with the following config:
input {
  s3 {
    bucket => "mybucket"
    endpoint => "https://<object_storage_namespace>.compat.objectstorage.<region>.oraclecloud.com"
    region => "uk-london-1"
    access_key_id => "************"
    secret_access_key => "*************"
    delete => false
    interval => 300 # seconds
    add_field => { "service" => "oci" }
    codec => "json"
  }
}

output {
  if [service] == "oci" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "logstash-oci-%{+YYYY.MM}"
    }
  }
}
I am not getting any errors in the Logstash logs, but I am seeing this in the Kibana Discover JSON document:
{
  "_index": "logstash-oci-2020.08",
  "_type": "_doc",
  "_id": "db-j-HMB8Q9kUzB1NjQQ",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-08-16T18:58:32.170Z",
    "message": "}\n",
    "service": "oci",
    "@version": "1",
    "tags": [
      "_jsonparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      "2020-08-16T18:58:32.170Z"
    ]
  },
  "sort": [
    1597604312170
  ]
}
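For comparison, this is roughly what I would expect _source to look like when the codec parses an event successfully: the parsed JSON fields merged into the document alongside the service field from add_field (the level and msg field names below are hypothetical, not from my actual data):

{
  "@timestamp": "2020-08-16T18:58:32.170Z",
  "level": "INFO",
  "msg": "sample entry",
  "service": "oci",
  "@version": "1"
}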
I am using the latest ELK and s3 input plugin versions.
In the object storage behind that endpoint, I have uploaded a file of JSON logs called samplelog.log, but for whatever reason the s3 plugin is not reading the log file correctly. As you can see, the message field contains only "}\n" and the tags include "_jsonparsefailure".
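Could the layout of the file itself be the problem? As far as I understand, the s3 input hands the codec one line at a time, so a file with one complete JSON object per line should parse, e.g. (hypothetical contents, not my actual log):

{"time": "2020-08-16T18:58:00Z", "level": "INFO", "msg": "first entry"}
{"time": "2020-08-16T18:58:05Z", "level": "INFO", "msg": "second entry"}

whereas a pretty-printed file like the sketch below would fail line by line, and its closing brace would surface exactly as the "}\n" I am seeing in the message field:

{
  "time": "2020-08-16T18:58:00Z",
  "level": "INFO",
  "msg": "first entry"
}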
Any ideas? Thanks.