Hi! This is basic stuff, but I cannot get it to work.
I have a directory containing a bunch of JSON files, with new ones arriving every 10 minutes. The files need to be read from the directory and output to Elasticsearch.
I want to read each JSON file and output it as a new document in Elasticsearch. E.g. file 1234567.json would end up in index "myindex" with document id "1234567" (the doc content should be the whole JSON from the file). And I want to use samplePollTime as the time field in ES/Kibana.
My current Logstash conf:
input {
  file {
    path => "/data-IN/*"
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    document_id => "%{doc_id}"
    index => "myindex"
  }
}
However, the stdout codec shows the JSON files being processed line by line: each line becomes its own event, with a timestamp added to every one. For example:
{
"host" => "myhost",
"path" => "data-IN",
"@timestamp" => 2018-01-13T15:28:18.306Z,
"@version" => "1",
"message" => " "content1": {"
}
So the content is being broken into smaller events instead of one document per file. How do I get each whole file as a single event?
The files are named like epochtimestamp.json and contain:
{
  "content1": {
    "value": 29.697
  },
  "content2": {
    "ip": "10.10.10.10"
  },
  "content3": {
    "host": "somehost.domain.ex"
  },
  "samplePollTime": "2018-01-13T14:58:05.352Z",
  "notes": "more notes here"
}
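For what it's worth, here is the direction I'm guessing at, an untested sketch: a multiline codec to read each file as a single event, the json filter to parse it, grok to pull doc_id out of the file name, and a date filter to map samplePollTime to @timestamp. The never-matching pattern string and the doc_id field name are my own assumptions:

```
input {
  file {
    path => "/data-IN/*.json"
    sincedb_path => "/dev/null"
    # Read the whole file as one event: the pattern never matches any line,
    # so every line is appended to the previous one until the timeout flushes.
    codec => multiline {
      pattern => "^THIS_LINE_NEVER_MATCHES"
      negate => true
      what => "previous"
      auto_flush_interval => 2
    }
  }
}
filter {
  # Parse the whole file content into top-level fields
  json { source => "message" }
  # Derive doc_id from the file name, e.g. /data-IN/1234567.json
  grok { match => { "path" => "%{NUMBER:doc_id}\.json$" } }
  # Use samplePollTime as the event timestamp in ES/Kibana
  date { match => ["samplePollTime", "ISO8601"] }
  # Drop the raw JSON string once parsed
  mutate { remove_field => ["message"] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex"
    document_id => "%{doc_id}"
  }
}
```

Is this roughly the right approach, or is there a cleaner way to do one-document-per-file?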