Hi,
I'm a new Elastic user trying to parse and index a JSON file with Logstash. The data does get indexed, but the fields are not mapped as expected: none of my fields are indexed individually, so I can only search the data as a single string.
Could someone help me figure out what I'm doing wrong?
Each object in my JSON file is on a single line (the file has thousands of objects). Here's an example:
{"_index":"test","_type":"log","_id":"jfdsjhadhjsdddshjkl","_score":1,"_source":{"@timestamp":"2019-01-07T02:02:21.567Z","beat":{"hostname":"ip-111-11-11-111","name":"ip-111-11-11-111","version":"1.0.0"},"field1":"test","field2":"test",..."fieldx":"test","fieldxy":{"__cachedRelations":{},"__data":{"field1":"test","field2":"test",..."fieldn":"test"},"__persisted":false,"__strict":false,"field1":"test"},"source":"test","type":"log"}}
My Logstash conf:
input {
  stdin {
    type => "stdin-type"
  }
  file {
    path => ["mypath/myfile.json"]
    sincedb_path => "nul"
    start_position => "beginning"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
I also tried adding a json filter (json { source => "message" }), but then I ran into a lot of mapping conflicts on the _id, _type and _index metadata fields that I could not find a way to resolve.
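In case it helps, here is a rough sketch of the filter variant I was considering. My understanding (which may be wrong) is that parsing the JSON into a target sub-field instead of the event root, and then dropping the exported metadata keys, should avoid the clash with Elasticsearch's reserved fields; the "doc" name is just a placeholder I picked:

```
filter {
  json {
    # Parse into a sub-field instead of the event root, so the
    # _index/_type/_id keys inside the data do not collide with
    # Elasticsearch's own metadata fields.
    source => "message"
    target => "doc"
  }
  mutate {
    # Remove the metadata that came from the original export;
    # "doc" is an illustrative name, not from my real config.
    remove_field => ["[doc][_index]", "[doc][_type]", "[doc][_id]", "[doc][_score]"]
  }
}
```

Is this roughly the right direction, or is there a better way to handle files exported with those metadata fields?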
Thanks