Hi,
This is my event log:
{
  "_index": "logstash-2020.05.03",
  "_type": "_doc",
  "_id": "xXKW2XEB43wd9fLrXQnQ",
  "_score": 1.0,
  "_source": {
    "@version": "1",
    "beat": {
      "version": "6.2.4",
      "name": "cb-rekha.local",
      "hostname": "cb-rekha.local"
    },
    "@timestamp": "2020-05-03T08:10:40.400Z",
    "thread": "Thread-2",
    "timestamp": "2020-05-02 22:20:56.155",
    "level": "INFO",
    "message": "Thread 13 is running",
    "offset": 4352,
    "context": "default",
    "mdc": {
      "cardNo": "123456789012934",
      "domain": "Mytest-domain",
      "ThreadNumber": "6"
    },
    "source": "/Users/cb-rekha/logging-poc/logs/app1.json",
    "logger": "xxxxxxxx"
  }
}
I'm trying to parse out only the "_source" object from this JSON, remove certain fields under _source, and then send the result to Elasticsearch.
I want to post the document to the specific index and with the document id taken from the input event, so that it updates documents that already exist in ES.
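To make the goal concrete: from the event above, the document I'd like to end up with in Elasticsearch is roughly the following (exactly which fields to drop is still flexible; here I've removed @version, @timestamp and beat just as an example):

{
  "thread": "Thread-2",
  "timestamp": "2020-05-02 22:20:56.155",
  "level": "INFO",
  "message": "Thread 13 is running",
  "offset": 4352,
  "context": "default",
  "mdc": {
    "cardNo": "123456789012934",
    "domain": "Mytest-domain",
    "ThreadNumber": "6"
  },
  "source": "/Users/cb-rekha/logging-poc/logs/app1.json",
  "logger": "xxxxxxxx"
}

written to index logstash-2020.05.03 with document id xXKW2XEB43wd9fLrXQnQ, so it overwrites the existing document.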
Below is my Logstash conf file:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["esToKafka"]
  }
}

filter {
  json {
    source => "message"
    target => "myjson"
    # drop the raw payload and the Logstash bookkeeping fields once parsed
    remove_field => ["message", "@version", "@timestamp"]
  }
  # Other filters I tried:
  # json {
  #   source => "_source"
  #   remove_field => ["@beat", "@version", "@timestamp"]
  # }
  # prune {
  #   whitelist_names => ["_source"]
  # }
  # json_encode {
  #   source => "_source"
  #   remove_field => ["_score", "_type"]
  # }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{_index}"
    document_id => "%{_id}"
    user => "admin"
    password => "admin"
    ssl => true
    ssl_certificate_verification => false
    ilm_enabled => false
  }
  stdout { }
}
I tried out different filters but couldn't extract the _source field or get the values for index and document_id to resolve correctly. Could someone please help? Thanks in advance.
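For reference, here is the direction I've been experimenting with, spelled out as a full pipeline. This is only a sketch and untested; the ruby filter, the [@metadata] field names (target_index, target_id) and the list of dropped fields are my own guesses, not something I've confirmed works:

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["esToKafka"]
  }
}

filter {
  # parse the whole Kafka payload into a temporary wrapper field
  json {
    source => "message"
    target => "myjson"
  }
  # stash the original index/id in @metadata (so they aren't indexed with
  # the document), then copy the _source fields up to the top level
  ruby {
    code => "
      event.set('[@metadata][target_index]', event.get('[myjson][_index]'))
      event.set('[@metadata][target_id]', event.get('[myjson][_id]'))
      src = event.get('[myjson][_source]')
      src.each { |k, v| event.set(k, v) } if src.is_a?(Hash)
    "
  }
  # drop the wrapper and the raw payload; this field list is illustrative
  mutate {
    remove_field => ["myjson", "message"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][target_index]}"
    document_id => "%{[@metadata][target_id]}"
    user => "admin"
    password => "admin"
    ssl => true
    ssl_certificate_verification => false
    ilm_enabled => false
  }
  stdout { }
}

My understanding is that a plain %{_index} in the output can't work with my current filter, because the json filter puts everything under the myjson target, so there is no top-level _index field on the event. Is that right, or am I missing something?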