Hello,
I have a problem with JSON events that are imported into Elasticsearch from Logstash.
The events come from MongoDB (via the JDBC input plugin) and are then imported into Elasticsearch.
After importing, the events have a JSON structure like this document:
{
  "_index" : "bb_purchaselog_2021.08",
  "_type" : "_doc",
  "_id" : "IVYXeHsBHLpkNlM2tct_",
  "_score" : 1.0,
  "_source" : {
    "document" : {
      "user_trophy" : 1068.0,
      "transaction_revenue" : 200000.0,
      "user_name" : "Yah",
      "user_coin" : 635.0,
      "transaction_category" : "Direct",
      "user_level" : 7.0,
      "transaction_state" : "Valid",
      "@timestamp" : "2021-08-24T12:06:33.356Z",
      "user_xp" : 1345.0,
      "product_name" : "P3",
      "transaction_id" : "D9D0C400CC3A80D2E3CEE86C5C9D7E6B",
      "user_id" : "60e1e20c447ee7267be7e368",
      "user_gem" : 315.0,
      "transaction_time" : 1629806793356
    },
    "@timestamp" : "2021-08-24T12:17:04.006Z",
    "type" : "bb_purchaselog"
  }
},
But Logstash nests all of my fields under a "document" object, which I want to remove so that the fields sit directly at the root of "_source", like this structure:
{
  "_index" : "bb_purchaselog_2021.08",
  "_type" : "_doc",
  "_id" : "IVYXeHsBHLpkNlM2tct_",
  "_score" : 1.0,
  "_source" : {
    "user_trophy" : 1068.0,
    "transaction_revenue" : 200000.0,
    "user_name" : "Yah",
    "user_coin" : 635.0,
    "transaction_category" : "Direct",
    "user_level" : 7.0,
    "transaction_state" : "Valid",
    "@timestamp" : "2021-08-24T12:06:33.356Z",
    "user_xp" : 1345.0,
    "product_name" : "P3",
    "transaction_id" : "D9D0C400CC3A80D2E3CEE86C5C9D7E6B",
    "user_id" : "60e1e20c447ee7267be7e368",
    "user_gem" : 315.0,
    "transaction_time" : 1629806793356,
    "@timestamp" : "2021-08-24T12:17:04.006Z",
    "type" : "bb_purchaselog"
  }
},
How can I do this in the filter stage, with the json filter plugin or another plugin?
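For example, I was wondering whether a ruby filter could flatten the nested object — a rough, untested sketch, assuming the nested object is always named "document":

```
filter {
  ruby {
    # Copy every key under "document" up to the event root,
    # then drop the now-redundant "document" object.
    code => '
      doc = event.get("document")
      if doc.is_a?(Hash)
        doc.each { |key, value| event.set(key, value) }
        event.remove("document")
      end
    '
  }
}
```

Alternatively, since my field names are fixed, maybe a mutate filter with rename (e.g. `rename => { "[document][user_id]" => "user_id" }`) would work too, though that would mean listing every field by hand. Is one of these the right approach, or is there a cleaner way?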