I'm collecting logs from a .log file:
{"type":"audit_entry","created":"5/20/2021, 11:12:42 PM","colaborador_id":"cf7dc62b-dde9-4980-89d8-96eb5707876e","request_method":"PUT","ajax":false,"route":"/stock/artigos/8c443bfe-d077-46d2-805b-948c15534f2c","protocol":"https"}
But I don't know how to extract all of these fields. The events are shipped into Elasticsearch, and the resulting document looks like this:
{
  "_index": "prod",
  "_type": "_doc",
  "_id": "8pWIpHkBSNn6S0YNSeBX",
  "_score": 1,
  "_source": {
    "@version": "1",
    "log": {
      "file": {
        "path": "/usr/share/prod/action.log"
      },
      "offset": 380990
    },
    "ecs": {
      "version": "1.6.0"
    },
    "agent": {
      "name": "ee0ccbef92f1",
      "type": "filebeat",
      "version": "7.11.0",
      "hostname": "ee0ccbef92f1",
      "ephemeral_id": "29f3e0e3-13be-40ac-abd9-35b688a5a177",
      "id": "5967388f-6559-4b97-87c7-79bb3b1a2179"
    },
    "message": "{\"type\":\"audit_entry\",\"created\":\"5/25/2021, 5:17:40 PM\",\"colaborador_id\":\"af4a57b5-c05f-446a-be65-97143d4629a5\",\"ip\":\"192.168.112.6\",\"request_method\":\"PUT\",\"ajax\":false,\"route\":\"/relacionamento/agendamento/c8492b33-4167-4466-abdd-d6729e38df2f\",\"protocol\":\"https\"}",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "@timestamp": "2021-05-25T17:17:42.678Z",
    "input": {
      "type": "log"
    },
    "host": {
      "name": "ee0ccbef92f1"
    }
  },
  "fields": {
    "@timestamp": [
      "2021-05-25T17:17:42.678Z"
    ]
  }
}
How can I parse the message field so that every field inside it (type, created, colaborador_id, request_method, and so on) becomes its own field in the indexed document?
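From what I've read, a Logstash json filter pointed at the message field might be the way to do this, since my events seem to go through a Logstash beats input. Below is only a sketch of what I have in mind and I haven't tested it; the port, hosts, and index name are placeholders, not my real setup:

input {
  beats {
    port => 5044    # placeholder port for the Filebeat connection
  }
}

filter {
  # Parse the JSON string stored in "message" into top-level event fields
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # placeholder host
    index => "prod"
  }
}

Is this the right approach, or should I instead be using something like Filebeat's decode_json_fields processor or an Elasticsearch ingest pipeline?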