I am using Logstash version 6.5.4. Data is read and parsed through the pipeline Filebeat -> Logstash -> Elasticsearch.
In Kibana I can see that, for a few documents, field values are duplicated: each affected field is converted into an array containing the same value repeated several times.
The message sent from Filebeat was:
2024-03-12 13:27:00.126,d85b4ecb-c4b7-4168-bcc1-9b6a3508ce4a,System,NotificationsProcessor,MQ_TO_JOBSERVICE,Tue Mar 12 13:27:00 IST 2024,TENANT,,null,SUCCESSFUL,94,126,'Task completed successfully',TESTING,null,'{"notificationsCount":31}'
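For context, the line above is parsed with a csv filter along these lines (a simplified sketch — the column names shown here are illustrative placeholders, not my exact config):

```
filter {
  csv {
    separator => ","
    # Column names are approximate; the real config maps all 16 fields.
    columns => [
      "created_timestamp", "request_id", "source", "component",
      "channel", "event_time", "execution_level", "col8",
      "col9", "status", "current_step_time_ms", "total_time_ms",
      "message_text", "tenant_mode", "col15", "jobSpecificMetaData"
    ]
    convert => {
      "current_step_time_ms" => "integer"
      "total_time_ms" => "integer"
    }
  }
}
```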
{
  "_version": 3,
  "_source": {
    "jobSpecificMetaData": [
      "{\"notificationsCount\":31}",
      "{\"notificationsCount\":31}",
      "{\"notificationsCount\":31}"
    ],
    "current_step_time_ms": 94,
    "request_id": [
      "d85b4ecb-c4b7-4168-bcc1-9b6a3508ce4a",
      "d85b4ecb-c4b7-4168-bcc1-9b6a3508ce4a",
      "d85b4ecb-c4b7-4168-bcc1-9b6a3508ce4a"
    ],
    "tenant_mode": [
      "TESTING",
      "TESTING",
      "TESTING"
    ],
    "total_time_ms": 126,
    "created_timestamp": [
      "2024-03-12 13:27:00.126",
      "2024-03-12 13:27:00.126",
      "2024-03-12 13:27:00.126"
    ],
    "@timestamp": "2024-03-12T07:57:02.887Z",
    "execution_level": [
      "TENANT",
      "TENANT",
      "TENANT"
    ],
    "status": [
      "SUCCESSFUL",
      "SUCCESSFUL",
      "SUCCESSFUL"
    ]
  }
}
I am unable to debug this issue. Any help would be appreciated, as it is breaking my dashboards and reporting.