Dear Community,
I have a Custom Logs integration active on my ELK stack which reads a file located at a specific path on the server where the Elastic Agent is installed.
Here is an example of the source data:
08DCC572EA626F63 2024-08-29T08:06:57.359Z 2024-08-29T08:06:57.361Z 1.1.1.1 25 0 1.1.1.1:62748 127.0.0.1 empty 1 SERVER Default Frontend SERVER
08DCC572EA626F64 2024-08-29T08:07:04.343Z 2024-08-29T08:07:04.344Z 1.1.1.1 587 0 1.1.1.1:62894 127.0.0.1 empty 1 SERVER Client Frontend SERVER
Here are the processors of my custom pipeline for this data stream:
[
  {
    "grok": {
      "field": "message",
      "patterns": [
        "%{WORD:Transaction} %{TIMESTAMP_ISO8601:Start_Transaction} %{TIMESTAMP_ISO8601:End_Transaction} %{IP:ServerIP} %{INT:ServerPort} %{INT:LoginState} %{HOSTPORT:ProxyServer} %{IP:RemoteIP} %{HTTPDUSER:UserName} %{INT:State} %{HOSTNAME:ServerName} %{GREEDYDATA:Connector}"
      ]
    }
  },
  {
    "set": {
      "field": "Exchange.Protocol",
      "value": "SMTP"
    }
  }
]
Unfortunately, the fields Start_Transaction and End_Transaction are stored as strings, not dates.
If I am not mistaken, TIMESTAMP_ISO8601 is recognized as a date format, so why are my fields stored as strings?
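For reference, I assume I could force the conversion with date processors like the following (an untested sketch, reusing the field names from my grok pattern above), but I would like to understand why it is not automatic:
[
  {
    "date": {
      "field": "Start_Transaction",
      "target_field": "Start_Transaction",
      "formats": ["ISO8601"]
    }
  },
  {
    "date": {
      "field": "End_Transaction",
      "target_field": "End_Transaction",
      "formats": ["ISO8601"]
    }
  }
]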
My ELK stack is at version 8.15.
Best Regards, Edouard Fazenda.