Hi,
I have a strange case with my Logstash process.
I have a Filebeat instance that ships a log file to Logstash. That log file is written by an old proxy system in the following manner:
logs from the last 15 minutes are collected and sent to a file server -> on that server, Filebeat reads the data and sends it to Logstash. After parsing, the logs are sent to the ES cluster.
Here is an example of the text to be parsed (an access-log line from the old proxy):
2023/05/15|10:32:56|Greedy Text
Here is my input on Logstash:
input {
  beats {
    port => "24562"
  }
}
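For completeness, the Filebeat side feeding that input would look roughly like this (a sketch only; the log path is an assumption, and the host must point at the Logstash machine with the port from the beats input above):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/proxy/access.log   # assumed location of the proxy log on the file server

output.logstash:
  hosts: ["logstash-host:24562"]    # must match the beats input port
```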
Here is my Logstash filter:
filter {
  grok {
    match => { "message" => "(?<Date>[^|]*)\|%{TIME:Timer}\|%{GREEDYDATA}" }
    # after the match I have the fields Date: "2023/06/20" and Timer: "23:24:34", plus the original "message"
    break_on_match => false
  }
  mutate {
    # combine date and time; the new field has the structure "2023/06/20 23:24:34"
    add_field => { "Event_date" => "%{Date} %{Timer}" }
  }
  # then I add a new field (TimeStamp) that converts Event_date (which is in my local time zone) to UTC (the plain format ES expects)
  date {
    match => ["Event_date", "yyyy/MM/dd HH:mm:ss"]
    timezone => "Europe/Berlin"
    target => "TimeStamp"
    add_field => { "debug" => "TimeStampworks" } # control field; will be deleted in production
  }
}
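For reference, the combined grok + mutate + date logic can be sketched outside Logstash. This is a minimal Python sketch mirroring my pipeline (not the Logstash implementation itself; the regex is a simplified stand-in for the grok pattern):

```python
import re
from datetime import datetime
from zoneinfo import ZoneInfo

def parse_line(message: str) -> dict:
    """Mimic the grok + mutate + date filters from the pipeline above."""
    # grok: (?<Date>[^|]*)\|%{TIME:Timer}\|%{GREEDYDATA}
    m = re.match(r"(?P<Date>[^|]*)\|(?P<Timer>\d{2}:\d{2}:\d{2})\|.*", message)
    if m is None:
        return {"message": message, "tags": ["_grokparsefailure"]}
    event_date = f"{m['Date']} {m['Timer']}"  # mutate: add_field Event_date
    # date filter: interpret Event_date as Europe/Berlin local time ...
    local = datetime.strptime(event_date, "%Y/%m/%d %H:%M:%S").replace(
        tzinfo=ZoneInfo("Europe/Berlin"))
    return {
        "message": message,
        "Event_date": event_date,
        # ... and store the UTC equivalent in TimeStamp, as ES expects
        "TimeStamp": local.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%dT%H:%M:%S.000Z"),
    }

print(parse_line("2023/06/20|23:24:34|MY Text")["TimeStamp"])  # → 2023-06-20T21:24:34.000Z
```

Berlin is UTC+2 in June (CEST), so 23:24:34 local becomes 21:24:34 UTC, matching the rubydebug output below.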
so far so good.
TEST1 (OK): Logstash started manually, with output to the console (stdout { codec => rubydebug }):
{
     "TimeStamp" => 2023-06-20T21:24:34.000Z,
       "message" => "2023/06/20|23:24:34|MY Text",
    "Event_date" => "2023/06/20 23:24:34",
         "debug" => "TimeStampworks",
    "@timestamp" => 2023-06-26T06:25:30.607Z
}
TEST2 (OK): Logstash started manually, with output to the console (stdout { codec => rubydebug }) and to Elasticsearch:
In the console:
{
     "TimeStamp" => 2023-06-20T21:24:34.000Z,
       "message" => "2023/06/20|23:24:34|MY Text",
    "Event_date" => "2023/06/20 23:24:34",
         "debug" => "TimeStampworks",
    "@timestamp" => 2023-06-26T06:25:30.607Z
}
In Kibana (OK): the document appears and I can filter by TimeStamp.
TEST3 (FAIL): Logstash started as a daemon: the logs arrive in Kibana, but without the TimeStamp field and without the debug field.
Here are my mappings:
"properties": {
"debug": {
"type": "text"
},
"Event_date": {
"format": "strict_date_optional_time||epoch_millis||yyyy/MM/dd HH:mm:ss||yyyy/MM/dd'T'HH:mm:ss.SSSSX",
"index": true,
"ignore_malformed": true,
"store": false,
"type": "date",
"doc_values": true
},
"message": {
"type": "text"
},
"TimeStamp": {
"format": "date_optional_time||strict_date_optional_time",
"index": true,
"ignore_malformed": true,
"store": false,
"type": "date",
"doc_values": true
}
"debug": {
"type": "text"
},
}
Any idea why TimeStamp doesn't appear in Kibana?
I checked the date format, and based on the article [format | Elasticsearch Guide [8.8] | Elastic],
the date_optional_time or strict_date_optional_time format should be enough:
A generic ISO datetime parser, where the date must include the year at a minimum, and the time (separated by T), is optional. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd.
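As a sanity check that the value the date filter emits is plain ISO-8601 (the shape strict_date_optional_time accepts), a small Python sketch using the TimeStamp value from the rubydebug output:

```python
from datetime import datetime, timezone

# The value the date filter writes into TimeStamp, as seen in the console output.
value = "2023-06-20T21:24:34.000Z"

# Parse it as ISO-8601 with a UTC designator; %z accepts the trailing "Z".
parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f%z")
print(parsed.isoformat())  # → 2023-06-20T21:24:34+00:00
```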
Thanks for any valuable input.
Regards,
Karl Wolf