Problem with date field in Logstash

hi,

I have a strange case with my Logstash pipeline.
Filebeat sends a log file to Logstash. That log file is written by an old proxy system in the following manner:
logs from the last 15 minutes are gathered and sent to a file server -> on that server, Filebeat reads the data and sends it to Logstash. After parsing, the logs are sent to the ES cluster.
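
For reference, the Filebeat side is just a plain log input pointing at those 15-minute dump files; the path and host below are placeholders, not my real values:

filebeat.inputs:
  - type: log
    paths:
      - /data/proxy/access-*.log      # placeholder path for the proxy dump files
output.logstash:
  hosts: ["logstash-host:24562"]      # same port as the beats input below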

Here is an example of the text to parse (it's the old proxy access log):

2023/05/15|10:32:56|Greedy Text

Here is my Logstash input:

input {
  beats {
    port => "24562"
  }
}

Here is my Logstash filter:

filter {
  grok {
    match => { "message" => "(?<Date>[^|]*)\|%{TIME:Timer}\|%{GREEDYDATA}" }
    # after the match I have the fields Date: "2023/06/20" and Timer: "23:24:34", plus the original "message"
    break_on_match => false
  }
  mutate {
    # here I combine date and time; the new field looks like this: "2023/06/20 23:24:34"
    add_field => { "Event_date" => "%{Date} %{Timer}" }
  }
  # then I add a new field (TimeStamp) that parses Event_date (which is in my local time zone) and converts it to UTC (the native format for ES)
  date {
    match => ["Event_date", "yyyy/MM/dd HH:mm:ss"]
    timezone => "Europe/Berlin"
    target => "TimeStamp"
    add_field => { "debug" => "TimeStampworks" } # control field - will be removed in production
  }
}
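
The output section is not shown above; for the tests below it is roughly the following sketch (the Elasticsearch host is a placeholder, and the data stream name is only a guess based on the backing index shown further down):

output {
  stdout { codec => rubydebug }            # console output used in TEST 1 and TEST 2
  elasticsearch {
    hosts  => ["https://es-node:9200"]     # placeholder host
    index  => "proxy-alias"                # data stream name (inferred from .ds-proxy-alias-*)
    action => "create"                     # data streams only accept the create action
  }
}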

So far so good.
TEST 1 (OK): Logstash started manually, output to the console only (stdout { codec => rubydebug }):

{
     "TimeStamp" => 2023-06-20T21:24:34.000Z,
       "message" => "2023/06/20|23:24:34|MY Text",
    "Event_date" => "2023/06/20 23:24:34",
         "debug" => "TimeStampworks",
    "@timestamp" => 2023-06-26T06:25:30.607Z
}

TEST 2 (OK): Logstash started manually, output to the console (stdout { codec => rubydebug }) and to Elasticsearch.
In the console:

{
     "TimeStamp" => 2023-06-20T21:24:34.000Z,
       "message" => "2023/06/20|23:24:34|MY Text",
    "Event_date" => "2023/06/20 23:24:34",
         "debug" => "TimeStampworks",
    "@timestamp" => 2023-06-26T06:25:30.607Z
}

In Kibana (OK): the document is there and I can filter by TimeStamp.

TEST 3 (problem): Logstash started as a daemon: the logs are in Kibana, but without the TimeStamp field and without the debug field.
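
For context on what differs between the two runs: manually I start Logstash with an explicit config file, while the service uses whatever pipelines.yml points at. The file name below is a placeholder; the paths are the standard package defaults, assuming nothing was customized:

# manual test run
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/proxy.conf

# service run: systemd starts Logstash without -f, so it reads /etc/logstash/pipelines.yml,
# which by default loads every file matching:
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"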

Here are my mappings:

"properties": {
    "debug": {
      "type": "text"
    },
    "Event_date": {
      "format": "strict_date_optional_time||epoch_millis||yyyy/MM/dd HH:mm:ss||yyyy/MM/dd'T'HH:mm:ss.SSSSX",
      "index": true,
      "ignore_malformed": true,
      "store": false,
      "type": "date",
      "doc_values": true
    },
    "message": {
      "type": "text"
    },
     "TimeStamp": {
      "format": "date_optional_time||strict_date_optional_time",
      "index": true,
      "ignore_malformed": true,
      "store": false,
      "type": "date",
      "doc_values": true
    }
	"debug": {
        "type": "text"
    },
}
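
The mapping actually applied to the backing indices can be double-checked in Kibana Dev Tools; the data stream name here is only inferred from the backing index names:

GET proxy-alias/_mapping
GET proxy-alias/_field_caps?fields=TimeStamp,debug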

Any idea why TimeStamp doesn't appear in Kibana?
I checked the date format, and based on the article [format | Elasticsearch Guide [8.8] | Elastic],

the date_optional_time or strict_date_optional_time format should be enough:
A generic ISO datetime parser, where the date must include the year at a minimum, and the time (separated by T), is optional. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd.

Thanks for any valuable input.

Regards,

Karl Wolf

You need to provide context on what you have in Kibana. Please share a screenshot and also the JSON document from Kibana that should have the TimeStamp field but does not.

Hello leandrojmp,
Thanks for your question.

All my logs are pushed to ES and saved as a data stream. I started pushing my logs to ES, and after a while I decided to add a new field named TimeStamp, where the date is formatted in the desired time zone.

Based on that data stream I created two Kibana data views:
Event_date - the first, where Event_date is used as the time the event occurred
TimeStamp - the second, where my new formatted field is used as the time of the event
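
Roughly, the two data views differ only in the time field; via the Kibana data views API they would look something like this (the Kibana host is a placeholder, and the title is again my guess at the data stream name):

curl -X POST "http://kibana-host:5601/api/data_views/data_view" \
  -H "kbn-xsrf: true" -H "Content-Type: application/json" \
  -d '{ "data_view": { "title": "proxy-alias", "name": "TimeStamp view", "timeFieldName": "TimeStamp" } }'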

So I pushed some test data and then compared the data views. Here is the same document as seen from the two different data views:
1. Document from the Event_date view:

{
  "@timestamp": [
    "2023-06-26T08:15:40.607Z"
  ],
  "debug": [
    "TimeStampworks"
  ],
  "debug.keyword": [
    "TimeStampworks"
  ],
  "Event_date": [
    "2023-06-20T23:24:34.000Z"
  ],
  "message": [
    "2023/06/20|23:24:34|GREEDY_TEXT"
  ],
  "message.keyword": [
    "2023/06/20|23:24:34|GREEDY_TEXT"
  ],
  "TimeStamp": [
    "2023-06-20T21:24:34.000Z"
  ],
  "_id": "eZnF9ogBS-0ti3Wf4K0b",
  "_index": ".ds-proxy-alias-2023.06.26-000011",
  "_score": null
}

2. Document from the TimeStamp view:

{
  "@timestamp": [
    "2023-06-26T08:15:40.607Z"
  ],
  "debug": [
    "TimeStampworks"
  ],
  "debug.keyword": [
    "TimeStampworks"
  ],
  "Event_date": [
    "2023-06-20T23:24:34.000Z"
  ],
  "message": [
    "2023/06/20|23:24:34|GREEDY_TEXT"
  ],
  "message.keyword": [
    "2023/06/20|23:24:34|GREEDY_TEXT"
  ],
    "TimeStamp": [
    "2023-06-20T21:24:34.000Z"
  ],
  "_id": "eZnF9ogBS-0ti3Wf4K0b",
  "_index": ".ds-proxy-alias-2023.06.26-000011",
  "_score": null
}

3. And now a document which is not present in the TimeStamp data view, but is available in the Event_date data view (this happens when Logstash is started as a service via systemctl, not from the terminal console):

{
  "@timestamp": [
    "2023-06-26T07:16:14.637Z"
  ],
  "Event_date": [
    "2023-06-20T23:24:34.000Z"
  ],
  "message": [
    "2023/06/20|23:24:34|GREEDY_TEXT"
  ],
  "message.keyword": [
    "2023/06/20|23:24:34|toku-toku12|GREEDY_TEXT"
  ],
  "_id": "1L-P9ogBQeZPYTaYdtf3",
  "_index": ".ds-proxy-alias-2023.06.26-000011",
  "_score": null
}
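
For reference, the raw document can also be fetched directly from the backing index in Dev Tools (index and _id taken from the document above), to confirm the field is really missing from the source and not just hidden by the data view:

GET .ds-proxy-alias-2023.06.26-000011/_doc/1L-P9ogBQeZPYTaYdtf3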

One thing I am thinking about is the fact that in the beginning I pushed logs to ES without the new fields debug and TimeStamp. Should I, just to be sure, push the logs to a completely new data stream?

Thanks
KarlWolf

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.