Trying to parse this date field

Yes, I am using:
curl -X DELETE 'http://localhost:9200/_all'
to clear everything each time.

I'm gonna retry. I'll keep you updated on what happens.

Thank you

Makes much more sense.
This is going to make my life easier while debugging, thanks! I'm going to give it a shot now.

Starting up logstash is pretty expensive. You may find it easier to do

<logstash executable path> -r --path.settings <logstash config directory> -f <pipeline config file path> --debug

The -r tells it to re-read the config file every time it changes; you can then feed it another line on stdin after the reload.
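
For example, with the paths used elsewhere in this thread (assuming they match your install), that would be something like:

/usr/share/logstash/bin/logstash -r --path.settings /etc/logstash -f /etc/logstash/conf.d/logstash.conf --debug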


Okay, this is officially driving me crazy.
I can see that the date is failing to parse in the generated JSON, but there are no related entries in the Logstash logs.
I also tried all the different combinations I can think of. Can you guys please take a look?

Here's my filter:

input {
  stdin {
    id => "my_plugin_id"
  }
}


filter {
  csv {
    columns => [
      "logdate",
      "ppoeip",
      "userip",
      "protocol",
      "logserverip",
      "destinationip",
      "remove1",
      "remove2",
      "sourceport",
      "destinationport",
      "description-url"
    ]
    separator    => ";"
    remove_field => ["remove1", "remove2"]
  }
  date {
    match  => [ "logdate", "MMM dd, yyyy HH:mm:ss.SSS ZZZ" ]
    target => "logdate"
  }
}


output {
  stdout { }
}

Here's my input:

echo "May 31, 2018 09:35:48.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello" | /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/logstash.conf --debug

Here's the output:

Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
{
             "ppoeip" => "172.25.35.1",
        "logserverip" => "172.25.35.24",
            "message" => "May 31, 2018 09:35:48.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello",
    "description-url" => "Client Hello",
               "tags" => [
        [0] "_dateparsefailure"
    ],
      "destinationip" => "216.58.201.202",
         "sourceport" => "8897",
    "destinationport" => "8897",
           "protocol" => "SSL",
         "@timestamp" => 2018-06-06T17:54:27.793Z,
            "logdate" => "May 31, 2018 09:35:48.979371597 EEST",
           "@version" => "1",
               "host" => "Iclik-LOG",
             "userip" => "10.8.104.200"
}

And finally, here's the log:

[2018-06-06T20:54:27,459][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@convert = {}
[2018-06-06T20:54:27,460][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autodetect_column_names = false
[2018-06-06T20:54:27,466][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"date", :type=>"filter", :class=>LogStash::Filters::Date}
[2018-06-06T20:54:27,479][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@match = ["logdate", "MMM dd, yyyy HH:mm:ss.SSS ZZZ"]
[2018-06-06T20:54:27,479][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@target = "logdate"
[2018-06-06T20:54:27,479][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@id = "82ee11addf342f61e7d743aadc55cc4207e69cfe-3"
[2018-06-06T20:54:27,479][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@enable_metric = true
[2018-06-06T20:54:27,479][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@add_tag = []
[2018-06-06T20:54:27,480][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@remove_tag = []
[2018-06-06T20:54:27,480][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@add_field = {}
[2018-06-06T20:54:27,480][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@remove_field = []
[2018-06-06T20:54:27,480][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-06-06T20:54:27,492][DEBUG][org.logstash.filters.DateFilter] Date filter with format=MMM dd, yyyy HH:mm:ss.SSS ZZZ, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-06-06T20:54:27,508][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2018-06-06T20:54:27,514][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2018-06-06T20:54:27,517][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_ed456ced-9011-4c54-8989-4fce52c275e2"
[2018-06-06T20:54:27,643][DEBUG][logstash.filters.csv     ] CSV parsing options {:col_sep=>";", :quote_char=>"\""}
[2018-06-06T20:54:27,645][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
[2018-06-06T20:54:27,816][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@version"=>"1", "host"=>"Iclik-LOG", "@timestamp"=>2018-06-06T17:54:27.793Z, "message"=>"May 31, 2018 09:35:48.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello"}}
[2018-06-06T20:54:27,817][DEBUG][logstash.filters.csv     ] Running csv filter {:event=>2018-06-06T17:54:27.793Z Iclik-LOG May 31, 2018 09:35:48.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello}
[2018-06-06T20:54:27,858][DEBUG][logstash.filters.csv     ] filters/LogStash::Filters::CSV: removing field {:field=>"remove1"}
[2018-06-06T20:54:27,859][DEBUG][logstash.filters.csv     ] filters/LogStash::Filters::CSV: removing field {:field=>"remove2"}
[2018-06-06T20:54:27,859][DEBUG][logstash.filters.csv     ] Event after csv filter {:event=>2018-06-06T17:54:27.793Z Iclik-LOG May 31, 2018 09:35:48.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello}
[2018-06-06T20:54:27,893][DEBUG][logstash.pipeline        ] output received {"event"=>{"ppoeip"=>"172.25.35.1", "logserverip"=>"172.25.35.24", "message"=>"May 31, 2018 09:35:48.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello", "description-url"=>"Client Hello", "tags"=>["_dateparsefailure"], "destinationip"=>"216.58.201.202", "sourceport"=>"8897", "destinationport"=>"8897", "protocol"=>"SSL", "@timestamp"=>2018-06-06T17:54:27.793Z, "logdate"=>"May 31, 2018 09:35:48.979371597 EEST", "@version"=>"1", "host"=>"Iclik-LOG", "userip"=>"10.8.104.200"}}

Now I know that, according to the docs, EEST is not a supported timezone. But what am I supposed to do in this case?

Thanks again!

Transform it into something that has the same offset from UTC but is supported. According to Wikipedia, Moscow time has the same UTC offset as EEST.

mutate { gsub => [ "logdate", "EEST", "Europe/Moscow" ] }
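
That goes in the filter block before the date filter, so the substitution runs first. Roughly, based on the filter you posted above (same date pattern, just a sketch):

filter {
  csv { ... }   # your csv filter from above, unchanged
  mutate { gsub => [ "logdate", "EEST", "Europe/Moscow" ] }
  date {
    match  => [ "logdate", "MMM dd, yyyy HH:mm:ss.SSS ZZZ" ]
    target => "logdate"
  }
}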

AWESOME. That solved the problem.
The date is parsed correctly.

However, unlike the original timestamp, logdate still appears as a string and not as a date field for some reason. Is there any missing step?

You mean in Kibana? Did you start with an empty index and refresh the fields of the index pattern?

Yes, in Kibana.

I did the following:

curl -X DELETE 'http://localhost:9200/_all'
rm /var/lib/filebeat/registry

And then started Logstash, then Filebeat.

Go to Management in the left nav, then Index Patterns, then find your pattern and do a refresh at the top right.

The whole index was deleted and I added the filebeat-* indexes.
But anyway, I tried to refresh the index patterns. Unfortunately, it says it's a string :confused:

What does output { stdout { codec => rubydebug } } produce?

From the logs? Here's what I've got after adding your output setting:

[2018-06-07T20:00:54,051][DEBUG][logstash.pipeline ] output received {"event"=>{"ppoeip"=>"172.25.35.1", "logserverip"=>"172.25.35.24", "offset"=>127, "input_type"=>"log", "source"=>"/var/capture/faridtest/farid.csv", "message"=>"May 31, 2018 09:35:45.979371597 EEST;172.25.35.1;10.8.104.200;SSL;172.25.35.24;216.58.201.202;27786;443;8897;8897;Client Hello", "type"=>"iclick-tcplog", "tags"=>["beats_input_codec_plain_applied"], "description-url"=>"Client Hello", "destinationip"=>"216.58.201.202", "sourceport"=>"8897", "destinationport"=>"8897", "protocol"=>"SSL", "@timestamp"=>2018-06-07T17:00:45.711Z, "logdate"=>2018-05-31T06:35:45.979Z, "@version"=>"1", "beat"=>{"name"=>"Iclik-LOG", "hostname"=>"Iclik-LOG", "version"=>"5.6.9"}, "host"=>"Iclik-LOG", "userip"=>"10.8.104.200"}}

OK, there are no quotes around the value, so at that point it is a logstash timestamp, not a string. Do you have a template that matches that index? If not, I suspect you are not actually deleting the index. (Maybe you are deleting all the docs, but leaving metadata behind?) Can you delete the index by name?
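
For example, something like this (the exact index name is assumed here; list the real ones first):

curl 'http://localhost:9200/_cat/indices?v'
curl -X DELETE 'http://localhost:9200/filebeat-2018.06.07'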

If you run something like this, it shows up as a date, right?

input { generator { message => '2018/05/11 12:34:56' count => 1 } }
filter {
    date { match => [ "message", "yyyy/MM/dd HH:mm:ss" ] target => "foo" remove_field => [ "host", "message" ] }
}
output { elasticsearch { index => "deleteme" hosts => [ "localhost" ] } }

Then clean up with:

DELETE /deleteme
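
That is the Kibana Console syntax; the equivalent with curl would be:

curl -X DELETE 'http://localhost:9200/deleteme'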

Doesn't this mean that all indexes are deleted?
This appears every time I run:

curl -X DELETE 'http://localhost:9200/_all'

PS: foo did appear as a date and not as a string.

Could another instance be loading data that is a string? If you go to Discover, query for exists: logdate, and then Add logdate, do you see what you expect? I ran it in as a string first, then as a date (which gets converted to a string since the type is already set). And I see this:

[screenshot]

It might be worth starting a new question, taking filebeat out of the picture and using a minimal configuration like

input { generator { message => 'May 31, 2018 09:35:45.979371597 EEST' count => 1 } }
filter {
    mutate { gsub => [ "message", "EEST", "Europe/Moscow" ] }
    date { match => [ "message", "MMM dd, yyyy HH:mm:ss.SSSSSSSSS ZZZ"] target => "logdate" }
}
output { stdout { codec => rubydebug } }
output { elasticsearch {} }

but I think you will just get told that you either didn't delete the index, or didn't refresh the fields.
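
If logdate still comes back as text after that, one way to check what the fresh index actually mapped it to (index name assumed here; the elasticsearch output above defaults to a logstash-* index):

curl 'http://localhost:9200/logstash-*/_mapping?pretty'

If that shows logdate as a date in the new index, whatever is left is on the Kibana index-pattern side.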

I'll give it a fresh start and get back with the result.
Since Logstash is parsing correctly, as you said, it must be something with the index.

I'll let you know, to keep track in case anyone has a similar problem in the future.

Thanks again!

No matter what I do, it's always a string. *Sighs*

Can you please take a look at my output configuration? Am I missing something here?

output {
  elasticsearch {
    hosts => ["localhost:9200"]
#    sniffing => true
#    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

Finally, it worked after I upgraded the whole stack from 5.x to 6.x.
Not sure if that was the reason or it magically solved the problem, but it worked.

Thank you all for your awesome support

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.