Can't parse timestamp to @timestamp

Hello folks,
I'm sorry for asking this question; I saw a lot of topics about the same problem, but none of them was helpful in my case.

In Logstash I would like to parse my timestamp into @timestamp, but I receive a _dateparsefailure.

This is my Log:
129.70.76.43 - - [03/Jan/2017:14:39:38 +0100] "GET /resources/mail.png HTTP/1.1" 304 181 "http://www.jrabar.kuchen/qukkkkkkmaneeent/weirtecung-rofldiekatz.html" "Mozilla/5.0 (Windows NT 6.1; rv:50.0) Gecko/20100101 Firefox/50.0"

This is my config for Logstash:
...input ....

filter {
    if [type] == "apache-access" {
            grok {
                    match => { "message" => "%{COMBINEDAPACHELOG}" }
            }
            date {
                    match => [ "timestamp", "dd/MM/YYYY:HH:mm:ss Z" ]
            }
    }
}
output {

    stdout { }
    elasticsearch {
            
     }
}

I know that my log file has three letters for the month. Shouldn't the matching option be something like this?
match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
If I use MMM, I receive a _grokparsefailure.

Can someone give me a hint?

Just to confirm my understanding: parsing the timestamp with the date filter is useful so that the event's date in the log is the same as in Elasticsearch. Furthermore, it gives me the possibility to search by it; am I right?

In addition, is there a video tutorial somewhere about "how to use grok/how to parse and index logs"? To me, all the regular expressions look highly complicated. Even with my "Logstash" book and the Grok Debugger I feel like I'm "brute-forcing" a correct configuration for the filter. :disappointed:

I know that my log file has three letters for the month. Shouldn't the matching option be something like this?

Yes, you need MMM.
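For reference, the corrected date filter would look something like this (a sketch; note yyyy rather than YYYY — in the Joda-Time patterns the date filter uses, YYYY means "week year" and can give surprising results around New Year):

```
date {
    # "timestamp" is the field extracted by %{COMBINEDAPACHELOG},
    # e.g. 03/Jan/2017:14:39:38 +0100
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
}
```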

If I use MMM, I receive a _grokparsefailure

No. Changing the date filter configuration does not affect the preceding grok filter. Show us the event that got the _grokparsefailure tag. You can copy/paste from the JSON tab in Kibana.
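One way to isolate such events is a conditional output that prints only the failed ones — a sketch, assuming grok's default _grokparsefailure tag:

```
output {
    # Print only events that grok failed to parse, in a readable form
    if "_grokparsefailure" in [tags] {
        stdout { codec => rubydebug }
    }
}
```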

I changed the config to match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]

{
  "_index": "logstash-2017.02.08",
  "_type": "apache-access",
  "_id": "AVocvv1OD35rHpO5agFt",
  "_score": null,
  "_source": {
"path": "/home/grzechca/log/access/neuAccess.log",
"@timestamp": "2017-02-08T08:02:43.028Z",
"@version": "1",
"host": "elkelastic",
"message": "91.55.213.178 - - [03/Jan/2017:14:39:56 +0100] \"GET /images/jv_employer/1/6/4/5/3/logo_image_path/standart/blabla.png HTTP/1.1\" 200 11220 \"http://www.jrabar.kuchen/qukkkkkkmaneeent/biologie-life-sciences/wissenschaftlrofldiekatz.html\" \"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 S",
"type": "apache-access",
"tags": [
  "_grokparsefailure"
]
  },
  "fields": {
"@timestamp": [
  1486540963028
]
  },
  "sort": [
1486540963028
  ]
}

The log line in your example ends like this:

"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 S

This is clearly garbage. The grok filter fails because the double quote is never closed. Double check what the input file looks like.
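If such truncated lines are expected in the input, one option is to drop the events grok could not parse so they never reach Elasticsearch — a sketch using the drop filter:

```
filter {
    # Discard events that failed grok parsing (e.g. truncated log lines)
    if "_grokparsefailure" in [tags] {
        drop { }
    }
}
```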

Well, sorry about that. It seems that only the events that failed to parse, like my example, get listed. All the other events, with log lines like this one,

193.196.237.20 - - [03/Jan/2017:15:01:01 +0100] "GET /resources/zu_styles/om/ghrt920.jpg?1454460 HTTP/1.1" 304 182 "http://www.jrabar,kuchen/resources/jr_styles/stylesheets/styles.css?v=20161231" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:50.0) Gecko/20100101 Firefox/50.0"

go through stdout just fine, but I can't see them in Kibana >.<

Are you looking at the correct date range in Kibana? Are you sure they aren't ending up in the wrong index? Is there anything interesting in the Logstash logs?

I am really thankful for your hint.

Some log lines were malformed, and that's why I only saw the "_grokparsefailure" events in Kibana. Because of that, I thought I had made a mistake.

In addition, the timestamp now works for the correctly parsed events.
Thank you very much for your fast response, and sorry for "spamming" the forum with this low-level issue.
