My log has the timestamp split into fields; I need to join them into a single timestamp field


(Marcelo) #1

Hi guys,

In my application log the fields are separated by the ";" character:

ab929caa-1c44-4a1d-bf76-c43e7a5fa4ca;rf;POST;https://my.domain.com/system/list.seam;2017;07;20;11;56;25;586;4;328;172.18.98.33;ajp-internalserver.tjpe.gov.br%2F111.222.333.444-8009-18;true;8DCF7189317EA31845A198C0F6CBC0D1.saplpje1gi01-1g;665613;myuser;profileinfo;Mozilla/5.0 (Windows NT 6.1${s} rv:27.0) Gecko/20100101 Firefox/27.0;

In that line, 2017;07;20;11;56;25;586 is year;month;day;hour;minute;second;millisecond.

My grok filter splits them into fields:
^%{DATA:uuid};%{WORD:request_status};%{WORD:method};%{URI:url};%{YEAR:year};%{MONTHNUM:month};%{MONTHDAY:day};%{HOUR:hour};%{MINUTE:minute};%{SECOND:second};%{INT:millisecond};(?:%{INT:Nao_sei});(?:%{INT:elapsed_time_request}|);%{IPV4:ip_solicitante};%{DATA:thread};%{DATA:session};%{DATA:session_id};%{DATA:user_id};%{DATA:user_name};%{DATA:profile};%{DATA:agent};
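As an alternative sketch (not from the original post): the whole date/time run could be captured as one field in grok and handed to the date filter directly, so no mutate join is needed afterwards. The field names log_ts and rest here are hypothetical:

```conf
grok {
  # (?<log_ts>...) captures the seven ;-separated values as a single field;
  # %{GREEDYDATA:rest} stands in for the remaining fields of the original pattern
  match => { "message" => '^%{DATA:uuid};%{WORD:request_status};%{WORD:method};%{URI:url};(?<log_ts>%{YEAR};%{MONTHNUM};%{MONTHDAY};%{HOUR};%{MINUTE};%{SECOND};%{INT});%{GREEDYDATA:rest}' }
}
date {
  # Joda-style pattern: the literal ';' separators match the captured string as-is
  match    => [ "log_ts", "yyyy;MM;dd;HH;mm;ss;SSS" ]
  timezone => "UTC"  # assumption: the log times are UTC
}
```

This avoids creating seven temporary fields that only exist to be concatenated again.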

But I need to join them into a single timestamp field. I'm trying to join them with mutate:

 mutate {
     add_field => {
       "timestamp" => "%{year}-%{month}-%{day}T%{hour}:%{minute}:%{second}.%{millisecond}Z"
     }
 }

I receive the tag _dateparsefailure on all messages, and Kibana shows two timestamp fields, timestamp and @timestamp, with different values.

@timestamp, formatted as July 23rd 2017, 14:12:39.455, is when the beat host sends the data to my Logstash server.

timestamp, formatted as 2017-7-23T14:12:38.466Z, comes from the log events; this is the field I need to represent the original timestamp.
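A minimal sketch of the combine-and-parse approach, assuming the zero-padded values from the log line above and using the field names from the post (the cleanup step is optional):

```conf
filter {
  mutate {
    add_field => {
      # Build an ISO 8601-style string from the grok captures
      "timestamp" => "%{year}-%{month}-%{day}T%{hour}:%{minute}:%{second}.%{millisecond}Z"
    }
  }
  date {
    # Explicit Joda pattern: 'T' and 'Z' are quoted literals, so the
    # timezone option is needed to tell the filter the time is UTC
    match    => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" ]
    timezone => "UTC"
    target   => "@timestamp"
  }
  mutate {
    # Drop the intermediate fields once parsing succeeds
    remove_field => [ "year", "month", "day", "hour", "minute", "second", "millisecond" ]
  }
}
```

Note that the date pattern must match the built string character for character; any mismatch (missing zero-padding, a field-name typo leaving a literal %{...} in the value) produces the _dateparsefailure tag described above.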


(Magnus Bäck) #2

Please show the configuration of all your filters.


(Marcelo) #3

apache.conf:

input { stdin { }}

filter {
  if [type] == "log" {
    grok {
      match => { "message" => '^%{HOSTNAME:VirtualHost} %{IPV4:clientip} "%{NOTSPACE:balancer_worker_name}" %{NOTSPACE:remote_log_name} %{NOTSPACE:user} \[%{HTTPDATE:timestamp}\] "(?:((%{NOTSPACE:Method} %{NOTSPACE:request})|(%{WORD:Method} %{DATA:request} HTTP/%{NOTSPACE:httpversion})|(-)))" %{NOTSPACE:response} (?:%{NUMBER:ResponseSize:int}|-) %{QUOTEDSTRING:referrer} %{QUOTEDSTRING:agent} (?:%{NUMBER:TimeTaken:int}|-) (?:%{NUMBER:BytesReceived:int}|-) (?:%{NUMBER:BytesSents:int}|-)'}
    }
    mutate {
      gsub => [
        "BytesRecieved", "-", "0",
        "ResponseSize", "-", "0",
        "BytesSents", "-", "0",
        "TimeTaken", "-", "0"
      ]
      convert => {
        "BytesRecieved" => "integer"
        "ResponseSize" => "integer"
        "BytesSents" => "integer"
        "TimeTaken" => "integer"
      }
    }
    date {
      match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    }

    geoip {
      source => "clientip"
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

mysystem.conf:

input { stdin { }}

filter {
  if "PJE-PROFILER" in [tags] {
    grok {
      match => { "message" => '^%{DATA:uuid};%{WORD:request_status};%{WORD:method};%{URI:url};%{WORD:year};%{WORD:month};%{WORD:day};%{WORD:hour};%{WORD:minute};%{WORD:second};%{WORD:millisecond};(?:%{INT:Nao_sei});(?:%{INT:elapsed_time_request}|);%{IPV4:ip_solicitante};%{DATA:thread};%{DATA:session};%{DATA:session_id};%{DATA:user_id};%{DATA:user_name};%{DATA:profile};%{DATA:agent};'}
    }
    mutate {
      convert => {
        "elapsed_time_request" => "integer"
      }
      add_field => {
        "timestamp" => "%{year}-%{month}-%{day}T%{hour}:%{minute}:%{second}.%{millisecond}Z"
      }
    }
    date {
      match => [ "timestamp", "dd/MM/YYYY:HH:mm:ss Z" ]
    }
    geoip {
      source => "ip_solicitante"
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"]}
  stdout { codec => rubydebug}
}

(Magnus Bäck) #4

If the timestamp field really contains "2017-7-23T14:12:38.466Z" then "dd/MM/YYYY:HH:mm:ss Z" is obviously the wrong date pattern.
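For reference, a pattern that would match a value like "2017-7-23T14:12:38.466Z" (non-zero-padded month and day, literal T and Z) might look like the sketch below; the exact pattern depends on how the field is actually built:

```conf
date {
  # yyyy-M-d accepts one- or two-digit month/day when parsing;
  # 'T' and 'Z' are quoted literals, not format letters
  match    => [ "timestamp", "yyyy-M-d'T'HH:mm:ss.SSS'Z'" ]
  timezone => "UTC"  # assumption: the trailing literal Z means UTC
}
```

If the fields are zero-padded as in the original log line, the stricter "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" form would be the safer choice.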


(Marcelo) #5

Magnus,

I corrected the format of the timestamp and it works fine.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.