Drop milliseconds from timestamp

Hi, I am trying to drop the four millisecond digits from the timestamp, but I cannot make it work and it is giving me a _dateparsefailure error.

The part that produces the error is:

grok {
	match => [ "message", "%{TIMESTAMP_ISO8601:fechalog} %{LOGLEVEL:Severity} %{GREEDYDATA:Message}" ]
}
mutate {
	gsub => ["fechalog", "\.\d{4}$"]
}
date {
	match => ["fechalog", "YYYY-MM-dd HH:mm:ss" ]
	timezone => "UTC"
}
mutate {
	add_field => { "Ubicacion" => "%{[host][hostname]}-%{[log][file][path]}" }
}
mutate {
	remove_tag => ["beats_input_codec_plain_applied"]
}

An example log line:

2020-05-09 22:34:05.0880 ERROR DatosGruas.BOL.Trama.Leer 

I can see that the fechalog field is being filled with the right timestamp. I guess my mutate is not right and that is making my date filter fail?

Thanks in advance

That should produce an error message (logstash.agent.configuration.invalid_plugin_register), because gsub expects an array of field, pattern, and replacement triplets, and yours is missing the replacement. Try

mutate { gsub => ["fechalog", "\.\d{4}$", ""] }
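To see what that substitution does, the same regex can be exercised outside Logstash with sed (a sketch; sed -E uses [0-9] rather than \d, and the sample timestamp is taken from the post above):

```shell
# Strip the four trailing fractional-second digits, the same effect
# as the gsub pattern \.\d{4}$ with an empty replacement.
ts="2020-05-09 22:34:05.0880"
clean=$(printf '%s\n' "$ts" | sed -E 's/\.[0-9]{4}$//')
echo "$clean"   # 2020-05-09 22:34:05
```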

That works, thanks.

Anyway, I have another little issue...
My date filter takes the log time and puts it into @timestamp, but when I check the logs in Kibana they are two hours off.

I know that Elasticsearch works in UTC. Kibana has an option in the settings to change the timezone to Europe/Madrid, and I have also changed all the timezones to Europe/Madrid in the date filter, but the logs are still two hours behind...

Any idea why? What am I doing wrong?

filter {
	if "Datos_Gruas" in [tags] {
		grok {
			match => [ "message", "%{TIMESTAMP_ISO8601:fechalog} %{LOGLEVEL:Severity} %{GREEDYDATA:Message}" ]
		}
		mutate {
			gsub => ["fechalog", "\.\d{4}$", ""]
		}
		date {
			match => ["fechalog", "YYYY-MM-dd HH:mm:ss" ]
			timezone => "Europe/Madrid"
		}
		mutate {
			add_field => { "Ubicacion" => "%{[host][hostname]}-%{[log][file][path]}" }
		}
		mutate {
			remove_tag => ["beats_input_codec_plain_applied"]
		}
	} else if "MonitorizacionReefers" in [tags] {
		grok {
			match => [ "message", "%{TIME:Hora} - %{DATE_EU:Fecha} %{LOGLEVEL:Severity} %{GREEDYDATA:Message}" ]
		}
		mutate {
			add_field => { "logfecha" => "%{Fecha} %{Hora}" }
		}
		date {
			match => ["logfecha", "dd-MM-yyyy HH:mm:ss" ]
			timezone => "Europe/Madrid"
		}
		mutate {
			add_field => { "Ubicacion" => "%{[host][hostname]}-%{[log][file][path]}" }
		}
		mutate {
			remove_tag => ["beats_input_codec_plain_applied"]
		}
	} else if "Procesos2_SVC" in [tags] {
		grok {
			match => [ "message", "%{BASE10NUM:Fecha} %{TIME:Hora} - %{GREEDYDATA:Message}" ]
		}
		mutate {
			add_field => { "Ubicacion" => "%{[host][hostname]}-%{[log][file][path]}" }
			add_field => { "logtime" => "%{Fecha} %{Hora}" }
		}
		date {
			match => ["logtime", "yyyyMMdd HH:mm:ss" ]
			timezone => "Europe/Madrid"
		}
		mutate {
			remove_tag => ["beats_input_codec_plain_applied"]
		}
	} else if "Procesos_SVC" in [tags] {
		grok {
			match => [ "message", "%{BASE10NUM:Fecha} %{TIME:Hora} - %{GREEDYDATA:Message}" ]
		}
		mutate {
			add_field => { "Ubicacion" => "%{[host][hostname]}-%{[log][file][path]}" }
			add_field => { "logdate" => "%{Fecha} %{Hora}" }
		}
		date {
			match => ["logdate", "yyyyMMdd HH:mm:ss" ]
			timezone => "Europe/Madrid"
		}
		mutate {
			remove_tag => ["beats_input_codec_plain_applied"]
		}
	} else {
		mutate {
			add_field => { "Ubicacion" => "%{[host][hostname]}-%{[log][file][path]}" }
		}
		mutate {
			remove_tag => ["beats_input_codec_plain_applied"]
		}
	}
}

Thanks

Are you sure the timestamps in the log file are Europe/Madrid and not UTC? If you query elasticsearch directly, without letting kibana do the presentation, does the timestamp look right?
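One way to sanity-check the offset (a sketch, assuming GNU date is available): the date filter with timezone => "Europe/Madrid" interprets the parsed string as Madrid local time and stores @timestamp in UTC, so a May timestamp should shift back two hours (CEST is UTC+2).

```shell
# Interpret a Madrid-local timestamp and print it in UTC (GNU date syntax).
TZ=UTC date -d 'TZ="Europe/Madrid" 2020-05-09 22:34:05' '+%Y-%m-%dT%H:%M:%SZ'
# 2020-05-09T20:34:05Z  -- two hours earlier, which is what @timestamp should hold
```

If @timestamp in elasticsearch equals the raw log time instead of being two hours earlier, the source was already UTC.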

Hi Badger,
Yeah, I'm sure the timestamps in the logs are Europe/Madrid, and in Logstash the @timestamp changes to the actual log time, but in Kibana it moves two hours earlier.
How can I query Elasticsearch? I can see the index with GET /indexname but I don't know how to see a log from the Elasticsearch API.
Thanks for your help

Use a match_all query (it will return a limited subset of the documents). You could use the console in the kibana dev tools to send the search to elasticsearch, or connect directly to elasticsearch with curl.
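For example, from a shell (the host, port, and index name here are assumptions; adjust them to your cluster):

```shell
# match_all search returning a single document so the response stays small.
curl -s -H 'Content-Type: application/json' \
  'http://localhost:9200/srv-refcon-procesos-svc/_search' \
  -d '{"query": {"match_all": {}}, "size": 1}'
```

The @timestamp in the _source of the response is the raw value stored, before Kibana applies any display timezone.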

Hi Badger,

I tried the search in Elasticsearch and it looks like the timestamp is alright.

{
  "took" : 3468,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 10000,
      "relation" : "gte"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "srv-refcon-procesos-svc",
        "_type" : "_doc",
        "_id" : "zK_GDXIBcHQrn9J24IsS",
        "_score" : 1.0,
        "_source" : {
          "ecs" : {
            "version" : "1.4.0"
          },
          "message" : "20170603 00:25:33 - Ejecuto VERMAS_MANAGER",
          "host" : {
            "architecture" : "x86_64",
            "hostname" : "REFCON",
            "id" : "9be686f6-e89c-4e29-8e5d-685038fea9e8",
            "os" : {
              "version" : "6.3",
              "build" : "9600.19653",
              "family" : "windows",
              "kernel" : "6.3.9600.19",
              "platform" : "windows",
              "name" : "Windows Server"
            },
            "name" : "REFCON"
          },
          "Message" : "Ejecuto VERMAS_MANAGER",
          "Hora" : "00:25:33",
          "logdate" : "20170603 00:25:33",
          "@version" : "1",
          "tags" : [
            "Procesos_SVC",
            "SRV_REFCON"
          ],
          "Fecha" : "20170603",
          "Ubicacion" : """\Procesos_SVC\20170603.log""",
          "agent" : {
            "version" : "7.6.2",
            "hostname" : "REFCON",
            "ephemeral_id" : "ded8bc91-c177-4242-b233-14ff0738c881",
            "type" : "filebeat",
            "id" : "7f9fb557-1c4b-4be8-bee6-695aabdb628a"
          },
          "log" : {
            "offset" : 33555,
            "file" : {
              "path" : """C:\Procesos_SVC\20170603.log"""
            }
          },
          "@timestamp" : "2017-06-03T00:25:33.000Z",
          "input" : {
            "type" : "log"
          }
        }
      }
    ]
  }
}
As you can see, the @timestamp has the real log time, but when I look at the logs in Kibana they are two hours earlier.
No idea why?

Hi Badger,

I did manage to sort it out. Yeah, you were right: one of my logs was already using UTC, and that's why it was giving me that two-hour mismatch.
All sorted now and working!! Thank you for all your help in this topic and others before :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.