Elasticsearch / Logstash log time shift

I currently have a small problem and I don't know why it happens.

Here is a line from my log:

2023-05-09 09:20:11 [DEBUG] org.apache.activemq.transport.AbstractInactivityMonitor:150 -> WriteChecker: 10000ms elapsed since last write check.

Here is my grok pattern:

%{TIMESTAMP_ISO8601:time} \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:message_of_log}

Why is the value of the time field off by 2 hours on all my logs in Kibana, and how can I fix that?

If you do not tell Logstash what timezone a date is in, it will assume it is UTC. You can use the timezone option to tell the date filter what timezone the log is in.
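For example, a minimal sketch, assuming the field from your grok pattern is called time and your logs are written in Europe/Brussels local time:

date {
    # "time" was extracted by the grok pattern above
    match => ["time", "yyyy-MM-dd HH:mm:ss"]
    # interpret the parsed value as Brussels local time instead of UTC
    timezone => "Europe/Brussels"
    # the parsed date is written to @timestamp by default
}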

Elasticsearch always stores dates in UTC.

By default, Kibana converts them to the browser's timezone.


How can I tell Logstash the timezone?

Hello Jackie,

With the "timezone" option.

Don't forget the Fricadelle :wink:

How would you do it? Here is my logstash.conf:

if [type] == "localhost" {
		grok{
			match => {"message" => [
				"%{MONTHDAY:day_localhost}-%{MONTH:month_localhost}-%{YEAR:year_localhost} %{TIME:time_localhost} %{LOGLEVEL:log_level} %{GREEDYDATA:message_of_log}"
			]}
		}
		mutate {
			add_field => {"time" => "%{day_localhost}-%{month_localhost}-%{year_localhost} %{time_localhost}"}
		}
	}

I tried to do something like this, but I got an error:

mutate {
			add_field => {"time" => "%{day_localhost}-%{month_localhost}-%{year_localhost} %{time_localhost}"}
		}
date {
        timezone => "Europe/Brussels"
        target => "time"
    }
}

My date formats are defined in my index, since I have several of them.

Hello, I'll let you read the documentation: the date filter takes as input a field containing a "date" value, normalized according to an ISO-style notation.

Field value: "Apr 17 09:32:01", notation MMM dd HH:mm:ss. It's up to you to adapt your configuration.
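For example, a rough sketch for that field value (assuming the field is still named time):

date {
    # "Apr 17 09:32:01" matches the pattern MMM dd HH:mm:ss
    match => ["time", "MMM dd HH:mm:ss"]
    timezone => "Europe/Brussels"
}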

Take the time to read the documentation, everything is explained clearly!

I'm here if you need me.

I don't know, even when following the documentation I still have problems :confused:

		date{
			match => ["time", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss","dd-MMM-yyyy HH:mm:ss.SSS","yyyy-MM-dd HH:mm:ss,SSS"]
			target => "time"
			timezone => "Europe/Brussels"
		}

I get this error message in Logstash:

[2023-05-10T12:21:51,742][WARN ][logstash.outputs.elasticsearch][main][e2262610a62f7b06ac2d331304fb7378b3df4ecd82ad4661859fcf24ba8c2236] Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"test_index2", :routing=>nil}, {"log_level"=>"INFO", "type"=>"app", "message_of_log"=>"org.springframework.jdbc.datasource.init.ScriptUtils:502 -> Executed SQL script from class path resource [eures-basement-dao-support-commit.sql] in 13 ms.", "@version"=>"1", "@timestamp"=>2023-05-10T12:21:44.633442105Z, "message"=>"2023-01-18 11:58:23 [ INFO] org.springframework.jdbc.datasource.init.ScriptUtils:502 -> Executed SQL script from class path resource [basement-dao-support-commit.sql] in 13 ms.", "time"=>2023-01-18T10:58:23.000Z, "event"=>{"original"=>"2023-01-18 11:58:23 [ INFO] org.springframework.jdbc.datasource.init.ScriptUtils:502 -> Executed SQL script from class path resource [eures-basement-dao-support-commit.sql] in 13 ms."}, "host"=>{"name"=>"5f4dca62be5e"}, "log"=>{"file"=>{"path"=>"/var/log/all_logs/common/logs/frames-frontend.log.2023-01-18"}}}], response: {"index"=>{"_index"=>"test_index2", "_id"=>"ovucBYgBIeWTWwa_UrZx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [time] of type [date] in document with id 'ovucBYgBIeWTWwa_UrZx'. Preview of field's value: '2023-01-18T10:58:23.000Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2023-01-18T10:58:23.000Z] with format [yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss||dd-MMM-yyyy HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss,SSS]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}

Several different logs are collected at that location, so I have to provide several formats (which I did in the index), and I still have a problem.
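For what it's worth, the error above shows that the date filter has already converted time to an ISO 8601 value (2023-01-18T10:58:23.000Z), which the formats declared in the index mapping (yyyy-MM-dd HH:mm:ss.SSS||...) do not accept. One possible sketch, not a confirmed fix, is to let the filter write to a standard date field such as @timestamp instead of back into time:

date {
    match => ["time", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss", "dd-MMM-yyyy HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss,SSS"]
    timezone => "Europe/Brussels"
    # after parsing, the filter emits an ISO 8601 timestamp, so the target field's
    # mapping must accept that format; @timestamp does by default
    target => "@timestamp"
}

Alternatively, the time field's mapping could list an ISO format such as strict_date_optional_time alongside the raw log formats, so the parsed value is accepted as well.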
