Datetime from SQL to @timestamp in Logstash

(Anuar Mukatov) #7

Did you try without the filter?

(Giuseppe) #8

Yes, but I get the same error.

(Anuar Mukatov) #9

Can you paste your conf file?

(Anuar Mukatov) #10

Can you help me please! =)

(Eric Ohtake) #11

I assume your original field looks like this: 2018-01-29 22:16:59.537

The date filter takes your platform locale into consideration. If you want to see the events as they come in, you can use the timezone setting. And you don't need to set the target to @timestamp; that is the plugin's default.

filter {
  date {
    match => [ "message", "YYYY-MM-dd HH:mm:ss.SSS" ] # 2018-01-29 22:16:59.537
    timezone => "Etc/UTC"
  }
}

This is the output:

[2018-01-31T16:00:38,404][INFO ][logstash.agent] 
2018-01-29 22:16:59.537
       "message" => "2018-01-29 22:16:59.537",
    "@timestamp" => 2018-01-29T22:16:59.537Z

You have to decide whether to leave this config as is, depending on how and where you are visualizing your data. In my case, Kibana renders the timestamp on screen using the user's browser locale, so I never set the timezone in Logstash.

(Anuar Mukatov) #12

I use Graylog.

(Anuar Mukatov) #13

I checked the Logstash conf file with debug and saw that I have two fields with a time value: the first, datetime, is the time from the DB, and the second is @timestamp. Can I replace the @timestamp value with the datetime value?

(Eric Ohtake) #14

I didn't quite understand your question. Isn't that what we have been doing in the messages above?
All the examples have already been given. What exactly is the problem?

When you use the date filter plugin, you have to give it the field where your date is, so it can parse it. Logstash will automatically put the parsed date in the @timestamp field. You can use target if you want to send it to another field, though; that is optional.

After you have done that, if you don't need your "datetime" field anymore (because you sent it, parsed, to the @timestamp field), you can remove it.
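Putting those pieces together, a minimal sketch of that filter (assuming the source field is named "datetime" and arrives in ISO8601 format; adjust the match pattern to your data):

filter {
  date {
    match => [ "datetime", "ISO8601" ]  # parse the source field
    target => "@timestamp"              # optional; this is the plugin default
    remove_field => [ "datetime" ]      # drop the original once it is parsed
  }
}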

(Anuar Mukatov) #15

I did what you wrote before, but there are no changes in @timestamp. Can you please write exactly what I must write, and how?

(Anuar Mukatov) #16

This is my debug example:

"sourcenameru" => "Задачи",
"loginname" => "OrderPointStatement",
"refrvsproperties" => nil,
"refmessagesourcetype" => 542,
"message" => "Действие",
"positionnameru" => nil,
"content" => "Start ParagraphExecutor.Execute",
"datetime" => 2014-12-31T18:00:00.463Z,
"@timestamp" => 2018-02-01T06:26:05.376Z,
"refposition" => nil,
"clientipaddress" => "",
"refrecordcard" => nil,
"@version" => "1",
"id" => 14232235,
"refarchive" => 50,
"refsid" => 3726,
"categorynameru" => "Информация",
"typenameru" => "Действие",
"fio_ru" => nil

And I need to replace @timestamp with datetime.

(Vaidehi9039) #17


I am also facing the same issue. I am unable to set the @timestamp value to my WSO2 server timestamp.

Below is my logstash conf file.

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => [ "message", "TID:%{SPACE}[%{INT:tenant_id}]%{SPACE}[]%{SPACE}[%{TIMESTAMP_ISO8601:event_timestamp}]%{SPACE}%{LOGLEVEL:level}%{SPACE}{%{JAVACLASS:java_class}}%{SPACE}-%{SPACE}%{GREEDYDATA:log_message}" ]
  }

  date {
    match => [ "message", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
    timezone => "Etc/UTC"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

In kibana:

@timestamp February 1st 2018, 15:39:18.937
@version 1
_id 9xvXMFEB-1-QqhW1ljvK
_index filebeat-2018.02.01
_score -
_type doc
event_timestamp 2018-02-01 15:39:16,016
java_class org.wso2.carbon.core.init.CarbonServerManager
level INFO
log_message Halting JVM {org.wso2.carbon.core.init.CarbonServerManager}
message TID: [-1254] [] [2018-02-01 15:39:16,016] INFO {org.wso2.carbon.core.init.CarbonServerManager} - Halting JVM {org.wso2.carbon.core.init.CarbonServerManager}
prospector.type log
source C:\Softwares\WSO2\wso2ei-6.1.0\wso2ei-6.1.0\repository\logs\wso2carbon.log
tags beats_input_codec_plain_applied, _dateparsefailure
tenant_id -1234

Please help me.

Even this date filter is not working:

date {
  match => [ "event_timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
  target => "@timestamp"
  add_field => { "debug" => "timestampMatched" }
}

(Eric Ohtake) #18

Anuar, always post examples of your data as you did now. Much easier to try to help. :grinning:

Your datetime field is in the "ISO8601" format.
Try this:

filter {
  date {
    match => [ "datetime", "ISO8601" ]
    timezone => "Etc/UTC"
  }
}

(Eric Ohtake) #19

Would you mind opening a new thread for your issue? It is also important to post a sample of the data you are trying to process in Logstash. And check the answer for Anuar; it might work for you too. If not, please open a new thread.

(Anuar Mukatov) #20

How can I change the example to GMT-6?

(Anuar Mukatov) #21

I tried it, but nothing changed.

(Anuar Mukatov) #22

My output:


(Eric Ohtake) #23

Anuar, from this point I don't have any more suggestions for you besides asking you to analyze your debug log carefully and read the docs once again. Also, what always helps me is to start a fresh config file, use the stdin input plugin with no filters to start with, output to stdout, and paste only the data you are trying to parse into the console. From there, build the filter configuration piece by piece.

It will make it easier to pinpoint the problem. Also reread all this topic, I think you are missing something. Good luck!
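That debugging setup might look like this minimal sketch (the rubydebug codec is just one readable choice for inspecting events):

input {
  stdin { }  # paste sample lines directly into the console
}

# no filters yet; add them back one at a time

output {
  stdout { codec => rubydebug }  # print each event with all of its fields
}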

(Anuar Mukatov) #24

OK, but one last question: maybe it is because I use the jdbc input plugin?

(Anuar Mukatov) #25

Hello, I found out how to do it. :)
It turns out Logstash does not understand that datetime is a date value.
You need to convert this field to a string first:

filter {
  mutate {
    add_field => { "message" => "%{typenameru}" }
    convert => [ "datetime", "string" ]
  }
  date {
    timezone => "Etc/GMT-6"
    match => [ "datetime", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
    remove_field => [ "datetime", "timestamp" ]
  }
}

(system) #26

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.