Datetime from SQL to @timestamp in Logstash

Hello everybody!
Help me, please, with @timestamp. I have a jdbc input from an MSSQL server, and my events include a datetime field that holds the creation time from the table. How can I use it instead of @timestamp?
This is my conf file:

input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/drivers/sqljdbc42.jar"
    jdbc_driver_class => ""
    jdbc_connection_string => "jdbc:sqlserver://server:1433;databasename=db"
    jdbc_user => "login"
    jdbc_password => "pass"
    statement => "SELECT m.*
        , mc.nameRu AS CategoryNameRu
        , mt.nameRu AS TypeNameRu
        , ms.nameRu AS SourceNameRu
        , p.Fio_Ru
        , s.RowName AS LoginName
        , pos.FullNameRu AS PositionNameRu
      FROM LOG_Messages m
      JOIN DIC_LOG_MESSAGE_SOURCE_TO_TYPE mst ON = m.refMessageSourceType
      JOIN DIC_LOG_MESSAGE_CATEGORY mc ON = mst.refMessageCategory
      JOIN DIC_LOG_MESSAGE_TYPE mt ON = mst.refMessageType
      JOIN DIC_LOG_MESSAGE_SOURCE ms ON = mst.refMessageSource
      LEFT JOIN ULS_Persons p ON = m.refRecordCard
      LEFT JOIN LOG_SidIdentification s ON = m.refSid
      LEFT JOIN ULS_SubdivisionPositions pos ON = m.refPosition"
  }
}

filter {
  mutate {
    add_field => { "message" => "%{typenameru}" }
    convert => [ "datetime", "string" ]
    copy => { "datetime" => "@timestamp" }
  }
}

output {
  gelf {
    host => ""
    port => 12231
    short_message => 'short_message'
  }
  stdout { codec => rubydebug }
}

You can use the date plugin inside your filter block. When you use it to parse your custom date field, it automatically sets the @timestamp field on your events. You don't need to add, convert, or copy your original field as in the example you posted.
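A minimal sketch, assuming your field is called datetime and looks like 2018-01-29 22:16:59.537 (adjust the pattern to your actual data):

```
filter {
  date {
    # Parse the jdbc "datetime" field; on success the parsed value
    # becomes @timestamp automatically (target defaults to @timestamp).
    match => [ "datetime", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}
```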

Can you please show me how to use this plugin in my case?

Try the documentation; it has lots of examples, and it is a really easy plugin to use.

Try it out and let me know if you have problems.

I wrote this:

    date {
     match => [ "datetime", "YYYY-MM-dd HH:mm:ss.SSS" ]
     target => "@timestamp"
    }

but in the output:
datetime - 2018-01-29T22:16:59.537Z
@timestamp - 2018-01-30T10:47:36.247Z


Hi Anuar,
I have the same problem with Cassandra. This is the query in my statement:
statement => "select * from mydatabase.products_to_elastic WHERE lastmodified > :sql_last_value"
and I get this error:

[2018-01-30T12:01:00,702][ERROR][logstash.inputs.jdbc ] Java::ComDatastaxDriverCoreExceptions::InvalidQueryException: Unable to coerce '2018-01-30 11:59:02.709784' to a formatted date (long): select * from mydatabase.products_to_elastic WHERE lastmodified > '2018-01-30 11:59:02.709784' ALLOW FILTERING

I used this filter:
filter {
  date {
    match => [ "lastmodified", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

but I don't understand whether the filter applies to the input query or only to the data in the output.

Did you try it without the filter?

Yes, but the same error.

Can you paste your conf file?

Can you help me please! =)

I assume your original field looks like this: 2018-01-29 22:16:59.537

The date filter takes your platform's locale into consideration. If you want to see the events exactly as they come in, you can use the timezone setting. And you don't need to set target to @timestamp; that is the plugin's default.

filter {
  date {
     match => [ "message", "YYYY-MM-dd HH:mm:ss.SSS" ] # 2018-01-29 22:16:59.537
     timezone => "Etc/UTC"
  }
}

This is the output:

[2018-01-31T16:00:38,404][INFO ][logstash.agent] 
2018-01-29 22:16:59.537
       "message" => "2018-01-29 22:16:59.537",
    "@timestamp" => 2018-01-29T22:16:59.537Z

You have to decide whether to leave this config as is, depending on how and where you are visualizing your data. For me, Kibana renders the timestamp on screen using the user's browser locale, so I never set the timezone in Logstash.

I use Graylog.

I checked my Logstash conf file with debug output and saw that I have two time fields: datetime, which is the time from the DB, and @timestamp. Can I replace the @timestamp value with the datetime value?

I didn't quite understand your question. Isn't that what we have been doing in the messages above?
All the examples have already been given. What exactly is the problem?

When you use the date filter plugin you have to tell it which field holds your date, so it can be parsed. Logstash will automatically put the parsed date into the @timestamp field. You can use target if you want to send it to another field, though; that is optional.

After you have done that, if you don't need your "datetime" field anymore (because you sent it, parsed, to the @timestamp field), you can remove it.
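Putting those two steps together, a sketch (the field name and the ISO8601 format are assumed from the earlier posts):

```
filter {
  date {
    # "datetime" arrives from jdbc already serialized as ISO8601
    match => [ "datetime", "ISO8601" ]
    # common options like remove_field run only when the match
    # succeeds, so the original field is dropped after parsing
    remove_field => [ "datetime" ]
  }
}
```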

What you wrote before, I did, but there was no change in @timestamp. Can you please write exactly what I must do, and how?

This is my debug example:

"sourcenameru" => "Задачи",
"loginname" => "OrderPointStatement",
"refrvsproperties" => nil,
"refmessagesourcetype" => 542,
"message" => "Действие",
"positionnameru" => nil,
"content" => "Start ParagraphExecutor.Execute",
"datetime" => 2014-12-31T18:00:00.463Z,
"@timestamp" => 2018-02-01T06:26:05.376Z,
"refposition" => nil,
"clientipaddress" => "",
"refrecordcard" => nil,
"@version" => "1",
"id" => 14232235,
"refarchive" => 50,
"refsid" => 3726,
"categorynameru" => "Информация",
"typenameru" => "Действие",
"fio_ru" => nil

And I need to replace @timestamp with datetime.


I am also facing the same issue. I am unable to set the @timestamp value to my WSO2 server's timestamp.

Below is my logstash conf file.

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => [ "message", "TID:%{SPACE}[%{INT:tenant_id}]%{SPACE}[]%{SPACE}[%{TIMESTAMP_ISO8601:event_timestamp}]%{SPACE}%{LOGLEVEL:level}%{SPACE}{%{JAVACLASS:java_class}}%{SPACE}-%{SPACE}%{GREEDYDATA:log_message}" ]
  }

  date {
    match => [ "message", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
    timezone => "Etc/UTC"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

In Kibana:

@timestamp       February 1st 2018, 15:39:18.937
@version         1
_id              9xvXMFEB-1-QqhW1ljvK
_index           filebeat-2018.02.01
_score           -
_type            doc
event_timestamp  2018-02-01 15:39:16,016
java_class       org.wso2.carbon.core.init.CarbonServerManager
level            INFO
log_message      Halting JVM {org.wso2.carbon.core.init.CarbonServerManager}
message          TID: [-1254] [] [2018-02-01 15:39:16,016] INFO {org.wso2.carbon.core.init.CarbonServerManager} - Halting JVM {org.wso2.carbon.core.init.CarbonServerManager}
prospector.type  log
source           C:\Softwares\WSO2\wso2ei-6.1.0\wso2ei-6.1.0\repository\logs\wso2carbon.log
tags             beats_input_codec_plain_applied, _dateparsefailure
tenant_id        -1234

Please help me.

This date filter is not working either:

date {
  match => [ "event_timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
  target => "@timestamp"
  add_field => { "debug" => "timestampMatched" }
}
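A guess from the Kibana output above: event_timestamp is rendered as 2018-02-01 15:39:16,016, space-separated and with a comma before the milliseconds, so a pattern without the 'T' literal may be what is needed. A sketch, not tested against that setup:

```
filter {
  date {
    # matches e.g. "2018-02-01 15:39:16,016" (comma millisecond separator)
    match => [ "event_timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}
```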

Anuar, always post examples of your data as you did now; it is much easier to help. :grinning:

Your datetime field is in the ISO8601 format.
Try this:

filter {
  date {
     match => [ "datetime", "ISO8601" ]
     timezone => "Etc/UTC"
  }
}

Would you mind opening a new thread for your issue? It is important to also post a sample of the data you are trying to process in Logstash. And check the answer for Anuar; it might work for you too. If not, please open a new thread.

How can I change the example to GMT-6?
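For what it's worth: the timezone option takes a canonical TZ database name, and it tells the filter what zone the source string is in (it is ignored if the string itself carries an offset, like a trailing Z). Note that the Etc/GMT zones have inverted signs, so a fixed GMT-6 offset is written Etc/GMT+6; a location-based zone such as America/Chicago would also observe DST. A sketch:

```
filter {
  date {
    match => [ "datetime", "yyyy-MM-dd HH:mm:ss.SSS" ]
    # fixed UTC-6 offset; note the inverted sign in the Etc/ zone names
    timezone => "Etc/GMT+6"
  }
}
```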