Using 7.1: Can't set @timestamp from message

I have a simple Logstash definition file, shown below.
I can parse my message date properly, with no date parse error shown.
I cannot set @timestamp from the ObsDate in my input stream.
As well, the rubydebug output shows all my CSV field names in lower case, which is not the case in the SQL DB.
I have even tried removing the target setting in the date filter, since the documentation states it is not required and @timestamp is the default when not specified.

All output events are essentially the same as:
{
"avgnetworktraffickbsec" => 0.0,
"@timestamp" => 2019-06-04T14:07:17.536Z,
"obsdate" => 2019-04-30T04:00:00.000Z,
"totalcpuloadmips" => 118.41,
"obshour" => 17,
"@version" => "1"
}

This config file is used to parse the SQL data extracted from the DB entry CB_EZ14A:

input {
    jdbc {
        jdbc_driver_library => "/home/pxg110/sqljdbc_4.2/sqljdbc42.jar"
        jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        jdbc_user => "XXXXXXXXXXX"
        jdbc_password => "XXXXXXXXXXXXX"
        jdbc_connection_string => "jdbc:sqlserver://SD01CUVDB0521.OMEGA.DCE-EIR.NET:1433;"
        statement => "SELECT ObsDate,ObsHour,TotalCPULoadMIPS,AvgNetworkTrafficKBsec FROM smg.dbo.smgdata WHERE (TransClass='CB_EZ14A') AND (ObsDate >= CONVERT(DATETIME, '2019-04-01', 102)) AND (ObsDate <= CONVERT(DATETIME, '2019-04-30', 102)) ORDER BY ObsDate;"
    }
}
filter {
    fingerprint {
        source => "message"
        target => "[@metadata][fingerprint]"
        method => "SHA1"
        key => "Tue_Jun_2019_08_25_CB_EZ14A"
        base64encode => true
    }

    # defines all the fields to be found in the csv file

    csv {
            separator => " "
            columns =>    [
                    "ObsDate",
                    "ObsHour",
                    "TotalCPULoadMIPS",
                    "AvgNetworkTrafficKBsec"
            ]
            convert =>    {
                    "ObsDate"  =>  "date"
                    "ObsHour"  =>  "integer"
                    "TotalCPULoadMIPS"  =>  "float"
                    "AvgNetworkTrafficKBsec"  =>  "float"
            }
    }
    date {
            match => [ "ObsDate", "ISO8601"]
            target => "@timestamp"
   }
    mutate {
            remove_field => [ "ObsDate", "ObsHour"]
    }

}

output {
    elasticsearch {
        action => "index"
        hosts => "localhost:9200"
        document_id => "%{[@metadata][fingerprint]}"
        index => "Tue_Jun_2019_08_25_CB_EZ14A"
    }
    stdout { codec => rubydebug }
}

Note that the value of obsdate does not have quotes around it, so it is not a string; it was already converted to a LogStash::Timestamp by the jdbc input. A date filter cannot parse that. This is a known issue, and the workaround is to mutate+convert the field to a string.
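
A minimal sketch of that workaround, assuming the jdbc input has already lowercased the column to obsdate (as the rubydebug output above shows):

    filter {
        # The jdbc input hands obsdate over as a LogStash::Timestamp object;
        # convert it back to a string so the date filter can parse it.
        mutate {
            convert => { "obsdate" => "string" }
        }
        # The string form is ISO8601, so the date filter can now set @timestamp.
        date {
            match => [ "obsdate", "ISO8601" ]
            target => "@timestamp"
        }
    }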

Badger, thanks for the hint.
I have made the following minor mod based on the "known issue" workaround, as shown below.

   mutate {
            convert => { "obsdate" => "string" }
    }

Note: it only worked when I renamed my SQL field names to lower case, e.g. ObsDate => obsdate.
I suspect this is a jdbc side effect.
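
That lowercasing does come from the jdbc input: its lowercase_column_names option defaults to true, so ObsDate arrives as obsdate. A minimal sketch of keeping the original case instead, with the connection settings elided:

    input {
        jdbc {
            # ... connection settings as above ...
            # Keep column names exactly as the DB returns them,
            # e.g. ObsDate instead of obsdate.
            lowercase_column_names => false
        }
    }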
