Issue with @timestamp field when importing data from SQL Server to Elasticsearch using Logstash


I’m currently trying to import log data from a SQL Server database table into Elasticsearch using Logstash. The SQL table includes a field named “logDateTime” for the timestamp of the log entries. I want to use this field as the “@timestamp” value in Elasticsearch.

Despite configuring the Logstash pipeline, the “@timestamp” field is not being updated correctly. I’ve tried various solutions, including adjusting the date format and using the mutate filter, but the issue persists.

Here’s my current pipeline configuration:

input {
  tcp {
    port => 5000
    codec => json
  }
  beats {
    port => 5044
  }
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mssql-jdbc-12.6.1.jre11.jar"
    jdbc_driver_class => ""
    jdbc_connection_string => "jdbc:sqlserver://devwsql1;database=RPSTest;encrypt=false"
    jdbc_user => "logstash"
    jdbc_password => "PASSWORD"
    statement => "SELECT * FROM log"
    type => "jdbc"
  }
}

filter {
  mutate {
    convert => { "logDateTime" => "string" }
    copy => { "logDateTime" => "new_logDateTime" }
  }
  date {
    match => ["new_logDateTime", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
  }
}

output {
  if "_dateparsefailure" in [tags] {
    stdout {
      codec => rubydebug
    }
  }
  if [type] == "jdbc" {
    elasticsearch {
      hosts => ["http://prodxcontainer2:9200"]
      user => "elastic"
      password => "PASSWORD"
      index => "devwsql1-logs"
    }
  } else {
    elasticsearch {
      hosts => ["http://prodxcontainer2:9200"]
      user => "elastic"
      password => "PASSWORD"
    }
  }
}

The logDateTime values in my SQL Server database are in the format “yyyy-MM-dd HH:mm:ss.SSS”. I’m using Logstash version 8.13.0 and Microsoft SQL Server JDBC Driver version 12.6.1 with Java Runtime Environment (JRE) 11.

Any guidance on how to resolve this issue would be greatly appreciated.

Thank you!

Hi @hcherry, welcome to the community.

You will need to show us sample documents with the time fields etc., and the errors (redact sensitive information). The more complete you can be, the better...

What do you see in the rubydebug output when you get a _dateparsefailure?

Are you sure the date format is correct? If the column type in the DB is a date/datetime type, then I would expect the jdbc input to auto-convert it to a LogStash::Timestamp object; the mutate convert will then create a string in ISO8601 format (yyyy-MM-dd'T'HH:mm:ss.SSSZ), so the date filter will fail.

We really need to see an object from the rubydebug output.
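To illustrate the suspected mismatch, here is a quick Python sketch (using a representative ISO8601 value, not the actual data) showing that an ISO8601 string will not match the `yyyy-MM-dd HH:mm:ss.SSS` pattern the original date filter uses:

```python
from datetime import datetime

# Representative value, as Logstash would stringify a LogStash::Timestamp (ISO8601).
iso_value = "2024-04-05T10:25:07.000Z"

# Equivalent of the original date filter pattern "yyyy-MM-dd HH:mm:ss.SSS":
try:
    datetime.strptime(iso_value, "%Y-%m-%d %H:%M:%S.%f")
    matched = True
except ValueError:
    matched = False
print(matched)  # False: the 'T' separator and trailing 'Z' break the match

# The ISO8601-style pattern "yyyy-MM-dd'T'HH:mm:ss.SSSZ" does match:
parsed = datetime.strptime(iso_value, "%Y-%m-%dT%H:%M:%S.%f%z")
print(parsed.year, parsed.tzinfo)  # 2024 UTC
```

This is the same failure mode the date filter would hit: the pattern is checked against the stringified value, not against what the column looked like in SQL Server.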

Thanks for the welcome! Here is a sample document:

{
  "_index": "devwsql1-logs",
  "_id": "yXIPXo8BpMPui0NQhu28",
  "_version": 1,
  "_seq_no": 754932,
  "_primary_term": 3,
  "found": true,
  "_source": {
    "logtype": 0,
    "logemail": "",
    "logdelaysend": 0,
    "logheaders": null,
    "logemail2": null,
    "type": "jdbc",
    "logrid": 95295463,
    "logdatetime": "2024-04-05T10:25:07.000Z",
    "loguser": null,
    "@version": "1",
    "logrefid": 0,
    "logcategory": null,
    "logobject": null,
    "logdescription": "$BuildList 2\r\n\r\n object = , method = $timer, linetext = Do method $MainProcedure, line = 3\r\n object = , method = $MainProcedure, linetext = Do iEventObj.$TriggerAlert(18,kFalse), line = 43\r\n object = , method = $TriggerAlert, linetext = Do $cinst.$BuildList(con('where eveEventID = ',pEventID),,,iTables), line = 6\r\n object = , method = $BuildList, linetext = Calculate lList as sys(192), line = 46\r\n\r\nUserName = Middleware Dev\r\nCompany = ReleasePoint\r\n$nativeerrorcode = 207\r\n$nativeerrortext = [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Invalid column name 'aleCarrier'.\r\n$errorcode = kStatementNoResults\r\n$errortext = There are no results to process\r\nFetch Result = 2\r\nTable was Events\r\nTop Window = W_MWProcess\r\nUser = SYS\r\nWhere = where eveEventID = 18\r\nList Count = 0\r\nStatement = set transaction isolation level read uncommitted SELECT Alerts.aleRID,Alerts.aleName,Alerts.aleEMail,Alerts.aleSMS,Alerts.alePhone,Alerts.aleFromTime,Alerts.aleToTime,Alerts.aleEmpCode,Alerts.aleCarrier,Events.eveRID,Events.eveAlertRID,Events.eveLevel1,Events.eveLevel2,Events.eveLevel3,Events.eveLevel1Delay,Events.eveLevel2Delay,Events.eveLevel3Delay,Events.eveTriggerTime,Events.eveActiveReset,Events.eveEventID,Events.eveLevel1Trig,Events.eveLevel2Trig,Events.eveLevel3Trig,Events.eveExtraMsg,Events.eveLockOut,Events.eveRepeat1,Events.eveRepeat2,Events.eveRepeat3,Events.eveRepTime1,Events.eveRepTime2,Events.eveRepTime3,Events.eveMessage,Events.eveSuspend,Events.eveMachine,Events.eveType,Events.eveMonFolder,Events.eveMonFilter,Events.eveMonMinutes,EventText.evtRID,EventText.evtEvent,EventText.evtMsg,EventText.evtResolution,EventText.evtThresholdAmt,EventText.evtDelay,EventText.evtExecute,EventText.evtSuspend FROM Events join Alerts on eveAlertRID = aleRID join EventText on eveEventID = evtRID where eveEventID = 18\r\nAfter Relogon, $PerformSQL returned 0\r\nAfter Relogon, $PerformSQL returned 0\r\n$BuildList $PerformSQL 
fail, $nativeerrorcode = 207\r\n$BuildList $PerformSQL fail, $nativeerrortext = [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Invalid column name 'aleCarrier'.\r\n$BuildList $PerformSQL fail, $errorcode = kStatementExecDirectFailed\r\n$BuildList $PerformSQL fail, $errortext = The statement could not be executed directly\r\nList Count = 0\r\n",
    "@timestamp": "2024-05-09T15:53:46.583269619Z",
    "logseverity": 1,
    "logheading": "$BuildList Returned Error after Fetch",
    "logmethod": null
  }
}

I do not get anything from the rubydebug output - no errors appear.
I believe the date format is correct. Here is a sample value from my SQL table:

2024-04-05 10:10:34.000

Thanks for the help!

What are you using to display that and what is the column type?

Also, do you understand that field names are case sensitive, and that the jdbc input will fold column names to lowercase? It is possible that your mutate and date filters are no-ops.

Looking at [logdatetime], it does not have the format that you think it does.
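A tiny sketch of why the filters would silently do nothing (the event shape here is assumed, mirroring the sample document above):

```python
# Assumed event shape: the jdbc input has lowercased the column name.
event = {"logdatetime": "2024-04-05T10:25:07.000Z"}

# The original filters reference the mixed-case name "logDateTime":
value = event.get("logDateTime")
print(value)  # None: the lookup misses, so mutate/date have nothing to act on

# The lowercase name is what actually exists on the event:
print(event.get("logdatetime"))  # 2024-04-05T10:25:07.000Z
```

Logstash field references behave the same way: `[logDateTime]` and `[logdatetime]` are different fields, and a filter pointed at a non-existent field is a no-op rather than an error.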


Perhaps Try

date {
    match => ["logdatetime", "yyyy-MM-dd'T'HH:mm:ss.SSSZ"]
    target => "@timestamp"
}

Fixed lowercase...

Thank you very much! It was the casing. I was not aware the jdbc input would force the field names to be all lowercase.