Hello,
I’m currently trying to import log data from a SQL Server database table into Elasticsearch using Logstash. The table includes a “logDateTime” column that holds the timestamp of each log entry, and I want to use this column as the “@timestamp” value in Elasticsearch.
Despite configuring the Logstash pipeline as shown below, the “@timestamp” field is never set from “logDateTime”. I’ve tried several variations, including adjusting the date pattern and converting the field with the mutate filter, but the issue persists.
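For example, one of the variants I tried applied the date filter directly to the original field, with ISO8601 as a fallback pattern (roughly; I’ve tried a few permutations along these lines):
filter {
  date {
    # Try the format my SQL data uses first, then ISO8601 as a fallback guess
    match => ["logDateTime", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
    target => "@timestamp"
  }
}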
Here’s my current pipeline configuration:
input {
  tcp {
    port => 5000
    codec => json
  }
  beats {
    port => 5044
  }
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mssql-jdbc-12.6.1.jre11.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://devwsql1;database=RPSTest;encrypt=false"
    jdbc_user => "logstash"
    jdbc_password => "PASSWORD"
    statement => "SELECT * FROM log"
    type => "jdbc"
  }
}
filter {
  mutate {
    # Convert the JDBC timestamp to a string, then work on a copy of it
    convert => { "logDateTime" => "string" }
    copy => { "logDateTime" => "new_logDateTime" }
  }
  date {
    # Parse the copied string into @timestamp
    match => ["new_logDateTime", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
  }
}
output {
  if "_dateparsefailure" in [tags] {
    stdout {
      codec => rubydebug
    }
  }
  if [type] == "jdbc" {
    elasticsearch {
      hosts => ["http://prodxcontainer2:9200"]
      user => "elastic"
      password => "PASSWORD"
      index => "devwsql1-logs"
    }
  } else {
    elasticsearch {
      hosts => ["http://prodxcontainer2:9200"]
      user => "elastic"
      password => "PASSWORD"
    }
  }
}
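If it would help, I can temporarily add an unconditional stdout output so the exact shape of the event coming out of the jdbc input (including the raw “logDateTime” value and the @metadata fields) is visible; a sketch of what I would add:
output {
  # Temporary debug output: dump every event, including @metadata,
  # to inspect the raw logDateTime value produced by the jdbc input
  stdout {
    codec => rubydebug { metadata => true }
  }
}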
The logDateTime values in my SQL Server database are in the format “yyyy-MM-dd HH:mm:ss.SSS”. I’m using Logstash version 8.13.0 and Microsoft SQL Server JDBC Driver version 12.6.1 with Java Runtime Environment (JRE) 11.
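For reference, a typical “logDateTime” value looks like this (an illustrative value; my real data follows the same pattern):
2024-04-09 13:27:45.123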
Any guidance on how to resolve this issue would be greatly appreciated.
Thank you!