Hi there,
I've set up a JDBC input plugin to read from my SQL Server ErrorLog table, with the output going to an ES index.
No particular mapping is needed; the only requirement is to use the LogDate column as the event's @timestamp.
I've tried the date filter, but it constantly fails with a _dateparsefailure tag, and none of the several patterns I've tried seem to work. Reading around, I noticed that the LogDate field is already a date (which may be why the date filter can't parse it). I then tried a mutate copy to put the logdate value into the @timestamp field, but no luck either: @timestamp is always the ingestion date.
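For reference, my date filter attempt looked roughly like this (the patterns shown are just two of the several variants I tried, so treat them as illustrative):

```
filter {
  date {
    # logdate comes back from the jdbc input; these patterns are examples of what I tried
    match => [ "logdate", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}
```

Every variant tagged the events with _dateparsefailure.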
Here's my conf:
input {
  jdbc {
    type => "errors"
    jdbc_driver_library => "/opt/bitnami/logstash/drivers/sqljdbc_6.0/enu/jre8/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://myServer:1433;databaseName=myDB"
    jdbc_user => "myUsername"
    jdbc_password => "myPassword"
    statement => "select * from errorlog with(nolock)"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "5000"
    schedule => "* * * * *"
  }
}
filter {
  mutate {
    copy => { "{%logdate}" => "@timestamp" }
    remove_field => [ "logdate" ]
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "errors-%{+YYYY.MM.dd}"
    document_id => "%{logid}"
  }
}
Any help?