Hi.
I'm using the Logstash JDBC input to pull data from a MySQL database. So far so good. This is part of the configuration file:
input {
  jdbc {
    jdbc_driver_library => "/usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://xx.xx.xx.xx:3306/asterisk"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * *"
    statement => "SELECT * FROM cdr where cdr_id > :sql_last_value"
    use_column_value => true
    tracking_column => "cdr_id"
  }
}
But I'm having some problems trying to parse a date field called "calldate" and change its timezone.
The calldate field looks something like this in the mysql database:
MariaDB [asterisk]> select calldate,cdr_id from cdr order by cdr_id desc limit 1;
+---------------------+--------+
| calldate            | cdr_id |
+---------------------+--------+
| 2021-07-23 14:17:38 |   1852 |
+---------------------+--------+
1 row in set (0.001 sec)
MariaDB [asterisk]>
It seems to be in UTC, because the local time when I made the call was 10:17:38 AM (I'm based in Chile, which is UTC-4).
So, the first goal is to catch the calldate field, parse it, and change the timezone.
I have tried:
filter {
  date {
    match => [ "calldate" , "yyyy-MM-dd HH:mm:ss" ]
    match => [ "calldate" , "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
    match => [ "calldate", "ISO8601" ]
    match => [ "calldate", "MMM dd, yyyy @ HH:mm:ss.SSS" ]
    timezone => "America/Santiago"
    target => "new_calldate"
  }
}
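One thing I'm unsure about: since I'm repeating the match setting several times, maybe only one of those lines actually takes effect. As far as I understand, the date filter can take several patterns in a single match array (the field name first, then the patterns), so perhaps it should look more like this sketch (the patterns here are just my guesses at what the field might match, not something I've confirmed works):

filter {
  date {
    # assumption: listing all candidate patterns in one match array for the calldate field
    match => [ "calldate", "yyyy-MM-dd HH:mm:ss", "ISO8601" ]
    timezone => "America/Santiago"
    target => "new_calldate"
  }
}

But even written that way, I'm not certain the timezone handling is correct.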
But when I run the service I always get this log from Logstash:
Jul 23 10:19:01 reverse logstash: "tags" => [
Jul 23 10:19:01 reverse logstash: [0] "_dateparsefailure"
Jul 23 10:19:01 reverse logstash: ]
Jul 23 10:19:01 reverse logstash: }
And the calldate field still has the original timezone in Elasticsearch.
What am I doing wrong when parsing the calldate field? And is that the correct way to change the timezone for new_calldate?
Any help would be much appreciated!
Regards