Logstash jdbc config file with filter


I'm new to the Logstash JDBC plugin, and I managed to set up a migration of data from a MySQL database to Elasticsearch via Logstash using this configuration:

input {
  jdbc {
    jdbc_driver_library => "/usr/share/java/mysql-connector-java-5.1.45.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * * *"
  }
}

output {
  elasticsearch {
    document_id => "%{id}"
    document_type => "doc"
    index => "test2"
    hosts => [""]
  }
  stdout {
    codec => rubydebug
  }
}

It shows me in Kibana which user is currently active, but it changes the content of the stored data, because each log message for a given user has the same id number. You can see it in the pictures:

So until a particular user logs out, the record of which database that user is using keeps getting overwritten under the same id, which makes it impossible to search for which database the user was using in the past.

How can I set up Logstash to store each record with a different id (or just a timestamp), so that it would be possible to see the previous records?

I tried to explain it as well as I could, so if something is unclear just let me know and I'll try to clarify :smiley:

Thanks for any advice!!

You can create your own document_id:

filter {
  mutate { add_field => { "doc_id" => "%{id}-%{user}-%{@timestamp}" } }
}

output {
  elasticsearch {
    document_id => "%{doc_id}"
  }
}

This will create a new id of the form 4-root-<timestamp>, so it will not overwrite the old record.
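Putting this together with the original config, the filter and output sections might look like the sketch below. The index, document type, and field names follow the posts above; the hosts value is a placeholder for your own Elasticsearch address:

filter {
  # Build a per-event id from the row id, the user, and the event timestamp
  mutate { add_field => { "doc_id" => "%{id}-%{user}-%{@timestamp}" } }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test2"
    document_type => "doc"
    # Composite id, so each scheduled poll indexes a new document
    # instead of overwriting the one with the same row id
    document_id => "%{doc_id}"
  }
}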

Thank you very much for help! It does what I wanted )))

Out of curiosity, do you have any idea how to set up the same migration mechanism for MariaDB? I tried it using the same config file I showed in this post, but there is some problem with the JDBC driver class and maybe the library...
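(For reference, the MariaDB variant of the input would presumably look something like the sketch below, using the MariaDB Connector/J driver instead of the MySQL connector; the jar path and version are illustrative, not confirmed against this setup:)

input {
  jdbc {
    # MariaDB Connector/J jar - path and version are illustrative
    jdbc_driver_library => "/usr/share/java/mariadb-java-client-2.2.3.jar"
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    jdbc_connection_string => "jdbc:mariadb://localhost:3306/testdb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * * *"
  }
}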

I've never worked with MariaDB, sorry.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.