Setting a filter that transforms NULL values into a default value when indexing from MySQL into Elasticsearch

The fields that I want to index with a default value can be NULL in MySQL. This is my configuration for the Logstash JDBC plugin:

input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/elements"
        jdbc_user => "user"
        jdbc_password => "admin"
        jdbc_validate_connection => true
        jdbc_driver_library => "C:/work/Wildfly/wildfly-9.0.2.Final/modules/com/mysql/main/mysql-connector-java-5.1.36.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement_filepath => "query.sql"
        use_column_value => true
        tracking_column => "id"
        #schedule => "*/3 * * * *"
        clean_run => true

    }
}

output {
    elasticsearch {
        index => "emptytest"
        document_type => "history"
        document_id => "%{id}"
        hosts => "localhost"
    }
}
I tried this filter, but the condition does not detect the NULL values:

filter {
    if [sourcecell_id] == "NULL" {
        mutate {
            # intended to set a default value here, but the condition never matches
        }
    }
}

I'm not sure Logstash handles null values very gracefully. Can't you transform the null values in the SQL query instead?
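
If you go that route, something like this in query.sql would do it (a sketch: sourcecell_id is taken from your conditional above, the table name history and the default 0 are assumptions):

SELECT id,
       -- COALESCE returns the first non-NULL argument,
       -- so NULL sourcecell_id values come out as 0
       COALESCE(sourcecell_id, 0) AS sourcecell_id
FROM history;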

I can, but that would be my last resort. The furthest I got was deleting the null columns via a Ruby filter, whereas I want to substitute the nulls with a default value, 0 for example.
This is the script that I found:

filter {
    ruby {
        code => "
            hash = event.to_hash
            hash.each do |k, v|
                if v == nil
                    event.remove(k)
                end
            end
        "
    }
}

Yes, a ruby filter would do. Replace event.remove(k) with event[k] = 0.
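
In full, that would look like this (a sketch of the modified filter; note that on Logstash 5.x and later the event API changed, so there you would write event.set(k, 0) instead of event[k] = 0):

filter {
    ruby {
        code => "
            # replace every nil field with the default value 0
            event.to_hash.each do |k, v|
                if v == nil
                    event[k] = 0
                end
            end
        "
    }
}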
