Logstash JDBC analyzers

Hi everyone, I'm still new to Elastic. I'm trying to use the snowball or porter_stem analyzer, but I have no idea how to do it with the Logstash JDBC input and PostgreSQL. From what I've searched so far, to use this feature I need to create a custom analyzer, map the fields, and then set the custom analyzer on the field that needs it.
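For context, here is roughly what I understand that custom analyzer + mapping step to look like, done directly against Elasticsearch (the index name, analyzer name, and field are just placeholders from my own setup, and the exact mapping syntax may differ by Elasticsearch version):

```shell
# Sketch: create the index with a custom snowball-based analyzer
# BEFORE Logstash writes to it, so Logstash doesn't auto-create the mapping.
# "my_stemmer" and the "title" field are illustrative names.
curl -X PUT "http://localhost:9200/testindex" \
  -H 'Content-Type: application/json' \
  -d '{
    "settings": {
      "analysis": {
        "analyzer": {
          "my_stemmer": {
            "type": "custom",
            "tokenizer": "standard",
            "filter": ["lowercase", "snowball"]
          }
        }
      }
    },
    "mappings": {
      "properties": {
        "title": { "type": "text", "analyzer": "my_stemmer" }
      }
    }
  }'
```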

So my question is: how can I do custom mapping when indexing through the Logstash JDBC input? Right now, when I run the pipeline, it automatically creates its own mapping for the index.

Here's my Logstash config example:

input {
    jdbc {
        jdbc_connection_string => "jdbc:postgresql://localhost/testindex?user=[MYUSER]&password=[MYPASS]"
        jdbc_user => "postgres"
        jdbc_driver_library => ""
        jdbc_driver_class => "org.postgresql.Driver"
        schedule => "* * * * *"
        use_column_value => true
        tracking_column => "pid"
        statement => "SELECT pid, title FROM products"
        clean_run => true
    }
}

filter {
    jdbc_streaming {
        jdbc_connection_string => "jdbc:postgresql://localhost/testindex?user=[MYUSER]&password=[MYPASS]"
        jdbc_user => "postgres"
        jdbc_driver_class => "org.postgresql.Driver"
        statement => "SELECT text FROM classes WHERE pid = :pid"
        parameters => {"pid" => "[pid]"}
        target => "[classes]"
    }
}

output {
    elasticsearch {
        index => "testindex"
        hosts => "http://localhost:9200"
        document_id => "%{pid}"
    }
}
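From what I've read in the Logstash docs, the elasticsearch output can load a custom index template instead of using its default one, which might be the way to get my analyzer applied. Would something like this be the right direction? (The template path is just a placeholder; the template JSON would hold the analyzer settings and mapping.)

```
output {
    elasticsearch {
        index => "testindex"
        hosts => "http://localhost:9200"
        document_id => "%{pid}"
        # Placeholder path: a template file defining the custom analyzer
        # and the field mappings that should use it.
        template => "/path/to/testindex-template.json"
        template_name => "testindex"
        template_overwrite => true
    }
}
```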

Please tell me if I'm doing it wrong. Any help will be appreciated.
