Automatically pushing PostgreSQL DB Server data to ES for indexing

Hi,

I am running a PostgreSQL DB server (version 9.6.11) and pushing its data to ES (version 6.5.0) for indexing. At the moment I run /usr/share/logstash/bin/logstash -f products-index-logstash.conf manually. Is there a way to automate this, so that whenever the PostgreSQL database is updated the index in ES is updated as well, or do I need to invoke /usr/share/logstash/bin/logstash -f products-index-logstash.conf via a cron scheduler on the Linux server?
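For reference, if cron turns out to be the recommended route, I assume the crontab entry would look roughly like the sketch below (the every-5-minutes schedule is just a placeholder I picked, and the conf file would need its full path):

# hypothetical crontab entry: re-run the pipeline every 5 minutes
*/5 * * * * /usr/share/logstash/bin/logstash -f products-index-logstash.conf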

file: products-index-logstash.conf

input {
  jdbc {
    # Postgres JDBC connection string to our database, elktest
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/elktest?user=postgres"
    # The user we wish to execute our statement as
    jdbc_user => "postgres"
    jdbc_validate_connection => true
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "/etc/logstash/postgresql-42.2.5.jar"
    # The name of the driver class for PostgreSQL
    jdbc_driver_class => "org.postgresql.Driver"
    # Our query
    statement => "SELECT * from products"
  }
}
output {
  elasticsearch {
    index => "elktest"
    document_id => "%{product_no}"
  }
}
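
Alternatively, I have read that the jdbc input plugin supports a schedule option (cron-like syntax) together with the :sql_last_value parameter for incremental loads. Would something along the lines of the sketch below be the right way to keep the index in sync? Note that the updated_at column is my own assumption; the products table would need some kind of last-modified timestamp (or an ever-increasing numeric column) for this to work.

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/elktest?user=postgres"
    jdbc_user => "postgres"
    jdbc_validate_connection => true
    jdbc_driver_library => "/etc/logstash/postgresql-42.2.5.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # Run the statement every minute (cron-like syntax)
    schedule => "* * * * *"
    # Only fetch rows changed since the last run; assumes a hypothetical updated_at column
    statement => "SELECT * FROM products WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    index => "elktest"
    document_id => "%{product_no}"
  }
}

My understanding is that this keeps Logstash running as a long-lived process instead of being re-invoked for every run, but new rows would only be picked up on the schedule, not instantly on every database update.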

Please suggest. Thanks in advance.

Best Regards,

Kaushal
