Logstash not updating documents JDBC Postgres

Hi All,

Every hour the data in my Postgres table is refreshed with a full drop of the table followed by a bulk insert. I run the Logstash JDBC input on a schedule so that five minutes past each hour the data is indexed into Elasticsearch.

My config is as follows:

input {
    jdbc {
        # Postgres jdbc connection string to our database, mydb
        jdbc_connection_string => "jdbc:postgresql://localhost:5432/xxxxx"
        # The user we wish to execute our statement as
        jdbc_user => "xxx"
        jdbc_password => "xxxx"
       # The path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/java/postgresql-jdbc4.jar"
        # The name of the driver class for Postgresql
        jdbc_driver_class => "org.postgresql.Driver"
        # Schedule for input
        schedule => "5 * * * *"
        # our query
        statement => "select planet.id, planet.x || ':' || planet.y || ':' || planet.z coords, planet.x, planet.y, planet.z ,planetname,rulername,race,planet.size,planet.score,planet.value$
    }
}

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
        index => "universe"
        document_type => "planet"
        document_id => "%{id}"
        template => "/etc/logstash/universe_template.json"
        template_name => "universe"
        template_overwrite => true
        manage_template => true
    }
}

I have noticed, however, that even though my Logstash pipeline is running, I am not seeing the documents updated in Elasticsearch. In fact, the timestamp still shows yesterday, the first time I ran the pipeline. I assumed that by setting document_id to the ID field of my table, each run would re-create the document and the version number in Elasticsearch would increment.

Here is an example of the data in postgres:

Here is an example of the current output I see in elasticsearch:

I assume I'm doing something wrong with document_id, but I can't see what.

Things look okay. What if you use a simple stdout { codec => rubydebug } output to take Elasticsearch out of the equation? Are you getting a full database dump in your log each time the jdbc input is scheduled to run?
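A minimal test pipeline for that might look like this (illustrative sketch, reusing the connection settings from the original config; the statement is elided here for brevity):

input {
    jdbc {
        jdbc_connection_string => "jdbc:postgresql://localhost:5432/xxxxx"
        jdbc_user => "xxx"
        jdbc_password => "xxxx"
        jdbc_driver_library => "/usr/share/java/postgresql-jdbc4.jar"
        jdbc_driver_class => "org.postgresql.Driver"
        schedule => "5 * * * *"
        statement => "..."
    }
}

output {
    # Print every event to the Logstash log so you can confirm
    # the full dump arrives on each scheduled run.
    stdout { codec => rubydebug }
}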

In fact, it does seem to be working. It is doing a complete replace; I just didn't see the @version number change at all.

The @version field isn't the document's version, it's the Logstash schema version (which isn't very useful).
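The per-document version Elasticsearch tracks is the _version metadata field, which you can see by fetching a document directly (the planet id and version value below are hypothetical, for illustration):

GET universe/planet/1234

{
  "_index": "universe",
  "_type": "planet",
  "_id": "1234",
  "_version": 2,    # increments each time the same document_id is re-indexed
  "found": true,
  "_source": { ... }
}

Because document_id is set to %{id}, each scheduled run overwrites the existing documents in place, and _version goes up by one per run.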


That makes sense now thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.