Date from DB is changed when inserted in ES

Hi all,

Having an issue with a date field when trying to insert it into ES. I've searched and tried multiple settings, but to no avail.
Short story: my date field from Postgres ends up in ES with a timezone offset attached, even though my DB is set to UTC.

Long story:

Postgres DB:

timezone = UTC
field_date = date (YYYY-MM-DD)

Logstash (the server running it is in timezone = JST)

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://server_name:port/db_name"
    jdbc_user => "user"
    jdbc_password => "pass"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "select * from function_name()"
    clean_run => true
    last_run_metadata_path => "/tmp/something"
  }
}

filter {
  date {
    match => [ "field_date", "yyyy-MM-dd" ]
    timezone => "UTC"
    target => "field_date"
  }
}

output {
  elasticsearch {
    hosts => ["http://server:9200"]
    index => "index_smt"
    document_id => "index_smt_%{field1}_%{field2}_%{field_date}"
    doc_as_upsert => true
  }
}

When executing my function in the DB I get something like this:

field1 | field2 | field_date
test   | test   | 2021-09-28

This is correct and it's what I expect.
Even when running psql -c "select * from function_name()" from my Logstash server, the result set is the expected one.

When I check the _id field in ES, it somehow looks like this:


So the date is clearly shifted by the JST offset, but I don't know where that conversion is being applied.
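That nine-hour shift is exactly what you would see if the DATE value (which has no time component) were being read as midnight in the Logstash server's local JST zone and then stored as UTC. A minimal Python sketch of that arithmetic, just to illustrate the suspected conversion (zone names are illustrative, not taken from your setup):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A DATE column carries no time of day; if it is interpreted as
# midnight in the server's local zone (JST, UTC+9) and then
# normalized to UTC, it lands on the *previous* day at 15:00.
local_midnight = datetime(2021, 9, 28, tzinfo=ZoneInfo("Asia/Tokyo"))
as_utc = local_midnight.astimezone(ZoneInfo("UTC"))
print(as_utc.isoformat())  # 2021-09-27T15:00:00+00:00
```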

I would expect the _id field to be:

index_smt_test_test_2021-09-28

Tried using combinations of:

jdbc_default_timezone => "UTC" / "JST"
plugin_timezone => "local" / "utc"

Nothing seems to work; I still get the field_date column offset, with an hour component appended to it.

What am I missing here?
Appreciate any help.

The jdbc input converts a date or time column from the database into a LogStash::Timestamp. The jdbc_default_timezone option can be used to tell it to use a timezone that differs from the default timezone of the system where Logstash runs.
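With that in mind, two approaches worth trying (a sketch only, reusing the names from your post): tell the jdbc input to interpret database dates as UTC via jdbc_default_timezone, or cast the date to text in SQL so the jdbc input receives a plain string and never converts it at all:

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://server_name:port/db_name"
    jdbc_user => "user"
    jdbc_password => "pass"
    jdbc_driver_class => "org.postgresql.Driver"
    # Option 1: interpret DB dates/times as UTC instead of the server's local JST
    jdbc_default_timezone => "UTC"
    # Option 2: cast to text so field_date arrives as a literal "YYYY-MM-DD"
    # string and is never turned into a timestamp (assumes these column names)
    statement => "select field1, field2, field_date::text as field_date from function_name()"
  }
}
```

With the cast approach you could also drop the date filter entirely, since field_date stays a literal string and the _id keeps the yyyy-MM-dd form.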