Logstash adding duplicate rows for every run

Unless you're deleting the primary key fields, you can reference them directly when you set the document_id option in the elasticsearch output. If you prefer, you can instead copy the key into a temporary [@metadata] field and reference that field; [@metadata] fields are available throughout the pipeline but are never sent to Elasticsearch.

In the file with the jdbc input plugin, add a mutate filter:

filter {
  mutate {
    add_field => {
      "[@metadata][document_id]" => "%{primary_key_1}%{primary_key_2}"
    }
  }
}

In the file with the elasticsearch output, reference that field:

output {
  elasticsearch {
    ...
    document_id => "%{[@metadata][document_id]}"
  }
}

Again, if you don't delete the primary key fields, the configuration above is equivalent to simply:

output {
  elasticsearch {
    ...
    document_id => "%{primary_key_1}%{primary_key_2}"
  }
}
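
If you do want to drop the primary key columns from the indexed documents, the [@metadata] approach becomes necessary. A sketch (field names assumed from the examples above); using a second mutate block ensures the keys are copied into [@metadata] before they are removed:

filter {
  mutate {
    add_field => {
      "[@metadata][document_id]" => "%{primary_key_1}%{primary_key_2}"
    }
  }
  mutate {
    remove_field => ["primary_key_1", "primary_key_2"]
  }
}

Since [@metadata] is never indexed, the document_id option in the output still resolves correctly while the key fields stay out of the stored document.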