How to create unique id in logstash/elastic search for apache logs in a distributed server environment

Without reading too much into your post, here is what I did for a unique ID.

I played around with the fingerprint filter and did manage to create a unique ID, but I didn't like that approach, so I created my own unique ID instead.
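For reference, the fingerprint approach I tried first looked roughly like this (a sketch; the choice of SHA256 and concatenating the three source fields are assumptions, not the exact config I used):

```
filter {
  fingerprint {
    # hash the three fields together into one stable ID
    source => ["projectname", "systemtypeid", "username"]
    concatenate_sources => true
    method => "SHA256"
    target => "doc_id"
  }
}
```

This works, but the resulting ID is an opaque hash, which is part of why I preferred building a readable ID from the fields themselves.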

The data was coming from a JDBC connection with a number of rows, and when I combine these three fields the result is unique.

For example, the combination of projectname, systemtypeid and username will be unique forever.

filter {
  mutate {
    add_field => {
      "doc_id" => "%{projectname}%{systemtypeid}%{username}"
    }
  }
}

This gives me an exact duplicate of what I receive from the database via JDBC.

Later I was reading the same database table with JDBC again, but this time I wanted to keep the data as a daily history: read the table once a day and save it, then read it again the next day and index that into ES as well. Each document represents the size of a project by username.

To do that, I created another document ID that is unique per day:

doc_id => "%{projectname}%{systemtypeid}%{username}%{+dd-MM-YYYY}"

and used it in the output section:
document_id => "%{doc_id}"
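Putting it together, the relevant parts of the pipeline look roughly like this (a sketch; the Elasticsearch host and index name are placeholders, not from my actual setup):

```
filter {
  mutate {
    add_field => {
      # per-day ID: same fields plus the event date
      "doc_id" => "%{projectname}%{systemtypeid}%{username}%{+dd-MM-YYYY}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "project-size"
    document_id => "%{doc_id}"
  }
}
```

Because the ID repeats within a day, re-running the pipeline on the same day overwrites the same documents instead of creating duplicates; the next day the date suffix changes, so a fresh snapshot is written alongside the old one.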