Logstash adding duplicate rows for every run

Sure. We are not deleting the primary key fields here.
In this case, the output is a bunch of SQL rows and columns. Let's say I have a column named "startinterval" and I want to use the value from that column as the primary key for the document_id. How can I reference the "startinterval" value from each row and assign it as the document_id?

I tried giving the column name (startinterval) as the document_id, but it was treated as a static string, so the output is only one row with its "_id" set to that literal value. I figured that's not the right way to reference the data here.

output {
  elasticsearch {
    ...
    document_id => "%{startinterval}"
  }
}
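For context, here is roughly what my pipeline looks like; the connection string, credentials, SQL statement, hosts, and index name below are placeholders, not my real values. If the %{startinterval} reference is resolved per event, then each row should get its own _id and re-running the query would update the same documents instead of adding duplicates.

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"  # placeholder
    jdbc_user => "user"                                           # placeholder
    jdbc_password => "password"                                   # placeholder
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"    # placeholder
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT startinterval, metric FROM stats"        # placeholder query
    schedule => "* * * * *"                                       # re-runs every minute
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]          # placeholder
    index => "stats"                     # placeholder
    # intended to pull each row's startinterval value as the document _id
    document_id => "%{startinterval}"
  }
}

Is the %{startinterval} sprintf reference the right way to pull the per-row value here?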

Thanks
Vikram Y