Duplicate records in Elasticsearch during load of records with Logstash

Hi Team,

I am loading data from a ServiceNow table into Elasticsearch using a Logstash config with the HTTP Poller input plugin. For example, if the table has 40 records, the load does not stop after those 40 records are indexed; the same set of 40 records gets loaded again on every poll, generating duplicate documents in the Elasticsearch index.

I tried setting document_id => "%{number}" in the elasticsearch output plugin in the config file (number is the unique change number in my table), but then only one document was loaded into the index when I checked in Kibana. Please help with this.

Config File:

input {
  http_poller {
    urls => {
      url => "https://dev63285.service-now.com/change_request_list.do?JSONv2&display_value=True&sysparm_exclude_reference_link=True&sysparm_fields=number%2Cstate%2Copened_at%2Cclosed_at%2Cpriority%2Cassigned_to%2Cassignment_group%2Cactive%2Cimpact%2Curgency%2Cseverity&sysparm_limit=1sysparm_view=json_view"
    }
    connect_timeout => 1000000
    request_timeout => 900000
    socket_timeout => 180
    user => "XXX"
    password => "XXX"
    schedule => { cron => "* * * * *" }
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}

filter {
  split {
    field => "records"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "servicenowinc"
  }
  stdout {
    codec => rubydebug
  }
}
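
One thing I am unsure about: since the split filter emits one event per element of the records array, the change number presumably ends up nested at [records][number] rather than at the top level, so %{number} would never resolve and every event would be indexed under the same literal _id, "%{number}" — which would explain why only one document appears. A minimal sketch of the output section using the nested field reference instead (assuming the split events keep each record under the records field):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "servicenowinc"
    # Nested field reference: each change gets a stable, unique _id,
    # so re-polling the same 40 records overwrites instead of duplicating.
    document_id => "%{[records][number]}"
  }
}

Is this the right way to reference the field after the split?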
