Problem with POST XML request with http_poller plugin

Hi,

I am using Logstash to send a POST request and index the response into an Elasticsearch node, but I am getting the error below.
Please help me with this.

logstash_post.conf

input {
  http_poller {
    urls => {
      method => post
      schedule => { cron => "* * * * * UTC" }
      url => "here I keep my API URL with token"
      headers => {
        Accept => "application/xml"
      }
      body => "<?xml version="1.0" encoding="UTF-8"?>.........
      ......................................."
    }
    request_timeout => 60
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}

filter {}

output {
  elasticsearch {
    action => "index"
    hosts => ["elasticsearch:9200"]
    user => "username"
    password => "password"
    index => "logstash-sdp-req-post-%{+yyyy.MM.dd}"
  }
}

Error in the terminal:

logstash_1 | [2020-05-26T12:45:15,141][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "{", "}" at line 10, column 31 (byte 321) after input {\n http_poller {\n urls => {\n method => post\n schedule => { cron => "* * * * * UTC"}\n url => "my rest api url with token "\n headers => {\n Accept => "application/xml"\n }\n body => "<?xml version="", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2580:in map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:161:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:27:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in block in converge_state'"]}

You have double quotes inside your double-quoted string. Either escape them or use single quotes around the string:

body => '<?xml version="1.0" encoding="UTF-8"?>.........
.......................................'
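If escaping is preferred over single quotes (for example, because the XML body itself contains single quotes), the inner double quotes can be backslash-escaped instead. A sketch of the same body line:

```
body => "<?xml version=\"1.0\" encoding=\"UTF-8\"?>.........
......................................."
```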

Thanks, solved!

Hi Badger,

I am not getting any response data in the Elasticsearch index, but when I hit the URL in Postman I do get response data.

Is there any way to get the response data and index it into Elasticsearch through the http_poller plugin?

There is certainly a way. The http_poller exists in order to let people do that.

Hi Badger,

I wrote the conf file below.

abc.conf:

input {
  http_poller {
    urls => {
      q1 => {
        method => post
        url => "url with token"
        headers => {
          Accept => "application/xml"
        }
        body => '<?xml version="1.0" encoding="UTF-8"?><API version="1.0" locale="en
        ......................'
      }
    }
    type => "allassets"
    request_timeout => 60
    codec => "json"
    schedule => { cron => "* * * * * UTC" }
    metadata_target => "http_poller_metadata"
  }
}

filter {}

output {
  if [type] == "allassets" {
    elasticsearch {
      action => "index"
      hosts => ["elasticsearch:9200"]
      user => "user"
      password => "password"
      index => "sdp-allassets-%{+yyyy.MM.dd}"
    }
  }
}
I didn't get any response data in the Elasticsearch index, but when I use the same body parameter in Postman I get data from the same API with the same token.

Please help me if any correction is needed.
Thanks

I would suggest looking at the logs of whatever service you are calling to see if that indicates why it is not providing a response.
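One way to see what the poller actually receives (if anything) is to temporarily add a stdout output with the rubydebug codec, so every event is printed to the Logstash log. A debugging sketch, added alongside the existing elasticsearch output:

```
output {
  # Temporary: print every event, including the http_poller_metadata
  # field, which records the HTTP response code for each poll
  stdout { codec => rubydebug }
}
```

The metadata_target field in the printed events helps distinguish "the API returned nothing" from "the API returned an error status".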

While debugging I got this. Can you please tell me what I need to change in my conf file?

logstash_1 | [2020-05-29T19:22:00,908][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"sdp-allassets-2020.05.29", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x13efdddd], :response=>{"index"=>{"_index"=>"sdp-allassets-2020.05.29", "_type"=>"_doc", "_id"=>"ldPiYXIBS5TN_1CHT4Yi", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [API.response.operation.Details.field-values.record.value] of different type, current_type [long], merged_type [text]"}}}}

In the index you are writing to, the field API.response.operation.Details.field-values.record.value has type long, possibly because in the first document that was indexed it was indeed a number. However, in this document the field is text. The answer is to create an index template that sets the type of that field before any documents are added.

If you have to add every field to the template that could be a considerable amount of work, but for now, just add the ones that cause a conflict, delete the index and index the documents a second time.
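A minimal sketch of such a template, using a dynamic template with path_match so the deeply nested field does not have to be spelled out property by property (the template name and index pattern here are assumptions; this uses the legacy _template API that was current for Elasticsearch 7.x):

```
PUT _template/sdp-allassets
{
  "index_patterns": ["sdp-allassets-*"],
  "mappings": {
    "dynamic_templates": [
      {
        "record_value_as_text": {
          "path_match": "API.response.operation.Details.field-values.record.value",
          "mapping": { "type": "text" }
        }
      }
    ]
  }
}
```

After creating the template, delete the conflicting sdp-allassets-* index and re-run the pipeline so the documents are indexed under the new mapping.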

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.