How to send data to an HTTP endpoint using the Logstash http output plugin

It is true that requests that get a 200 response will not produce any log entry.
Can you try using an HTTP sniffer like Wireshark or Fiddler to check the requests/responses sent and received by Logstash? It seems to be the only way.


I tried this again after rewriting the conf file from scratch,
which I think is more solid based on your inputs.

Now I am getting this error:

{:timestamp=>"2016-08-30T23:00:43.473000+0530", :message=>"[HTTP Output Failure] Encountered non-200 HTTP code 200", :response_code=>500, :url=>"", :event=>#<LogStash::Event:0x11992019 @metadata_accessors=#<LogStash::Util::Accessors:0x663dcd5b @store={}, @lut={}>, @cancelled=false, @data={"message"=>"New low cost houses for middle-class family.\nHouse according to average income salary\n", "@version"=>"1", "@timestamp"=>"2016-08-28T16:30:14.000Z", "received"=>"by with HTTP; Sun, 28 Aug 2016 09:30:14 -0700", "date"=>"Sun, 28 Aug 2016 22:00:14 +0530", "from"=>"User Lake <>", "to"=>"User Lake <>", "message-id"=>"<>", "subject"=>"New Houses for Middle Classes", "mime-version"=>"1.0", "content-type"=>"multipart/alternative; boundary=001a114f37ea2401b5053b244355", "delivered-to"=>""}, @metadata={}, @accessors=#<LogStash::Util::Accessors:0x3a1506a9 @store={"message"=>"New low cost houses for middle-class family.\nHouse according to average income salary\n", "@version"=>"1", "@timestamp"=>"2016-08-28T16:30:14.000Z", "received"=>"by with HTTP; Sun, 28 Aug 2016 09:30:14 -0700", "date"=>"Sun, 28 Aug 2016 22:00:14 +0530", "from"=>"User Lake <>", "to"=>"User Lake <>",

Sadly the plugin logging is incomplete and will not give you the SOAP Fault that is in the response body...
Do you have access to the SOAP server logs? Otherwise you need Wireshark to see the content of this response.

On the other hand, do you have several config files in your running Logstash? This event was clearly created by the imap input, yet your config above only talks about reading a CSV file.
So I suspect you have 2 different config files. You must understand that all config files are concatenated into a single one to create a single pipeline, so all inputs feed the single filtering pipeline and then go to all outputs, unless you specify conditions with if statements. Is that the case?
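
To illustrate the concatenation point, here is a minimal sketch (the paths, hosts, and `type` values are placeholders I made up, not your actual config) of keeping two flows separate with `type` tags and `if` conditionals:

```
input {
  file { path => "/path/to/data.csv" type => "csv_flow" }
  imap { host => "imap.example.org" type => "mail_flow" }
}
output {
  if [type] == "csv_flow" {
    http { ... }
  } else if [type] == "mail_flow" {
    elasticsearch { ... }
  }
}
```

Without those conditionals, every event from both inputs reaches both outputs.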

Hey @wiibaa

I want to share some good news with you: I have cracked SOAP with the http output plugin.
Thanks to you for pointing me to an HTTP sniffer; that is what helped me crack this.

I used the same setup you described above.
The reason it was not working is the message option: the XML must be on a single line with no stray whitespace, otherwise it will not work.
I used Notepad++ (trim trailing spaces and the Join Lines feature) to get it perfect,
and used the HTTP sniffer to monitor where it was going wrong.
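
For anyone landing here later, the shape of the working http output config might look like the sketch below. The URL, operation name, and field references are placeholders I made up; the key point is that the whole SOAP envelope sits in `message` on a single line:

```
output {
  http {
    url => "http://example.org/soap/endpoint"
    http_method => "post"
    content_type => "text/xml; charset=utf-8"
    format => "message"
    message => '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"><soap:Body><CreateUser><Name>%{Name}</Name><Age>%{Age}</Age></CreateUser></soap:Body></soap:Envelope>'
  }
}
```

The `%{Name}` / `%{Age}` sprintf references would pull the values from each event's CSV columns.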

I think you can proudly claim that the http output plugin works for SOAP.

Thanks to you.

Big hug


Happy for you :slight_smile:

I will soon open a GitHub issue with a documentation enhancement based on what we discussed. Can you please close the issues you opened there?

I have closed the issue on GitHub.

Hi @wiibaa.

If required I can start a new thread, but this is related.
Last week I tried CSV --> http output.

This time I am trying to get information from an HTTP endpoint --> post to Elasticsearch.

For this I tried the http input plugin, but with it I am not able to define the URL and message.

The second option I tried is the http_poller input plugin; with it I am getting a bad response, and the ES index fields are all related to website statistics.

Please let me know if you have any idea how to achieve the reverse, where the output will be an Elastic index
and the input will be the SOAP endpoint URL.

Hi again @rkhapre,

can you please tell me more about your idea. Do you want to:

1 => Call Logstash as if it were a SOAP endpoint, by sending it an XML message at the endpoint defined by the http input plugin's host/port config, or

2 => Have Logstash call a SOAP endpoint periodically to retrieve data with http_poller?

Can you then provide an example of the message (XML, I suppose) that needs to be processed in Logstash?

hi @wiibaa

Here is the scenario I am trying:

  1. In my previous post, my input was a CSV file with 3 fields: Name, Description, Age.

  2. Using a SOAP endpoint with the Logstash http output plugin, I successfully posted the 3 fields.

  3. Now I want to retrieve the same 3 fields from the application and post them to Elasticsearch for analysis.

For this I am also planning to use a SOAP request. I have an endpoint URL and a find service.
In Logstash my input will be http OR http_poller, and my output will be an Elasticsearch index.

It is working well in SOAP UI.

Below are the headers I am getting in SOAP UI; I am using the find service to retrieve the list of users:

Accept-Encoding: gzip,deflate
Content-Type: text/xml;charset=UTF-8
SOAPAction: ""
Content-Length: 2110
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)
Authorization: Basic a2V2aW4uc2Nob3R0OmpKZDQ0NTc3

I also have an XML file like the one we had above for creating a user, which we used in message =>.

I want to see this list-of-users service in Kibana.

Hi @wiibaa

Just looking for some tips from you on whether this is possible with http input / http_poller, or whether I will have to look at other plugins.

Sorry, but I have not had time to test it out as much as I would like.

As you want to query your SOAP server from Logstash, http_poller is definitely the plugin to use. But I'm not familiar with it and there is no example of a POST request, so you are on your own for the moment.

I will keep you posted when I have time to make a solid test, hopefully soon

Thanks for your effort and the update. I will explore over the next 2 days and will post here if successful.

@rkhapre just in case you did not find a solution already,

I admit the available documentation does not make it obvious how to execute a POST request with a body.
The http_poller docs say it supports all options from Manticore::Client, but I could not find a complete POST request example there either.

Here is my configuration against the first online SOAP endpoint I found:

input {
  http_poller {
    urls => {
      soap_request => {
        method => post
        url => ""
        headers => {
          "Content-Type" => "text/xml; charset=utf-8"
        }
        body => '<?xml version="1.0" encoding="utf-8"?>
          <soap:Envelope xmlns:xsi="" xmlns:xsd="" xmlns:soap="">
            <soap:Body>
              <GetCitiesByCountry xmlns="http://www.webserviceX.NET">
                <!-- request parameters elided -->
              </GetCitiesByCountry>
            </soap:Body>
          </soap:Envelope>'
        auth => {
          user => "not_used_here"
          password => "not_used_here"
        }
      }
    }
    request_timeout => 60
    interval => 60
    codec => "plain"
  }
}

Hi @wiibaa

Sorry for late reply

This is working perfectly as you stated above. I was missing the "body =>".

But I see two issues:

  1. If you leave the Logstash command running, it keeps loading the records into Elasticsearch.
    These records are duplicates, but they get loaded into ES because of the @timestamp field.
    Is there any way to do only an incremental load?

  2. The data loaded into Elasticsearch is the raw XML response; I thought it would split the XML tags and load each XML tag as a field in ES.

I think I will have to use the XML filter plugin to add a field for each XML tag.

Hi @wiibaa
If the XML response holds 10 records, how will I split it into 10 different records?
Currently all 10 records arrive as 1 record in the "message" field in ES, since this is an XML response.

I tried using the "json" codec but it failed.

Hi @wiibaa

Is there any way I can break the SOAP response, which is XML output, into a JSON format where each XML tag becomes a column in Elasticsearch/Kibana?

Hi again, sorry for being away

we should indeed be able to get the XML into the event with the xml filter; then if you need to split records, the split filter should help.
Can you post a sample response message and I will have a look?

The duplicate handling between 2 executions remains a complex issue. One rough idea would be to either use the elasticsearch filter to check whether your records are already indexed, or use the upsert mode of the elasticsearch output with a specific document_id.
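
For the upsert idea, here is a sketch of the elasticsearch output (the index name is a placeholder, and I am assuming a field such as `UserId` has already been extracted from the XML into the event):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "users"
    # same UserId on the next poll => the document is overwritten, not duplicated
    document_id => "%{UserId}"
    action => "update"
    doc_as_upsert => true
  }
}
```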

Hi @wiibaa

Sorry for the late reply, I was away. But now I have to solve this.

Here is the sample response that I get. It contains 2 records, which I will have to split into 2 records that go into ES:

<env:Envelope xmlns:env=""
xmlns:ns1="" xmlns:tns=""
xmlns:ns0="" xmlns:xsi="">
<env:Body>
<ns1:Value><ns1:CreatedBy>HARRY.BAKER</ns1:CreatedBy><ns1:CreationDate>2016-10-03T07:10:02.154Z</ns1:CreationDate><ns1:UserId>300000130595047</ns1:UserId><ns1:Name>Dan Billo</ns1:Name></ns1:Value>

<ns1:Value><ns1:CreatedBy>STEVE.BECK</ns1:CreatedBy><ns1:CreationDate>2014-07-19T17:41:31.422Z</ns1:CreationDate><ns1:UserId>300000076703621</ns1:UserId><ns1:Name>Rama Re Rama</ns1:Name></ns1:Value>
</env:Body>
</env:Envelope>


Seems similar to How to split xml arrays?

Here is my quick try at it. I have put your XML on one line for testing with stdin, but it should work the same as-is. Tested with LS 2.4, but it should work as long as you have the latest xml filter in your install.

input {
  stdin { }
}
filter {
    # brutal extract of the ns1:Value elements from the soap message
    xml {
           source => "message"
           target => "xmldata"
           store_xml => "false"
           remove_namespaces => true
           xpath => ["//Value","value"]
           remove_field => "message"
    }
    # Split the array into several events
    split {
      field => "value"
    }
    # Parse the remaining XML string to populate the event with fields
    xml {
      source => "value"
      target => "json-value"
      force_array => false
      remove_field => "value"
    }
}
output {
  stdout { codec => rubydebug }
}

the output looks like

{
      "@version" => "1",
    "@timestamp" => "2016-10-14T03:54:17.324Z",
          "host" => "debian",
    "json-value" => {
           "CreatedBy" => "HARRY.BAKER",
        "CreationDate" => "2016-10-03T07:10:02.154Z",
              "UserId" => "300000130595047",
                "Name" => "Dan Billo"
    }
}
{
      "@version" => "1",
    "@timestamp" => "2016-10-14T03:54:17.324Z",
          "host" => "debian",
    "json-value" => {
           "CreatedBy" => "STEVE.BECK",
        "CreationDate" => "2014-07-19T17:41:31.422Z",
              "UserId" => "300000076703621",
                "Name" => "Rama Re Rama"
    }
}