How to send data to an HTTP endpoint using the LS output plugin

Sorry, but I did not have time to test this out as thoroughly as I would like.

Since you want to query your SOAP server from Logstash, http_poller is definitely the plugin to use. However, I'm not familiar with it and there is no example of a POST request, so you are on your own for the moment.

I will keep you posted when I have time to run a solid test, hopefully soon.

Thanks for your effort and the update. I will explore this in the next 2 days and, if successful, will post the result here.

@rkhapre, just in case you have not found a solution already:

I admit that the available documentation does not make it obvious how to execute a POST request with a body.
The http_poller docs say it supports all options from Manticore::Client, but I could not find a complete POST request example there either.

Here is my configuration against the first public SOAP endpoint I found online:

input {
  http_poller {
    urls => {
      soap_request => {
        method => post
        url => "http://www.webservicex.net/globalweather.asmx"
        headers => {
          "Content-Type" => "text/xml; charset=utf-8"
        }
        body => '<?xml version="1.0" encoding="utf-8"?>
          <soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
            <soap:Body>
              <GetCitiesByCountry xmlns="http://www.webserviceX.NET">
                <CountryName>France</CountryName>
              </GetCitiesByCountry>
            </soap:Body>
          </soap:Envelope>'
        auth => {
          user => "not_used_here"
          password => "not_used_here"
        }
      }
    }
    request_timeout => 60
    interval => 60
    codec => "plain"
  }
}

Hi @wiibaa

Sorry for the late reply.

This is working perfectly, as you stated above. I was missing the "body =>" option.

But I see two issues:

  1. If you leave the Logstash pipeline running, it keeps loading the records into Elasticsearch.
     These records are duplicates, but they get indexed anyway because of the @timestamp field.
     Is there any way to do only an incremental load?

  2. The data loaded into Elasticsearch is the raw XML response; I thought it would split the XML tags and load each XML tag as a field in ES.

I think I will have to use the XML filter plugin to add a field for each XML tag.
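Something along these lines is what I have in mind; this is only an untested sketch, and the "parsed_xml" target name is just a placeholder:

filter {
  xml {
    # parse the raw SOAP/XML string held in "message" into a structured field
    source => "message"
    target => "parsed_xml"
  }
}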

Hi @wiibaa
If the XML response holds 10 records, how can I split it into 10 different records?
Currently all 10 records come through as one record in the "message" field in ES, because the payload is an XML response.

I tried using the "json" codec, but it failed.

Hi @wiibaa

Is there any way I can break the SOAP response, which is XML output, into a JSON format where each XML tag acts as a column in Elasticsearch/Kibana?

Hi again, sorry for being away

We should indeed be able to get the XML into the event with the xml filter, and then, if you need to split records, the split filter should help.
Can you post a sample response message and I will have a look?

Duplicate handling between two executions remains a more complex issue. One rough idea would be either to use the elasticsearch filter to check whether your records are already indexed, or to use the upsert mode of the elasticsearch output with a specific document_id.
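For the second idea, here is a rough, untested sketch of the elasticsearch output; the index name and the user_id field are only placeholders for whatever unique identifier your records carry:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "soap-records"
    # a stable id derived from the record means re-polling the same data
    # updates the existing document instead of creating a duplicate
    document_id => "%{user_id}"
    action => "update"
    doc_as_upsert => true
  }
}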

Hi @wiibaa

Sorry for the late reply, I was away. But now I have to solve this.

Here is the sample response that I get. It contains 2 records, and I will have to split it into 2 records that should go into ES.

<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsa="http://www.w3.org/2005/08/addressing">
  <env:Header>
    <wsa:Action>http://xmlns.xyz.com/apps/abc/sdfg/user/userService/finduserFinduserByNameResponse</wsa:Action>
    <wsa:MessageID>urn:uuid:94305c8b-8230-4f4c-99c1-b76aff5a0345</wsa:MessageID>
  </env:Header>
  <env:Body>
    <ns0:findUserFindUserByNameResponse xmlns:ns0="http://xmlns.xyz.com/apps/abc/sdfg/user/userService/types/">
      <ns2:result xmlns:ns2="http://xmlns.xyz.com/apps/abc/sdfg/user/userService/types/"
                  xmlns:ns1="http://xmlns.xyz.com/apps/abc/sdfg/user/userService/"
                  xmlns:tns="http://xmlns.xyz.com/adf/svc/errors/"
                  xmlns:ns0="http://xmlns.xyz.com/adf/svc/types/"
                  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                  xsi:type="ns1:UserResult">
        <ns1:Value>
          <ns1:CreatedBy>HARRY.BAKER</ns1:CreatedBy>
          <ns1:CreationDate>2016-10-03T07:10:02.154Z</ns1:CreationDate>
          <ns1:UserId>300000130595047</ns1:UserId>
          <ns1:Name>Dan Billo</ns1:Name>
        </ns1:Value>
        <ns1:Value>
          <ns1:CreatedBy>STEVE.BECK</ns1:CreatedBy>
          <ns1:CreationDate>2014-07-19T17:41:31.422Z</ns1:CreationDate>
          <ns1:UserId>300000076703621</ns1:UserId>
          <ns1:Name>Rama Re Rama</ns1:Name>
        </ns1:Value>
      </ns2:result>
    </ns0:findUserFindUserByNameResponse>
  </env:Body>
</env:Envelope>

Seems similar to How to split xml arrays?

Here is my quick try at it. I have put your XML on one line for testing with stdin, but it should work the same as is. Tested with LS 2.4, but it should work as long as you have the latest xml filter in your install.

input {
  stdin { }
}
filter {
  # brutal extract of the ns1:Value elements from the soap message
  xml {
    source => "message"
    target => "xmldata"
    store_xml => false
    remove_namespaces => true
    xpath => ["//Value", "value"]
    remove_field => "message"
  }
  # Split the array into several events
  split {
    field => "value"
  }
  # Parse the remaining XML string to populate the event with fields
  xml {
    source => "value"
    target => "json-value"
    force_array => false
    remove_field => "value"
  }
}
output {
  stdout { codec => rubydebug }
}

The output looks like:

{
      "@version" => "1",
    "@timestamp" => "2016-10-14T03:54:17.324Z",
          "host" => "debian",
    "json-value" => {
           "CreatedBy" => "HARRY.BAKER",
        "CreationDate" => "2016-10-03T07:10:02.154Z",
              "UserId" => "300000130595047",
                "Name" => "Dan Billo"
    }
}
{
      "@version" => "1",
    "@timestamp" => "2016-10-14T03:54:17.324Z",
          "host" => "debian",
    "json-value" => {
           "CreatedBy" => "STEVE.BECK",
        "CreationDate" => "2014-07-19T17:41:31.422Z",
              "UserId" => "300000076703621",
                "Name" => "Rama Re Rama"
    }
}

HTH