HTTP NLog to Logstash Not Parsing Right


(Hector Mendoza) #1

I have a similar issue to Empty result when using TCP input to collect NLog JSON events, which got no responses before it was automatically closed after 28 days.

I captured the output that NLog sends to ElasticSearch. It sends the HTTP POST data as a bulk insert, i.e. a call to the Elasticsearch `_bulk` REST API. The payload consists of multiple newline-delimited JSON objects, a format known as NDJSON.
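For reference, a `_bulk` payload alternates action lines and document lines, one JSON object per line. The index name and fields below are made up for illustration:

```
{ "index" : { "_index" : "logs" } }
{ "message" : "first event", "level" : "Info" }
{ "index" : { "_index" : "logs" } }
{ "message" : "second event", "level" : "Error" }
```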

In Logstash we receive this with the http input. The problem is that when we read the input and write it to a file, we only get the first JSON object, not the subsequent ones.

It works as expected when the input is a file, but not when it is HTTP.
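This behavior is consistent with the body being parsed as a single JSON document rather than line by line: a plain JSON parse consumes only the first object it finds. A minimal Python sketch of the difference (the payload below is made up):

```python
import json

# An NDJSON body like the one NLog posts to _bulk.
body = (
    '{"index": {"_index": "logs"}}\n'
    '{"message": "first event"}\n'
    '{"index": {"_index": "logs"}}\n'
    '{"message": "second event"}\n'
)

# A single-document parse stops at the first object.
decoder = json.JSONDecoder()
first, end = decoder.raw_decode(body)
print(first)  # only the first object

# Treating the body as NDJSON (one parse per line) recovers every object.
events = [json.loads(line) for line in body.splitlines() if line.strip()]
print(len(events))  # 4
```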

Here are the configurations we are using.

Test 1: NDJSON input from a file, output to a file. Input codec JSON, output codec rubydebug

input {
  http {
    port => 5046
    codec => json
  }
}

filter {
}

output {
  file {
    codec => rubydebug
    path => "/var/log/temptest2.log"
  }
}

Result: The full NDJSON gets written to the output file.

Test 2: NDJSON input from HTTP, output to a file. Input codec JSON, output codec rubydebug

I send the NDJSON file data with a cURL command: curl -s -H "Content-Type: application/json" -XPOST localhost:5046/_bulk --data-binary "@bulk.txt"

That’s the same file I used for the file input in Test 1.

input {
  http {
    port => 5046
    codec => json
  }
}

filter {
}

output {
  file {
    codec => rubydebug
    path => "/var/log/temptest3.log"
  }
}

Result: Only the first JSON object gets written to the file.

So it appears that the issue is with the http input plugin not processing the body correctly. I've tried several different codecs (json, json_lines, multi_line, es_bulk); es_bulk seemed the most promising.
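One thing worth checking: the http input chooses how to decode the body based on the request's Content-Type header, via its additional_codecs setting, which maps application/json to the plain json codec by default. That mapping takes precedence over the codec option for matching requests. A sketch of overriding it so application/json bodies are treated as NDJSON (this is a guess at a fix I have not verified, not a confirmed solution):

```
input {
  http {
    port => 5046
    additional_codecs => { "application/json" => "json_lines" }
  }
}
```

With that in place, the curl command above (which sends Content-Type: application/json) should be decoded line by line instead of as one document.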

Can anyone please help?


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.