[solved] Nxlog -> logstash -> elasticsearch

Update: This really was a misunderstanding on my part. I needed codec => rubydebug on my stdout output to see what I was expecting.


Of course this was all working but I upgraded -- sigh.

ubuntu 14.04
nxlog-ce 2.7.1191
logstash 2.3.2
elasticsearch 2.3.2

Symptom:

Logstash's stdout output is not emitting JSON:

/opt/logstash/bin/logstash --config /etc/logstash/conf.d/
Settings: Default pipeline workers: 8
Pipeline main started
2016-05-17T20:53:43.798Z 127.0.0.1 %{message}
2016-05-17T20:53:57.498Z 127.0.0.1 %{message}
2016-05-17T20:54:01.043Z 127.0.0.1 %{message}

Here is my config:

input {
  tcp {
    port => 5140
    codec => json_lines { }
  }
}
filter { }
output {
  stdout { }
}

I did a netcat on port 5140 to make sure nxlog was sending JSON:

# nc -l localhost 5140
{"MessageSourceAddress":"192.168.153.59","EventReceivedTime":"2016-05-17 16:08:36","SourceModuleName":"in_tcp","SourceModuleType":"im_tcp","SyslogFacilityValue":4,"SyslogFacility":"AUTH","SyslogSeverityValue":6,"SyslogSeverity":"INFO","SeverityValue":2,"Severity":"INFO","Hostname":"server09","EventTime":"2016-05-17 16:08:36","SourceName":"sshd","ProcessID":"10099","Message":"Received disconnect from 192.168.153.33: 3: com.jcraft.jsch.JSchException: Auth cancel [preauth]"}
{"MessageSourceAddress":"192.168.153.59","EventReceivedTime":"2016-05-17 16:08:36","SourceModuleName":"in_tcp","SourceModuleType":"im_tcp","SyslogFacilityValue":4,"SyslogFacility":"AUTH","SyslogSeverityValue":6,"SyslogSeverity":"INFO","SeverityValue":2,"Severity":"INFO","Hostname":"server09","EventTime":"2016-05-17 16:08:36","SourceName":"sshd","ProcessID":"10098","Message":"Received disconnect from 192.168.153.33: 3: com.jcraft.jsch.JSchException: Auth cancel [preauth]"}

Any pointers? Thanks. -- Bud

I don't know nxlog, but why not use Winlogbeat?

I'd be surprised if your configuration ever emitted JSON. The default codec of the stdout output has AFAIK always been plain. If you want the stdout plugin to serialize the events into JSON you need to use the json_lines codec there too.
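
For example, something along these lines should print one JSON document per line (just a sketch, untested):

output {
  stdout {
    # same codec as the tcp input, so events get serialized back to JSON lines
    codec => json_lines
  }
}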

nxlog is on the Linux box with logstash to handle syslog messages from network devices and other Linux boxes. nxlog provides some buffering in front of logstash and translates the syslog messages to JSON.
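
Roughly, the nxlog side looks like this (a sketch from memory, not my exact config; the in_tcp input name matches the fields in the JSON above, but the other names, the listen port, and the Host values are placeholders):

<Extension syslog>
  Module  xm_syslog
</Extension>

<Extension json>
  Module  xm_json
</Extension>

<Input in_tcp>
  # listen for syslog from the network devices and other Linux boxes
  Module  im_tcp
  Host    0.0.0.0
  Port    514
  Exec    parse_syslog();
</Input>

<Output out_logstash>
  # forward each event to logstash as one JSON object per line
  Module  om_tcp
  Host    127.0.0.1
  Port    5140
  Exec    to_json();
</Output>

<Route syslog_to_logstash>
  Path    in_tcp => out_logstash
</Route>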

I know there is/was another use case for nxlog running on Windows to send event logs as syslog, but that isn't what I am using it for.

Thanks for looking at this though. -- Bud

Yeah, I thought that when I originally set this up JSON would get written to stdout, but I could be mis-remembering the JSON part.

Ultimately, what I am trying to achieve is to get all the fields in my JSON (i.e. from the netcat output above) shuttled into a JSON document in Elasticsearch:

output {
  stdout { }
  elasticsearch {
    hosts => ["localhost"]
    template_overwrite => true
  }
}

That doesn't seem to be happening, hence adding the stdout { } to see what was going on. Seeing %{message}, rather than the actual fields of the message, written to stdout seems wrong...

Argh!

output {
  stdout { codec => rubydebug }
}

produces the output I remember. So the event from my earlier netcat example now comes out as:

{
    "MessageSourceAddress" => "192.168.153.59",
       "EventReceivedTime" => "2016-05-17 16:08:36",
        "SourceModuleName" => "in_tcp",
        "SourceModuleType" => "im_tcp",
     "SyslogFacilityValue" => 4,
          "SyslogFacility" => "AUTH",
     "SyslogSeverityValue" => 6,
          "SyslogSeverity" => "INFO",
           "SeverityValue" => 2,
                "Severity" => "INFO",
                "Hostname" => "server09",
               "EventTime" => "2016-05-17 16:08:36",
              "SourceName" => "sshd",
               "ProcessID" => "10099",
                "@version" => "1",
              "@timestamp" => "2016-05-18T16:05:12.585Z",
                    "host" => "monitor01",
                 "message" => "Received disconnect from 192.168.153.33: 3: com.jcraft.jsch.JSchException: Auth cancel [preauth]"
}

Which seems reasonable... Off to figure out why it isn't getting into Elasticsearch...
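
A quick way I'm going to check whether documents are actually landing is to ask Elasticsearch directly (this assumes the default logstash-* index names, since I don't set index => ... in my output):

# list indices and see whether a logstash-* index exists and has documents
curl 'localhost:9200/_cat/indices?v'

# pull one document to inspect which fields made it in
curl 'localhost:9200/logstash-*/_search?pretty&size=1'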