Winlogbeat and Kafka output

Hi,

I'm setting up a new dev ELK cluster:

Winlogbeat 5.3
Kafka_2.10-0.10.2.0
Logstash 5.3

Our old one is a pre-Beats cluster, and now we are trying to use the Beats packages as much as possible.

One thing I've run into with the Kafka setup is that the fields aren't populated the way they are when using the beats input in Logstash.

I don't know if this is expected behavior or if I'm doing something wrong :slight_smile:

winlogbeat -> kafka -> logstash

  "message" => "{\n  \"@timestamp\": \"2017-04-12T17:10:25.880Z\",\n  \"beat\": {\n    \"hostname\": \"tnk-minion2012\",\n    \"name\": \"tnk-minion2012\",\n    \"version\": \"5.3.0\"\n  },\n  \"computer_name\": \"tnk-minion2012.example.com\",\n  \"event_data\": {\n    \"AlertDesc\": \"10\",\n    \"ErrorState\": \"1203\"\n  },\n  \"event_id\": 36888,\n  \"level\": \"Error\",\n  \"log_name\": \"System\",\n  \"message\": \"A fatal alert was generated and sent to the remote endpoint. This may result in termination of the connection. The TLS protocol defined fatal error code is 10. The Windows SChannel error state is 1203.\",\n  \"opcode\": \"Info\",\n  \"process_id\": 488,\n  \"provider_guid\": \"{00000000-0000-0000-0000-000000000}\",\n  \"record_number\": \"13917\",\n  \"source_name\": \"Schannel\",\n  \"thread_id\": 1548,\n  \"type\": \"wineventlog\",\n  \"user\": {\n    \"domain\": \"NT AUTHORITY\",\n    \"identifier\": \"S-1-5-18\",\n    \"name\": \"SYSTEM\",\n    \"type\": \"User\"\n  }\n}"
}

winlogbeat -> logstash

{
    "computer_name" => "tnk-minion2012.example.com",
         "keywords" => [
        [0] "Classic"
    ],
            "level" => "Warning",
         "log_name" => "System",
    "record_number" => "12181",
       "event_data" => {
        "Binary" => "04000400000000000000000000000000000000000200C0"
    },
          "message" => "{Delayed Write Failed} Windows was unable to save all the data for the file . The data has been lost. This error may be caused by a failure of your computer hardware or network connection. Please try to save this file elsewhere.",
             "type" => "wineventlog",
             "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
       "@timestamp" => 2017-01-08T22:05:50.804Z,
         "event_id" => 50,
         "@version" => "1",
             "beat" => {
        "hostname" => "tnk-minion2012",
            "name" => "tnk-minion2012",
         "version" => "5.3.0"
    },
             "host" => "tnk-minion2012",
      "source_name" => "Ntfs"
}

Winlogbeat Kafka output:

output.kafka:
  hosts: ["10.10.10.10:9092", "10.10.10.11:9092"]
  topic: '%{[type]}'
  partition.round_robin:
    reachable_only: false
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000
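
On the Logstash side the Kafka input is set up along these lines (just a sketch of my current input; the topic name is simply what '%{[type]}' resolves to for these events, and the servers match the Winlogbeat config above; no codec is configured yet):

input {
  kafka {
    bootstrap_servers => "10.10.10.10:9092,10.10.10.11:9092"
    topics            => ["wineventlog"]
    # no codec is set, so the default "plain" codec is used and the whole
    # Beats event ends up as an escaped JSON string in the "message" field
  }
}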

--
Regards Falk

Setting codec => "json" on your kafka {} input should help. The data from Beats is sent to Kafka as a JSON object.
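
For example, something along these lines (servers and topic taken from your Winlogbeat config above, adjust to your setup):

input {
  kafka {
    bootstrap_servers => "10.10.10.10:9092,10.10.10.11:9092"
    topics            => ["wineventlog"]
    codec             => "json"   # parse the Beats event back into individual fields
  }
}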

Hi,

Thanks a lot for the fast answer!
First thing this morning I tried it out, and it worked "like a glove" :slight_smile:

Thanks

--
Regards Falk
