Message Field missing

Hello, I've been using Logstash to parse my IIS logs for some time now. I decided to start using Filebeat to send logs directly to Logstash to make things easier. My problem is that the "message" field appears to be missing by the time the data from Filebeat reaches Logstash.

Here's my setup:
Filebeat 5.2.2 on Windows 10
Logstash 2.4.0 on Ubuntu

I created a test log with single lines of random text. Here's my Filebeat config:
filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - c:\admin\logs\ELKTest\Test\*.log
    encoding: utf8
output.logstash:
  # The Logstash hosts
  hosts: ["LOGSTASH:5044"]

Here's my stripped-down Logstash config:

input {
  beats {
    port => 5044
  }
}
output {
  file {
    path => "/var/log/logstash/beats-%{+YYYY-MM-dd}.txt"
    codec => "rubydebug"
  }
}

Here's what happens when I run Filebeat in full debug mode (filebeat.exe -e -d "publish") and add a test line to my log file:

2017/03/13 21:26:45.445606 sync.go:68: DBG  Events sent: 4
2017/03/13 21:27:00.447352 client.go:184: DBG  Publish: {
  "@timestamp": "2017-03-13T21:27:00.319Z",
  "beat": {
    "hostname": "TEST-LAPTOP",
    "name": "TEST-LAPTOP",
    "version": "5.2.2"
  },
  "input_type": "log",
  "message": "ADDING A NEW LINE",
  "offset": 56,
  "source": "c:\\admin\\logs\\ELKTest\\Test\\wow.log",
  "type": "log"
}
2017/03/13 21:27:00.447352 output.go:109: DBG  output worker: publish 1 events
2017/03/13 21:27:22.453357 sync.go:68: DBG  Events sent: 1

And here's what I see on the Logstash server (reading the output file):

{
      "@version" => "1",
    "@timestamp" => "2017-03-13T21:27:00.319Z",
          "type" => "log",
    "input_type" => "log",
          "beat" => {
         "version" => "5.2.2",
            "name" => "TEST-LAPTOP",
        "hostname" => "TEST-LAPTOP"
    },
        "source" => "c:\\admin\\logs\\ELKTest\\Test\\wow.log",
        "offset" => 56,
          "host" => "TEST-LAPTOP",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ]
}

As you can see, the Logstash server does not have the "message" field.

What am I missing here? I'm hoping it's a config issue on the Filebeat side.

Thanks

Unless configured to do so, Filebeat does not remove fields from events before publishing them. Have you tried with LS 5.2?

@steffens, thanks for the feedback. I went ahead and upgraded Logstash to 5.2.2 and I still see the same results. This is my first time working with Filebeat; is there any reason why the message field would not be passed to Logstash? I even tried sending to Redis and saw the same results.

Filebeat

2017/03/14 18:01:00.248690 client.go:184: DBG  Publish: {
  "@timestamp": "2017-03-14T18:01:00.246Z",
  "beat": {
    "hostname": "TEST-LAPTOP",
    "name": "TEST-LAPTOP",
    "version": "5.2.2"
  },
  "input_type": "log",
  "message": "This is a test IP 10.0.1.20 443",
  "offset": 45,
  "source": "c:\\admin\\logs\\ELKTest\\Test\\newtestlog.log",
  "type": "log"
}
2017/03/14 18:01:00.248690 output.go:109: DBG  output worker: publish 1 events
2017/03/14 18:01:00.360773 sync.go:68: DBG  Events sent: 1
2017/03/14 18:01:15.207811 logp.go:230: INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.write_bytes=248 libbeat.publisher.published_events=1 registrar.writes=1 libbeat.logstash.published_and_acked_events=1 publish.events=1 libbeat.logstash.call_count.PublishEvents=1 registrar.states.update=1 libbeat.logstash.publish.read_bytes=6

Logstash config file

input {
  beats {
    port => 5044
  }
}

output {
  file {
    path => "/var/log/logstash/beats-%{+YYYY-MM-dd}.txt"
    codec => "rubydebug"
  }
}

Output file

{
    "@timestamp" => 2017-03-14T18:01:00.246Z,
        "offset" => 45,
      "@version" => "1",
          "beat" => {
        "hostname" => "TEST-LAPTOP",
            "name" => "TEST-LAPTOP",
         "version" => "5.2.2"
    },
    "input_type" => "log",
          "host" => "TEST-LAPTOP",
        "source" => "c:\\admin\\logs\\ELKTest\\Test\\newtestlog.log",
          "type" => "log",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ]
}

If I try adding any grok filters to the Logstash config, the events get tagged with _grokparsefailure.
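That _grokparsefailure makes sense in hindsight: with no "message" field on the event, a grok pattern matching against message has nothing to match. A minimal sketch of the kind of filter I mean (the pattern is just hypothetical, written for the test line above):

```
filter {
  grok {
    # Hypothetical pattern for the sample line "This is a test IP 10.0.1.20 443".
    # With no "message" field on the event, this match fails and the event
    # is tagged with _grokparsefailure.
    match => { "message" => "This is a test IP %{IP:client_ip} %{NUMBER:port}" }
  }
}
```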

Finally figured it out. I knew it was something I was missing. For some reason I thought Logstash only loaded *.conf files, so I had some inactive test files named 10-test.conf.test sitting in the config directory. It turns out that file was being loaded too, and since its name starts with 10-, it was loaded first. It had a large filter section with a remove_field clause at the bottom, and I had placed that remove_field outside of the type match, so it ran on every event.
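For anyone hitting the same thing: Logstash concatenates every file in the config directory, in lexical order and regardless of extension, into one pipeline. So a leftover file like my 10-test.conf.test can silently strip fields from every event. The sketch below is a reconstruction of the shape of that file, not the exact contents:

```
filter {
  if [type] == "iis" {
    # ... IIS grok filters ...
  }
  # This mutate sits outside the type conditional, so it runs on every
  # event -- including the Filebeat test events -- and removes "message".
  mutate {
    remove_field => [ "message" ]
  }
}
```

Note that bin/logstash --config.test_and_exit only validates syntax, so it won't catch this; checking which files actually live in the config directory is the quicker sanity check.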

All good now

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.