Filebeat input in Logstash is losing fields

(Eni Sinanaj) #1

I have the following infrastructure:

The ELK stack runs as Docker containers, each component in its own container. On a virtual machine running CentOS I installed the nginx web server and Filebeat to collect its logs.
I enabled the nginx module in filebeat.

> filebeat modules enable nginx

Before starting Filebeat I set it up with Elasticsearch and installed its dashboards in Kibana.

config file (I have removed unnecessary comments from the file):

    filebeat.config.modules:
      path: ${path.config}/modules.d/*.yml
      reload.enabled: false
    
    setup.kibana:
      host: "172.17.0.1:5601"
    
    output.elasticsearch:
      hosts: ["172.17.0.1:9200"]

Then, to set it up in Elasticsearch and Kibana:

> filebeat setup -e --dashboards

This works fine. In fact, if I keep it this way everything works perfectly: I can use the collected logs in Kibana and use the Nginx dashboards I installed with the above command.

However, I want to pass the logs through Logstash. My Logstash configuration uses the following pipeline:

    - pipeline.id: filebeat
      path.config: "config/filebeat.conf"

filebeat.conf:

    input {
      beats {
        port => 5044
      }
    }

    #filter {
    #  mutate {
    #    add_tag => ["filebeat"]
    #  }
    #}

    output {
      elasticsearch {
        hosts => ["elasticsearch0:9200"]
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      }

      stdout { }
    }
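(Editor's note, not stated in this thread: the nginx module's parsing is done by an Elasticsearch ingest pipeline, and Logstash's elasticsearch output does not invoke any ingest pipeline unless told to. A sketch of the output block with the `pipeline` option set; `[@metadata][pipeline]` is only populated by some Filebeat versions, and the pipeline ID pattern `filebeat-<version>-<module>-<fileset>-<name>` is an assumption to verify against your cluster.)

```
output {
  elasticsearch {
    hosts => ["elasticsearch0:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # Ask Elasticsearch to run the module's ingest pipeline on each event.
    # If [@metadata][pipeline] is empty on your Beats version, hard-code the
    # pipeline ID instead, e.g. "filebeat-6.5.4-nginx-access-default".
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```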

When the logs go through Logstash, the resulting event is just:

{
        "offset" => 6655,
      "@version" => "1",
    "@timestamp" => 2019-02-20T13:34:06.886Z,
       "message" => "10.0.2.2 - - [20/Feb/2019:08:33:58 -0500] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/71.0.3578.98 Chrome/71.0.3578.98 Safari/537.36\" \"-\"",
          "beat" => {
         "version" => "6.5.4",
            "name" => "localhost.localdomain",
        "hostname" => "localhost.localdomain"
    },
        "source" => "/var/log/nginx/access.log",
          "host" => {
                   "os" => {
             "version" => "7 (Core)",
            "codename" => "Core",
              "family" => "redhat",
            "platform" => "centos"
        },
                 "name" => "localhost.localdomain",
                   "id" => "18e7cb2506624fb6ae2dc3891d5d7172",
        "containerized" => true,
         "architecture" => "x86_64"
    },
       "fileset" => {
          "name" => "access",
        "module" => "nginx"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
         "input" => {
        "type" => "log"
    },
    "prospector" => {
        "type" => "log"
    }
}

A lot of fields are missing from my object. There should be much more structured information, like this:

{
  "_index": "filebeat-6.5.4-2019.02.20",
  "_type": "doc",
  "_id": "ssJPC2kBLsya0HU-3uwW",
  "_version": 1,
  "_score": null,
  "_source": {
    "offset": 9639,
    "nginx": {
      "access": {
        "referrer": "-",
        "response_code": "404",
        "remote_ip": "10.0.2.2",
        "method": "GET",
        "user_name": "-",
        "http_version": "1.1",
        "body_sent": {
          "bytes": "3650"
        },
        "remote_ip_list": [
          "10.0.2.2"
        ],
        "url": "/access",
        "user_agent": {
          "patch": "3578",
          "original": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/71.0.3578.98 Chrome/71.0.3578.98 Safari/537.36",
          "major": "71",
          "minor": "0",
          "os": "Ubuntu",
          "name": "Chromium",
          "os_name": "Ubuntu",
          "device": "Other"
        }
      }
    },
    "prospector": {
      "type": "log"
    },
    "read_timestamp": "2019-02-20T14:29:36.393Z",
    "source": "/var/log/nginx/access.log",
    "fileset": {
      "module": "nginx",
      "name": "access"
    },
    "input": {
      "type": "log"
    },
    "@timestamp": "2019-02-20T14:29:32.000Z",
    "host": {
      "os": {
        "codename": "Core",
        "family": "redhat",
        "version": "7 (Core)",
        "platform": "centos"
      },
      "containerized": true,
      "name": "localhost.localdomain",
      "id": "18e7cb2506624fb6ae2dc3891d5d7172",
      "architecture": "x86_64"
    },
    "beat": {
      "hostname": "localhost.localdomain",
      "name": "localhost.localdomain",
      "version": "6.5.4"
    }
  },
  "fields": {
    "@timestamp": [
      "2019-02-20T14:29:32.000Z"
    ]
  },
  "sort": [
    1550672972000
  ]
}
(Chris) #2

Have you loaded the ingest pipelines manually as described here:

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-modules-quickstart.html#load-ingest-pipelines

(Eni Sinanaj) #3

Nope, that I haven't done. I'm using version 6.5.4 of ELK (and Beats).

I had installed the same solution on my local machine and everything worked as expected. That is what doesn't add up for me...

(Eni Sinanaj) #4

I mean, it is strange that Logstash is sending Elasticsearch a stripped-out version of what Filebeat is shipping, because Filebeat sends everything well structured to Elasticsearch when that is set as the output.
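(Editor's note: the parsing that produces `nginx.access.*` happens in an Elasticsearch ingest pipeline, not inside Filebeat, so when Logstash writes directly to Elasticsearch without referencing that pipeline, the `message` stays unparsed. The real pipeline is a grok processor loaded by `filebeat setup`; the sketch below is only an illustration of the parsing step that gets skipped, using the sample log line from the first post.)

```python
import re

# Rough stand-in for the grok pattern the nginx access ingest pipeline
# applies; NOT the actual pipeline, just the shape of the skipped step.
COMBINED = re.compile(
    r'(?P<remote_ip>\S+) - (?P<user_name>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) HTTP/(?P<http_version>\S+)" '
    r'(?P<response_code>\d{3}) (?P<body_sent_bytes>\d+)'
)

message = ('10.0.2.2 - - [20/Feb/2019:08:33:58 -0500] "GET / HTTP/1.1" 304 0 '
           '"-" "Mozilla/5.0 (X11; Linux x86_64) ..." "-"')

fields = COMBINED.match(message).groupdict()
print(fields['remote_ip'], fields['method'], fields['response_code'])
# Without this step running anywhere, the event keeps only the raw message.
```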

(Chris) #5

To do this, Filebeat needs to have an Elasticsearch output in its configuration. You will have to:

  1. Stop Filebeat.
  2. Comment out the Logstash output, then uncomment and configure the Elasticsearch output.
  3. Load the ingest pipeline as mentioned in the link.
  4. Restore the Logstash output in the config file and comment out the Elasticsearch output. Start Filebeat again.

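(Editor's note: steps 2 and 4 amount to toggling the output section of filebeat.yml. A sketch, reusing the hosts from earlier in the thread; the Logstash host/port is an assumption, as the thread only shows Logstash listening on 5044, and the exact `filebeat setup` flags should be checked against the linked quickstart for your version.)

```yaml
# filebeat.yml during step 2: point Filebeat at Elasticsearch temporarily
output.elasticsearch:
  hosts: ["172.17.0.1:9200"]
#output.logstash:
#  hosts: ["172.17.0.1:5044"]   # assumed Logstash endpoint

# Step 3: load the ingest pipelines (flags per the linked quickstart):
#   filebeat setup --pipelines --modules nginx

# Step 4: swap the comments back (Logstash output active, Elasticsearch
# output commented out) and restart Filebeat.
```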
(Eni Sinanaj) #6

Nice, I'll try this tomorrow. Does this work the same way with Auditbeat as well? Actually, I have another issue with Auditbeat: it's sending all possible fields as expected, but the dashboards it installed don't find the fields they need. Strange.

(system) closed #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.