Filebeats - Multiple and different file types

(Jason) #1

I am trying to figure out how to deal with different types of log files using Filebeat as the forwarder.

Basically I have several different log files I want to monitor, and I want to add an extra field to identify which log each entry came from, as well as a few other little things. This is then forwarded on to Logstash for further processing, which is where each of these fields comes into play.

My problem is that it doesn't seem to play nicely once you add more than one file. Usually the last entry is the one that it uses. The documentation is also confusing about how to achieve this, with document_type and input_type being used interchangeably.


    # List of prospectors to fetch data.
    # Each - is a prospector. Below are the prospector specific configurations
    - paths:
        - "/www/sites/logs/dog.log"
      document_type: log
      fields:
        type: dog
        generator: doglog
        server: myserver
    - paths:
        - "/www/sites/logs/cat.log"
      document_type: log
      fields:
        type: cat
        generator: catlog
        server: myserver

(Mark Walkom) #2

Not sure if you meant to, but you posted this in the Community Announcements section, so I have moved it here for you.

(Steffen Siering) #3

So input_type is the input plugin type. Either use "stdin" (if you want to pipe data into filebeat) or "log" for the log file input plugin.

The document_type is fully customizable. It's the final event['type'] field used when indexing data into elasticsearch.

For example:

- paths:
    - "/www/sites/logs/dog.log"
  document_type: dog
- paths:
    - "/www/sites/logs/cat.log"
  document_type: cat

When indexing right into elasticsearch, all log lines will be written to the same index (filebeat-&lt;date&gt;), but with different types. Based on 'type' you can filter in elasticsearch/kibana.
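For example, filtering on the type field in elasticsearch could look like the query below. This is just a sketch: the index name (with a concrete date substituted) and the type value "dog" are illustrative, not taken from your setup.

    GET /filebeat-2016.01.01/_search
    {
      "query": {
        "term": { "type": "dog" }
      }
    }

In Kibana you can get the same effect by entering type:dog in the search bar.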

The fields configuration given in your example is another solution.

My problem is that it doesn't seem to play nicely once you add more than one file. Usually the last entry is the one that it uses

I don't understand. What's exactly the problem?

In logstash you can filter based on type or your custom fields. When indexing into Elasticsearch your custom fields will also be indexed, so you can use them for filtering your entries. But configuring 'document_type' should be all you need. When setting up logstash according to the getting started guide, the document_type configured in filebeat determines the document_type logstash uses for indexing.
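A minimal logstash filter branching on the type set by filebeat might look like the sketch below. The tag names are placeholders; any grok/mutate logic specific to your logs would go inside the matching branch.

    filter {
      if [type] == "dog" {
        # processing specific to dog.log entries
        mutate { add_tag => ["doglog"] }
      }
      if [type] == "cat" {
        # processing specific to cat.log entries
        mutate { add_tag => ["catlog"] }
      }
    }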

(Jason) #4

Thanks Steffen,

It didn't seem to like having both document_type and a custom type field defined, which caused a conflict. It was a little confusing, but I have it working now, as described.

My initial issue was that it wasn't working, but changing the document_type to match what we filter on in logstash did the trick.

Thanks again mate, greatly appreciated.

(system) #5