How to configure different types of logs


I want to configure two different logs in Filebeat. I have system logs and JSON logs from my application. Can I configure both of these in a single config file? If yes, what would the required configuration be?
In filebeat.yml I have the configuration below:

    filebeat:
      prospectors:
        # Each - is a prospector. Below are the prospector specific configurations
        -
          input_type: log
          # Paths that should be crawled and fetched. Glob based paths.
          # For each file found under this path, a harvester is started.
          paths:
            - /home/Downloads/logs/log5.log
          # Type to be published in the 'type' field. For Elasticsearch output,
          # the type defines the document type these entries should be stored
          # in. Default: log
          document_type: applog
          json:
            message_key: log
            keys_under_root: true
            overwrite_keys: true
        -
          input_type: log
          paths:
            - /home/Downloads/logs_sys/log1.log
          document_type: systemlog
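
For context, the `json` options above assume each line of log5.log is a single JSON object whose `log` field carries the message text (that is what `message_key: log` points at). A hypothetical sample line might look like:

    {"log": "user login succeeded", "level": "info", "service": "auth"}

The field names here are only illustrative, not taken from my actual application logs.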

(ruflin) #2

Yes, you can configure both in one config file, and your config looks ok. About the mapping: that depends on your JSON. I assume the reason you posted is that something is not working as expected?


Yes. In the app logs I have nested JSON objects, while in the sys logs the message is a plain string. Is there a way to specify a mapping where `message` can accept both a string and a JSON object?

I am getting the error below in the Filebeat logs:

    WARN Can not index event (status=400): {"type":"mapper_parsing_exception","reason":"object mapping for [message] tried to parse field [message] as object, but found a concrete value"}

Also, for JSON logs do I need to use Filebeat 5.0.0-alpha3 only, or can I also use Filebeat 1.2?

(ruflin) #4

For the mapping: Elasticsearch can only have one mapping type for a single field. What you can do in your case is use Logstash to route the data to two different indices.
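
A minimal sketch of that routing, assuming the `document_type` values `applog` and `systemlog` from your Filebeat config arrive in the `[type]` field (the port, hosts, and index names below are placeholders, not something from your setup):

    input {
      beats {
        port => 5044
      }
    }

    output {
      if [type] == "applog" {
        # JSON application events get their own index, so the object-valued
        # fields never collide with the plain-string system log mapping.
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "applog-%{+YYYY.MM.dd}"
        }
      } else {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "systemlog-%{+YYYY.MM.dd}"
        }
      }
    }

Because each index gets its own mapping, `message` can be an object in one and a string in the other without triggering the mapper_parsing_exception.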

JSON support is only available in the 5.0 releases.

(system) #5

This topic was automatically closed after 21 days. New replies are no longer allowed.