Can we fill a variable dynamically using a JSON field?


(Jorge Pereira) #1

Hi there,

In my system, we have several log types/levels based on the field "log_type", e.g. "python_log", "cache_log", ..., all written to a single log file, "mylog.log".

I would like to do something like:

filebeat.prospectors:
- paths:
  - "/var/log/mysystem/*.json.log"
  type: log
  json.keys_under_root: true
  json.add_error_key: true
  json.overwrite_keys: true
  fields_under_root: true
  fields:
    type: "${var_from_my_json_log}_logs"

Is there some way to do that? Any clue?

PS: I am working on a POC to replace our current Heka setup.


(Jorge Pereira) #2

@pierhugues, @daved Do you guys have any clue?


(Andrew Kroh) #3

So you want to set the type field dynamically based on a field from the JSON object?

With Filebeat alone there isn't a way to accomplish this, because there is no processor for mutating the data (e.g. copying a field value to type and appending _logs to it).

You should be able to do this with an ingest node pipeline in Elasticsearch or with Logstash.
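For reference, an ingest node pipeline along those lines could look something like the sketch below. The pipeline name mylog-pipeline is an assumption, as is the source field name log_type (taken from the original post); the set processor supports mustache-style templating to reference a field from the incoming document:

```json
PUT _ingest/pipeline/mylog-pipeline
{
  "description": "Copy log_type into type and append _logs (field names assumed)",
  "processors": [
    {
      "set": {
        "field": "type",
        "value": "{{log_type}}_logs"
      }
    }
  ]
}
```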


(Jorge Pereira) #4

Thank you @andrewkroh, so can we assume that the ES-ingest mechanism is faster than Logstash?


(Andrew Kroh) #5

I wouldn't assume anything about performance without testing.

Setting up an ingest node pipeline to do this will probably be simpler, since I assume you are already delivering the data to ES. You only need to PUT a pipeline and add the pipeline to your prospector config.
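On the Filebeat side, pointing events at the pipeline could look roughly like this (a sketch; the pipeline name mylog-pipeline and the host are assumptions, and in Filebeat 5.x the pipeline is set on the Elasticsearch output):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]  # host is an assumption
  # Route all events through the ingest pipeline defined in ES
  pipeline: "mylog-pipeline"
```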


(Jorge Pereira) #6

Hi @andrewkroh, thank you for your time! I owe you a beer! :slight_smile:


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.