Can we fill up some variable dynamically using jSON field?

Hi there,

In my system, we have several log types/levels identified by the field "log_type" (e.g. "python_log", "cache_log", ...), all coming into a single log file, "mylog.log".

I am wondering if I can do something like:

- paths:
    - "/var/log/mysystem/*.json.log"
  type: log
  json.keys_under_root: true
  json.add_error_key: true
  json.overwrite_keys: true
  fields_under_root: true
  fields:
    type: "${var_from_my_json_log}_logs"

Is there some way to do that? Any clue?

PS: I am working on a POC to replace our current Heka setup.

@pierhugues, @daved Do you guys have some clue?

So you want to set the type field dynamically based on a field from the JSON object?

With Filebeat alone there isn't a way to accomplish this, because there is no processor for mutating the data (e.g. copying a field value to type and appending _logs to it).

You should be able to do this with an ingest node pipeline in Elasticsearch or with Logstash.
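As a minimal sketch of the ingest node approach: the pipeline below copies the JSON field into `type` and appends `_logs` using the `set` processor with a mustache template. The pipeline name `set-type-from-log` and the source field name `log_type` are assumptions for illustration, not from the original setup.

```json
PUT _ingest/pipeline/set-type-from-log
{
  "description": "Hypothetical example: copy log_type into type and append _logs",
  "processors": [
    {
      "set": {
        "field": "type",
        "value": "{{log_type}}_logs"
      }
    }
  ]
}
```

With this pipeline installed, an event with `"log_type": "python"` would come out with `"type": "python_logs"`.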


Thank you @andrewkroh. So, can we assume that the ES ingest mechanism is faster than Logstash?

I wouldn't assume anything about performance without testing.

Setting up an ingest node pipeline to do this will probably be simpler, since I assume you are already delivering the data to Elasticsearch. You only need to PUT a pipeline and add the pipeline setting to your prospector config.
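A sketch of what the prospector config might look like with a pipeline attached (the pipeline name `set-type-from-log` and the host are assumptions; check your Filebeat version's docs for the exact option name and placement):

```yaml
filebeat.prospectors:
  - paths:
      - "/var/log/mysystem/*.json.log"
    json.keys_under_root: true
    json.add_error_key: true
    # Hypothetical pipeline name; must match the pipeline PUT into Elasticsearch
    pipeline: set-type-from-log

output.elasticsearch:
  hosts: ["localhost:9200"]
```

Alternatively, the pipeline can be set once for all events under `output.elasticsearch.pipeline` instead of per prospector.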


Hi @andrewkroh, thank you for your time! I owe you a beer! :slight_smile:


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.