Transferring the source file path to Logstash


I'm not new to the ELK stack, but I haven't used all of its functionality.
I have a question for the community about the Filebeat v1.3 configuration.

I need to pass the source file location (path) as a field to Logstash for further processing.
I have read about the "fields" option (Filebeat v1.3 Field_option), but I can't figure out how to do this.
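For context, the "fields" option in Filebeat 1.x only attaches static key/value pairs that you define per prospector; it does not capture the file path dynamically. A minimal sketch (the path and field name here are hypothetical examples, not from my setup):

    filebeat:
      prospectors:
        -
          paths:
            - "/var/log/app/*.log"
          fields:
            report_kind: "daily"   # static custom field, same value for every event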

Here is the configuration of Filebeat:

    filebeat:
      prospectors:
        -
          paths:
            - /var/www/dbreports/DAILY/
          input_type: log
          fields_under_root: true
          ignore_older: 24h
          document_type: daily_reports
          tail_files: false
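On the Logstash side, once the event arrives with the Filebeat-provided "source" field intact, the path can be used for further processing. A minimal filter sketch (the target field name "report_path" is my own choice for illustration):

    filter {
      # assumes the built-in "source" field from Filebeat reached this stage
      mutate {
        add_field => { "report_path" => "%{source}" }
      }
    }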

Many thanks for any help, and +100 karma :)

I found out that Filebeat passes a "source" field, but I'm not able to get it in Logstash or Elasticsearch.

I have the following architecture:

Filebeat -> Logstash (shipper) -> RabbitMQ -> Logstash (processor) -> Elasticsearch

What do you mean by "you don't get the source in Logstash"? Is your Logstash shipper filtering/removing some event fields?

I mean that in Elasticsearch the source field does not contain the path of the source file.
I did not apply any parsing; I just send the messages straight through the Filebeat -> Logstash (shipper) -> RabbitMQ -> Logstash (processor) -> Elasticsearch pipeline.

We store the file path in the source field. Perhaps the fields_under_root option is causing that field to be overwritten? Just a guess..
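To illustrate the guess: with fields_under_root enabled, custom fields are promoted to the top level of the event, so a custom field that happened to share a name with a built-in one could clobber it. A hypothetical example (not the poster's actual config):

    fields:
      source: "my-custom-value"   # with fields_under_root: true this would
    fields_under_root: true       # overwrite Filebeat's built-in "source" field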

Thanks for the advice.
I have tried setting it to "false", but I got the same result.

Did you try to debug in Logstash with the stdout output? Can you share your full Filebeat config? Can you share your Logstash config? It's the first time I've heard of the source field being removed... which is normally only the case with filters.
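For debugging, a stdout output with the rubydebug codec prints every event with all of its fields, which makes it easy to see whether "source" survives each stage. A minimal sketch to drop into either Logstash instance:

    output {
      stdout { codec => rubydebug }   # dump full events, including the "source" field
    }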

Smart people at the ElasticON conference showed me the problem and a fix.
The root cause is that I was sending plain text to the message queue, but I should use JSON.
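In other words, with a plain-text codec only the message line survives the queue, and the structured fields (including "source") are lost. A sketch of the fix, using the json codec on both sides of RabbitMQ (host, exchange, and queue names here are placeholder assumptions):

    # Logstash shipper: publish full events to RabbitMQ as JSON
    output {
      rabbitmq {
        host          => "rabbitmq.example.com"   # assumed broker address
        exchange      => "logstash"               # assumed exchange name
        exchange_type => "direct"
        codec         => "json"                   # keep all event fields, including "source"
      }
    }

    # Logstash processor: consume and decode the JSON events
    input {
      rabbitmq {
        host  => "rabbitmq.example.com"
        queue => "logstash"
        codec => "json"
      }
    }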

Thank you, smart guys.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.