Filebeat sends JSON to Kafka instead of raw content of log file

Hi, I have installed Filebeat but I have a problem.
I need to send only the raw content of the log file to Kafka/Logstash, without the additional information from Beats. How can I do that?

My Filebeat config file:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/nginx/api.access.log
  document_type: api_event_beats

output.kafka:
  # initial brokers for reading cluster metadata
  hosts: ["ny-cluster1:9092", "ny-cluster3:9092", "ny-cluster5:9092", "ny-cluster7:9092", "ny-cluster9:9092"]
  # message topic selection + partitioning
  topic: 'api_events_beats'
  partition.round_robin:
    reachable_only: false
  required_acks: 1
  compression: 'none'
  max_message_bytes: 1000000
  worker: 8

When I consume the topic I see this, but I don't need the JSON format:

{"@timestamp":"2016-12-14T07:53:41.393Z","beat":{"hostname":"ny-front4.test.loc","name":"ny-front4.test.loc","version":"5.1.1"},"input_type":"log","message":"ny-front4.test.loc::200::2739::0.012::\\u0026type=settings\u0026market=google::application/json; charset=utf-8","offset":2488613629,"source":"/var/log/nginx/api.access.log","type":"api_event_beats"}

What I need to receive is this:

nginx: ny-front1.test.loc::200::2680::0.000::; charset=utf-8

This is currently not supported. There is an open PR introducing this feature.
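Until something like that is merged, one workaround is to unwrap the event on the consumer side: Filebeat always puts the raw log line in the `message` field of the JSON envelope, so the consumer can extract just that field. A minimal sketch (the `raw_message` helper and the trimmed sample record are illustrative, not part of Filebeat or any Kafka client):

```python
import json

def raw_message(record_bytes):
    """Return the original log line from a Filebeat JSON event.

    Filebeat wraps each line in a JSON envelope; the raw line is
    always carried in the "message" field.
    """
    event = json.loads(record_bytes)
    return event["message"]

# Abbreviated example of a Filebeat 5.x event as it arrives from the topic
sample = (b'{"@timestamp":"2016-12-14T07:53:41.393Z","input_type":"log",'
          b'"message":"ny-front4.test.loc::200::2739::0.012",'
          b'"type":"api_event_beats"}')

print(raw_message(sample))  # ny-front4.test.loc::200::2739::0.012
```

In a real consumer you would call `raw_message()` on each record value before forwarding it on; the Beats metadata is simply discarded.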

Steffen, maybe you know when this feature will be implemented?

I have no idea if or when this feature will be merged. The original author hasn't responded to the code review yet.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.