Hi,
I want to harvest the Nginx access log using Filebeat. Instead of outputting to Logstash / Elasticsearch, I just want to send the log content to a Kafka topic. The content looks like this:

```json
{"foo":"xxx","bar":"yyy"}
```
Then I configured the `filebeat.yml` input:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /data/log/nginx/access/overwrite_test/change_root_field_input.log
  json.keys_under_root: true
  json.overwrite_keys: true
  json.add_error_key: true
  fields:
    otype: change_root_field_input
    stattype: json_log
```
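For completeness, the output section is along these lines (the broker address and topic name below are placeholders, not my real values):

```yaml
output.kafka:
  # Placeholder broker address.
  hosts: ["kafka-broker:9092"]
  # Placeholder topic name.
  topic: "nginx_access"
```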
However, when I fetch a log line and feed it to the console / Kafka, the output looks like this:
```json
{
  "@timestamp": "2019-08-23T07:24:53.905Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.3.0"
  },
  "bar": "yyy",
  "fields": {
    "stattype": "json_log"
  },
  "input": {
    "type": "log"
  },
  "agent": {
    "ephemeral_id": "95a82ced-972a-464b-b985-f3cda3c92681",
    "hostname": "server_1",
    "id": "1f1321f7-098d-4da9-a2df-62e21afb7911",
    "version": "7.3.0",
    "type": "filebeat"
  },
  "ecs": {
    "version": "1.0.1"
  },
  "host": {
    "architecture": "x86_64",
    "os": {
      "platform": "centos",
      "version": "6.10 (Final)",
      "family": "redhat",
      "name": "CentOS",
      "kernel": "2.6.32-696.10.1.el6.x86_64",
      "codename": "Final"
    },
    "containerized": false,
    "name": "server_1",
    "hostname": "server_1"
  },
  "foo": "xxx",
  "log": {
    "offset": 645,
    "file": {
      "path": "/data/log/nginx/access/my_custom.log"
    }
  }
}
```
I just want fields like `foo` and `bar` from the Nginx log, plus my custom field `stattype`. Unfortunately, I get so many fields I don't care about and will never use, especially since the output isn't going to Elasticsearch or Logstash anyway.
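In other words, ideally the message sent to Kafka would contain only something like:

```json
{"foo":"xxx","bar":"yyy","fields":{"stattype":"json_log"}}
```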
So, can I output only my custom fields, without the default fields? I have tried "Drop fields from events", but it seems it only drops the `log` fields, and I have no idea how to remove the others. My attempt is sketched below.
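For reference, here is roughly what I tried; the exact field list is just my guess at what needs dropping, and as far as I can tell `@timestamp` cannot be removed by `drop_fields` at all:

```yaml
processors:
  - drop_fields:
      # Beats metadata I don't want in the Kafka message.
      # Note: drop_fields cannot remove @timestamp, and @metadata
      # appears to be consumed by the output rather than the event body.
      fields: ["agent", "ecs", "host", "input", "log"]
```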
Thanks