Greetings,
I'm trying to use filebeat to ingest a log file full of JSON objects. I've gotten it working: the data is ingested and I can see it in Discover in Kibana, almost correctly. My problem is that the fields don't end up with the right data types.
For example, some of the JSON fields are IP addresses. When filebeat parses the JSON record (using - type: log with json.keys_under_root: true in filebeat.yml), they end up indexed as strings.
I have a JSON template file that I used to create an index template via the Elasticsearch API (PUT /_template/elk-dev), but filebeat seems to ignore it.
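For context, a minimal template along these lines (the elk-dev-* index pattern and the doc mapping-type level are illustrative, not my exact file; on Elasticsearch 7+ the type level goes away) would be:

```json
{
  "index_patterns": ["elk-dev-*"],
  "mappings": {
    "doc": {
      "properties": {
        "source-ip":      { "type": "ip" },
        "destination-ip": { "type": "ip" }
      }
    }
  }
}
```

A template only applies when a matching index is created, so an index that already exists keeps its old string mappings even after the PUT.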
Does anyone have any insight into this?
Thank you in advance!
Edit: more info
Relevant bits of my filebeat.yml:
filebeat.prospectors:
- type: log
  paths:
    - /var/log/elk-dev/*
  json.keys_under_root: false
  json.add_error_key: true
  document_type: json
  fields:
    type: custom_json
  codec: json
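One thing I noticed while writing this up: with json.keys_under_root: false, Filebeat nests the decoded object under a top-level "json" key, so the template would have to map json.destination-ip rather than destination-ip. A rough Python sketch of the two event shapes (a simplified illustration, not Filebeat's actual code):

```python
import json

# Simplified illustration of how Filebeat structures the event
# depending on the json.keys_under_root setting.
line = '{"source-ip": "10.0.1.6", "destination-ip": "8.8.8.8"}'
decoded = json.loads(line)

# json.keys_under_root: false -> decoded keys nest under "json",
# so the mapping must target "json.destination-ip".
event_nested = {"json": decoded}

# json.keys_under_root: true -> keys land at the top level of the event.
event_root = dict(decoded)

print(event_nested["json"]["destination-ip"])
print(event_root["destination-ip"])
```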
Relevant bits of my fields.yml:
- key: log
  title: Log file content
  description: >
    Contains log file lines.
  fields:
    - name: log.source
      type: keyword
      required: true
      description: >
        The file from which the line was read. This field contains the absolute path to the file.
        For example: `/var/log/system.log`.
    - name: log.offset
      type: long
      required: false
      description: >
        The file offset the reported line starts at.
    - name: log.message
      type: text
      ignore_above: 0
      required: true
      description: >
        The content of the line read from the log file.
      fields:
        - name: destination-ip
          type: ip
          required: false
          description: >
            IP address
    - name: destination-ip
      type: ip
      required: false
      description: >
        IP address
(the destination-ip field appears at two levels because I didn't know which one would work, so I tested both at the same time. I would have deleted one or the other if either had worked, but neither did)
Here's a log line example:
{"event-second": 1321132924, "event-microsecond": 383996, "signature-id": 1, "priority": 1, "sport-itype": 52378, "dport-icode": 80, "protocol": 6, "vlan-id": 0, "source-ip": "10.0.1.6", "destination-ip": "8.8.8.8", "length": 382}
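To double-check that the data itself is fine, I verified the line parses cleanly and that the IP fields hold valid addresses; JSON has no IP type, so they arrive as plain strings, and only the index mapping can make Elasticsearch treat them as IPs (a quick standalone check, nothing filebeat-specific):

```python
import json
import ipaddress

line = ('{"event-second": 1321132924, "event-microsecond": 383996, '
        '"signature-id": 1, "priority": 1, "sport-itype": 52378, '
        '"dport-icode": 80, "protocol": 6, "vlan-id": 0, '
        '"source-ip": "10.0.1.6", "destination-ip": "8.8.8.8", "length": 382}')
record = json.loads(line)

# These are strings in the JSON; ip_address() raises ValueError if malformed.
for field in ("source-ip", "destination-ip"):
    ipaddress.ip_address(record[field])

print(type(record["destination-ip"]).__name__)
```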