Hi,
my goal is to load Tshark logs into Elasticsearch using Filebeat. Here is my test file (the columns are tab-separated, and for some lines the port column is empty):
2024-11-21 19:29:28 192.168.1.144 142.250.186.196 52873\443
2024-11-21 19:29:28 142.250.186.196 192.168.1.144 443\52873
2024-11-21 19:29:28 192.168.1.185 224.0.0.7
2024-11-21 19:29:28 192.168.1.140 239.255.255.250
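As a side note on the file format: the fourth column packs both ports into one backslash-separated string, which is why I map it as text and not as a number. A quick Python check (my own illustration, not part of the pipeline):

```python
# The fourth column packs both ports into one string, e.g. "52873\443".
# Splitting on the backslash recovers the source and destination port;
# this is why my template maps "port" as text rather than a numeric type.
port_field = r"52873\443"
src_port, dst_port = port_field.split("\\")
print(src_port, dst_port)  # prints: 52873 443
```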
My Filebeat config:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - "G:/xx/test.csv"

processors:
  - decode_csv_fields:
      fields:
        message: "message"
      separator: "\t"
      ignore_missing: false
      overwrite_keys: true
      trim_leading_space: true
      fail_on_error: true
  - extract_array:
      field: message
      mappings:
        created_at: 0
        ip_src: 1
        ip_dst: 2
        port: 3
  - drop_fields:
      fields: ["message"]
  - drop_event:
      when:
        equals:
          ip_src: ""
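To show what I expect the processor chain to produce, here is a plain-Python approximation of the decode_csv_fields + extract_array steps (my own sketch, not Filebeat code):

```python
# Approximate the processor chain: split the message on tabs
# (decode_csv_fields with separator "\t"), then map array indexes
# to field names (extract_array). Rows without a port column
# simply produce no "port" field.
MAPPINGS = {"created_at": 0, "ip_src": 1, "ip_dst": 2, "port": 3}

def build_event(message):
    cols = message.split("\t")
    return {field: cols[i] for field, i in MAPPINGS.items() if i < len(cols)}

full = build_event("2024-11-21 19:29:28\t192.168.1.144\t142.250.186.196\t52873\\443")
short = build_event("2024-11-21 19:29:28\t192.168.1.185\t224.0.0.7")
print(full["port"])      # prints: 52873\443
print("port" in short)   # prints: False
```

So in my understanding a short row should still yield created_at, ip_src and ip_dst.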
output.elasticsearch:
  hosts: ["http://localhost:9200"]
  username: "elastic"
  password: "elastic"
  index: "packets-%{+yyyy.MM.dd}"
  scan_frequency: 5s
  data_stream.enabled: true

setup.template.enabled: true
setup.template.name: "wireshark_template"
setup.template.pattern: "wireshark_template*"
setup.template.overwrite: true
logging:
  level: info # (debug, info, warn, error)
  to_files: true
  files:
    path: "G:/xx/logtxt"
    name: filebeat
    keepfiles: 7 # number of log files to keep
And my index template:
PUT _index_template/wireshark_template
{
  "index_patterns": ["packets-*"],
  "data_stream": {},
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "packet_date": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss"
        },
        "ip_src": {
          "type": "ip"
        },
        "ip_dst": {
          "type": "ip"
        },
        "port": {
          "type": "text"
        }
      }
    }
  }
}
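To rule out a type mismatch, I also checked locally that the sample values fit the template's mappings (a Python stand-in check, not an Elasticsearch call):

```python
# Local sanity check of the template's field types:
# the date format "yyyy-MM-dd HH:mm:ss" corresponds to
# strptime "%Y-%m-%d %H:%M:%S", and ip_src / ip_dst must
# parse as IP addresses for the "ip" mapping.
from datetime import datetime
import ipaddress

dt = datetime.strptime("2024-11-21 19:29:28", "%Y-%m-%d %H:%M:%S")
ip = ipaddress.ip_address("224.0.0.7")
print(dt.year)          # prints: 2024
print(ip.is_multicast)  # prints: True
```

So the sample data itself seems compatible with the mapping.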
OK, let's start it:
filebeat.exe -c filebeat.yml
It creates no indices under Elasticsearch => Indices.
When I go to Stack Management => Index Management => Data streams, click the stream, and then 'Indices' in the left pane, I do see '.ds-packets-2024.11.22-2024.11.22-000001'.
But when I 'discover' the index, the documents contain only metadata fields like 'agent.id', 'agent.type'...'log.offset'. All my fields like 'ip_src' or 'port' are empty.
I see nothing wrong in the Filebeat log file.
What am I doing wrong?