Hi folks,
I want to include our Akamai log files in our Elastic Stack. I'm syncing the compressed logs via rsync to local storage, so there are only compressed (gzip) files. As far as I've read, there have been some issues with Logstash and gzip files. Filebeat is harvesting the files properly and sending them to Logstash, but I have some trouble viewing the lines in Kibana.
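For completeness, the Filebeat side looks roughly like this (a simplified sketch reconstructed from the debug output further down; the path pattern and the Logstash host are illustrative):

filebeat.prospectors:
  - input_type: log
    paths:
      - /data/logs/akamai-cdn/*.gz
    fields:
      env: staging

# these tags show up on every published event
tags: ["akamai-cdn", "akamai-filebeat"]

output.logstash:
  hosts: ["localhost:5044"]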
Now my Logstash configuration:
#####################################
cat 02-beats-input.conf
input {
  beats {
    port => 5044
    client_inactivity_timeout => 3600
  }
}
#####################################
cat 15-akamai-cdn.conf
filter {
  if [type] == "akamai-cdn" {
    grok {
      match => { "message" => "%{DATE_EU:date}\t%{TIME:time}\s%{IP:clientip}\s%{WORD:httpmethod}\t%{PATH:requestedpage}\s%{NUMBER:responsecode}\s%{NUMBER:bytessent}\s%{NUMBER:timetaken}\t%{DATA:csreferrer}\t%{DATA:csuseragent}\t%{DATA:cscookie}" }
      add_tag => ["akamai-cdn"]
    }
  }
}
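# For reference, the grok pattern above is meant to match tab-separated Akamai
# (W3C-style) lines roughly like this made-up example, not an actual line from our logs:
#
# 11.11.2017	00:00:05	192.0.2.10	GET	/images/picture1.jpg	200	15342	0.123	"https://www.example.com/"	"Mozilla/5.0 (Windows NT 10.0)"	-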
#####################################
cat 30-elasticsearch-output.conf
output {
  elasticsearch {
    hosts => ["elasticsearch.xxxxxxxxxxxxxt:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
#####################################
This is working fine so far, but the Filebeat logfile contains the following:
2017-12-08T09:50:44+01:00 DBG Publish: {
"@timestamp": "2017-12-08T08:50:43.026Z",
"beat": {
"hostname": "logstash",
"name": "logstash",
"version": "5.6.5"
},
"fields": {
"env": "staging"
},
"input_type": "log",
"message": "{p\ufffd\u000c\ufffd\u0011q\ufffd\u0001\ufffd^p\ufffd\ufffdD.\u001du\u001b\ufffd#"\ufffdH\ufffd\u0007.",
"offset": 1350806,
"source": "/data/logs/akamai-cdn/pictures1_xxxxxxxx_de_xxxxx5.esw3c_S.201711110000-2400-22.gz",
"tags": [
"akamai-cdn",
"akamai-filebeat"
],
"type": "log"
}
If you look at this in Kibana, the "message" field shows the same unreadable characters.
Does anybody have a solution or an idea how to solve this?
Best regards,
Hans