Basic Filebeat => Logstash => Elasticsearch setup

I'm trying to debug/check whether I got everything right with my Logstash setup.

History:
The logs fed from Filebeat to ES work perfectly.
The only downside is that the geoip information is not set.

I tried to add Logstash in the middle. The setup works, but now I get very limited logs instead of the original format, and the dashboards don't work.

I figured that if I start with an empty filter this should work, and then I could add additional fields incrementally, but apparently something doesn't work...

My config file for Logstash is:

input {
  beats {
    port => 5400
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"
    document_type => "doc"
  }
  stdout { codec => rubydebug }
}

End goal: have the exact same output format in ES as if Logstash were not in the middle.
End goal 2: add a filter to enrich the output, e.g. with geoip (sketched below).
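
For goal 2, the kind of enrichment I have in mind is something like this (just a rough sketch; the clientip field would only exist once the access log line has actually been parsed, e.g. with grok):

filter {
  geoip {
    # assumes the client IP has already been parsed out of the log line into "clientip"
    source => "clientip"
  }
}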

Problem: the data fed through Logstash is very limited (see below):
{
  "_index": "filebeat-2018.09.20",
  "_type": "doc",
  "_id": "PaWo92UBhe-uS7seR1Tz",
  "_version": 1,
  "_score": null,
  "_source": {
    "input": {
      "type": "log"
    },
    "source": "C:\\nginx-1.15.2\\logs\\test_PROJ_DEV_access.log",
    "message": "127.0.0.1 - - [20/Sep/2018:16:57:46 +0200] \"POST /api/test_setup/FindAll HTTP/1.1\" 200 6888 \"http://localhost:6600/test_project/search-item\" \"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.117 Safari/537.36\"",
    "offset": 70812,
    "beat": {
      "name": "ME_PC",
      "hostname": "ME_PC",
      "version": "6.4.1"
    },
    "host": {
      "name": "ME_PC"
    },
    "@timestamp": "2018-09-20T15:44:59.479Z",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "prospector": {
      "type": "log"
    },
    "@version": "1"
  },
  "fields": {
    "@timestamp": [
      "2018-09-20T15:44:59.479Z"
    ]
  },
  "sort": [
    1537458299479
  ]
}

Logstash has no built-in support for HTTP access logs. See https://www.elastic.co/guide/en/logstash/current/config-examples.html for an example of how to configure it by hand.
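
Roughly, "by hand" would look something like this (an untested sketch; it assumes the default nginx "combined" log format, which the stock COMBINEDAPACHELOG grok pattern happens to match):

filter {
  grok {
    # split the raw access log line into fields such as clientip, verb, request, response, bytes
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # use the timestamp from the log line rather than the time the event was read
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}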


Filebeat modules do the parsing through an ingest pipeline in Elasticsearch. Now that you are no longer writing directly into Elasticsearch, this pipeline is not called. The Logstash Elasticsearch output plugin supports specifying a pipeline, so that is probably what you need to change.

I suspected this would be documented somewhere, so I had a look around. How to replicate the Filebeat module processing in Logstash seems to be documented here.
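
In config terms it would be something along these lines (the pipeline name below is only an illustration; the actual name depends on the Filebeat version, module and fileset, so look it up with GET _ingest/pipeline or follow the documentation above):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"
    # hypothetical pipeline name - use the one Filebeat actually installed
    pipeline => "filebeat-6.4.1-nginx-access-default"
  }
}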


Thanks Christian. I would've never found that in a million years :smiley:
