Problem when sending info from Filebeat -> Logstash -> Elasticsearch -> Kibana on Windows

Hello everyone,

I hope somebody can help me. Recently I have been learning how to use Filebeat, and I'm trying to use it to stream a log in the most basic way on Windows. I have the log file located at the following path (C:\Users\Juan-David\Desktop\Grenoble INP\Semester 4\Internship-documents\S2 -Logstash_files_importing\Filebeat_logs\acces.log), and I have downloaded and installed Filebeat on my computer at the path (C:\Program Files\Filebeat).

The problem: I run the command "filebeat -e" in PowerShell, and in another command prompt window I run a conf file from the Logstash bin directory with the command (logstash -f acces.conf), to process this document a bit with Logstash before sending it to Elasticsearch.
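(For reference, a minimal filebeat.yml for sending this file to Logstash would look roughly like the sketch below; this is just the general shape I am aiming for, not my exact file, and it assumes Filebeat 7.x with Logstash listening on port 5044 as in the pipeline further down:)

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\Users\Juan-David\Desktop\Grenoble INP\Semester 4\Internship-documents\S2 -Logstash_files_importing\Filebeat_logs\acces.log

output.logstash:
  hosts: ["localhost:5044"]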

This is the associated pipeline:

input {
  beats {
    port => 5044
    host => "0.0.0.0"
  }
}

filter {
  grok {
    match => { "message" => "%{GREEDYDATA:text1}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "access_index"
    document_type => "default"
  }
}

And I'm trying to read a file with lines like this:
4.252.108.229 - - [20/Sep/2017:13:22:22 +0200] "GET /products/view/123 HTTP/1.1" 200 12798 "https://codingexplained.com/products" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36"
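(Side note: since these lines are in the combined Apache access log format, a grok filter along these lines, using the stock COMBINEDAPACHELOG pattern instead of GREEDYDATA, would split each line into separate fields such as clientip, verb, request and response; this is only a sketch of an alternative, not what my pipeline above uses:)

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}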

Then, when the procedure finished and I opened Kibana to see the results, something like this appeared: an index with 0 rows called filebeat-7.6.2.2020.04.06-00001.

Why is it not showing me the 35 rows that compose the "acces.log" log file?

How could I make sure in filebeat.yml that the Logstash output is done first and the Elasticsearch output after?

Why does the index not have the name that I gave in the Logstash pipeline, "access_index"?

Thanks so much, I am going to be very grateful if you can help me.

Cordially,
Juan David Briceno Guerrero

Hi @nb03briceno,

In your screenshot I see that you have an access_index with 34 documents. This is the index name you are setting in Logstash; is this the index you are looking for?
If it is not capturing all 35 lines, it may be because the last line of the file doesn't end with a newline.

Indexes like filebeat-7.6.2.2020.04.06-00001 are created with the default configuration when Filebeat is configured to send data directly to Elasticsearch, without using Logstash. Is it possible that you configured Filebeat this way at some point?
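If that is the case, note that Filebeat only allows one output to be enabled at a time, so the Elasticsearch output would need to be commented out and only the Logstash output left enabled. Roughly, in filebeat.yml (a sketch, not your actual file):

#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]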

Thanks so much jsoriano, I think the problem was only the syntax I was using in the filebeat.yml. Thanks so much for taking the time to look at my question.

JUAN DAVID BRICENO GUERRERO
MASTER STUDENT IN SUSTAINABLE INDUSTRIAL ENGINEERING


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.