How to configure filebeat.yml to listen for logs via TCP

Hello,

I have configured Logstash to receive logs from the application via TCP, using Logback on the application side. Now I need to do the same for Filebeat: Filebeat should listen for the logs via TCP and send them to Logstash.
Could anyone help with this?
Below is the configuration I made for Logstash to receive the application's logs over TCP.

Logback (in the application):

(The Logback XML was lost in formatting; it pointed the application's TCP appender at 127.0.0.1:4560.)
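For reference, a Logback configuration that ships log events to a TCP endpoint like 127.0.0.1:4560 typically looks something like the following. This is a sketch that assumes the logstash-logback-encoder library is on the application's classpath; it is not the exact config from the original post.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- Sketch: assumes logstash-logback-encoder is available.
       The destination matches the Logstash tcp input's port below. -->
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>127.0.0.1:4560</destination>
    <!-- Emits each event as a JSON line over the TCP connection -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>
```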

logstash.conf:

```
input {
  tcp {
    port => 4560
  }
}

filter {
  date {
    match => [ "timeMillis", "UNIX_MS" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}
```

Why do you want Filebeat to listen for TCP? Filebeat is used to read files and send them to Elasticsearch, Logstash, or somewhere else. There are lots of blog posts on using Filebeat to read files and ship them; a quick Google search will turn them up.

Thanks for the quick reply. If we specify a path in the Filebeat configuration, it works.
I'm new to ELK, so maybe my question was badly phrased. What I mean is: I do not want to specify a static path. I need to collect logs from multiple servers dynamically and send them to Logstash. Do you have any idea how to do this?

My scenario:

I have multiple servers; call them server-1, 2, 3, 4, ...

On server-1 I have installed the ELK stack and Filebeat. On the remaining servers (server-2, 3, 4, ...) my application runs and generates logs. I now need to collect all of these logs from server-2, 3, 4, ... via the Filebeat on server-1 and send them to Logstash. Could anyone give me an idea of how to implement this?

Thanks!

Why via server-1? You need to install Filebeat or Logstash on every node you want to collect logs from. Or, if it is an option, use syslog to forward to a Logstash node. Have you read any of the blog posts on log management with ELK?

Sorry for the delay.
Yes, I read about that, but everyone says you need to specify the log file path in the Filebeat configuration, and that's not what I need. Is there any other way? Or if you have a blog link about it, kindly share it. Thanks.

It's not needed for you? How else is Filebeat supposed to know which files to forward?

Okay. Please give me a solution for my scenario above. If possible, post a sample configuration for Logstash and Filebeat.

You just said it worked when you configured the path. So do that on all the nodes.

That's what I said: I don't want to specify a path. Is there any alternative, like TCP?

I don't want to specify a file location. On localhost that's fine: we can see where the file is and specify it. But suppose, for example, the location of the log file changes; then we would need to update the path in the Filebeat or Logstash config every time. In a production environment we can't go in each and every time to update the file path whenever the file location changes. Do you have a better solution for this?

In recent versions (6.3 onwards, if I recall correctly) Filebeat supports a TCP input, which I guess you could use.
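A minimal Filebeat configuration using the TCP input might look something like this. This is a sketch, not a tested setup: the listen port 9000 and the Logstash output address are illustrative assumptions, not values from this thread.

```yaml
# filebeat.yml -- sketch; port and hosts are assumed values
filebeat.inputs:
  - type: tcp
    host: "0.0.0.0:9000"        # address:port Filebeat listens on
    max_message_size: 10MiB     # cap on a single received message

output.logstash:
  hosts: ["localhost:5044"]     # forward received events to Logstash
```

The application's Logback TCP appender would then point at port 9000 on the Filebeat host instead of at Logstash directly.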

Hi,
Do you have a sample config for that? If so, could you please post it here? Thanks.

Have a look at the docs; I have never used it myself.

So you cannot configure a path, but changing the application to log to a TCP recipient is OK? If so, why not let the app send the logs straight to a Logstash server?

Hi,

I have configured Filebeat to listen for logs via TCP. Thanks.

One potential issue to be aware of when logging via TCP from an application is that any back-pressure applied through the pipeline could cause Filebeat/Logstash to stop reading, which could in turn have an impact on the application itself.

Oh. Then what can we do about that kind of problem?

You can configure Filebeat to spool data to disk, which can handle temporary problems and provide less risk of affecting the application.
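As a sketch, the disk spool queue (a beta feature in Filebeat versions of that era) can be enabled along these lines; the path and sizes here are illustrative assumptions, not recommended values:

```yaml
# filebeat.yml -- illustrative values; queue.spool was a beta feature
queue.spool:
  file:
    path: "${path.data}/spool.dat"  # on-disk buffer file
    size: 512MiB                    # maximum size of the spool file
  write:
    buffer_size: 10MiB              # write buffer before flushing to disk
    flush.timeout: 5s               # flush at least this often
```

With this in place, events received over TCP are buffered on disk, so a temporarily slow or unavailable Logstash is less likely to push back-pressure all the way into the application.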

OK, let me give it a try.
