Best practice

I want to parse a logfile (which is on the client side) and display its content in Kibana (which is hosted offshore).
I am planning to use Logstash on the client side, with Elasticsearch and Kibana both locally.
I have also heard about Filebeat.
I want to know,

  1. Should I use Filebeat on the client side and Logstash, Elasticsearch, and Kibana locally? or
  2. Logstash on the client side and Elasticsearch and Kibana locally? or
  3. Can I use only Filebeat, Elasticsearch, and Kibana and exclude Logstash?
    Can I write grok filters and Ruby functions in Filebeat and push data directly to Elasticsearch? or
    Should I use Filebeat only to read the file and Logstash for parsing the data (grok and Ruby)?

What is the best practice to implement this?

If you want to use grok filters and Ruby functions you will need to use Logstash; Filebeat does not parse the data. You also do not need Filebeat to read a logfile if you are able to install Logstash on the client machine.
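For example, a minimal Logstash pipeline combining a grok filter and a ruby filter could look like this (the path, pattern, and field names here are placeholders, not your actual log format):

```
input {
  file {
    path => "/var/log/app.log"
  }
}
filter {
  # Parse each line into structured fields (adjust the pattern to your log format)
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
  # Arbitrary Ruby code for anything grok cannot express
  ruby {
    code => "event.set('level_lower', event.get('level').to_s.downcase)"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```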

By offshore do you mean something like an oil rig? Don't forget to pay attention to your link bandwidth and the volume of data that you are sending; using Logstash with an output to Elasticsearch you can enable HTTP compression, which helps a lot.
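Enabling compression is a one-line change in the elasticsearch output; a sketch (the hostname is a placeholder, and the exact option may vary by Logstash version):

```
output {
  elasticsearch {
    hosts => ["https://offshore-es:9200"]
    # Compress the HTTP requests sent to Elasticsearch to save link bandwidth
    http_compression => true
  }
}
```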

I have a similar use case where I send logs to a Logstash instance on my local infrastructure, parse the data with some grok filters, and then output the result to an Elasticsearch instance in a cloud service. I do that mostly to save bandwidth between my local infrastructure and the cloud service.

In your example I would use Elasticsearch and Kibana on your offshore host, and Logstash to read the logs on the client side.


Thank you Leandro for your answer!
But this is a 24x7 running application log, and there will be a lot of data to parse and send from the client side to the local (offshore) system every second. Also, once the fresh installation and setup is done, we might not get access to the client directory to update the config file in the future.
In this case, is there any tool which will only read the log data from the client side and send it to Logstash, so that the Logstash manipulation can be done in the local system and we can customize everything, including grok filters?

Well, in this case you can use Filebeat to read the log and send the data to Logstash, and then parse the data there before sending it to Elasticsearch.

So, on the client side you will need Filebeat configured to read the log file or the log directory and send the output to Logstash in your offshore infrastructure; after that you can parse the data, send it to Elasticsearch, and display it in Kibana.
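A minimal Filebeat configuration for that could look like the following sketch (the hostname and log path are placeholders, and the key names assume a 5.x-era Filebeat like the one in this thread):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/app/*.log

# Ship raw lines to the offshore Logstash; all parsing happens there
output.logstash:
  hosts: ["offshore-logstash:5044"]
```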

Thanks @leandrojmp.
Sorry to interrupt you, but I am new to Filebeat.
I have configured the filebeat.yml file:

filebeat.prospectors:
- input_type: log
  paths:
    - C:/xxx.log

output.logstash:
  hosts: ["localhost:5044"]

I changed the logstash config file also:

		input {
			beats {
				port => "5043"
			}
		}
		output {
			elasticsearch {
				hosts => ["localhost:9200"]
				user => elastic
				password => changeme
				index => "xxx"
			}
			stdout {
				codec => rubydebug
			}
		}

After deleting the registry data, I ran the below command:

.\filebeat -e -c filebeat.yml -d "publish"

but I am getting an error:

output.go:109: DBG  output worker: publish 18 events
single.go:140: ERR Connecting error publishing events
 (retrying): dial tcp [::1]:5044: connectex: No connection could be made because
 the target machine actively refused it.
metrics.go:34: INFO No non-zero metrics in the last 30s

I tried to telnet to Logstash:

telnet localhost 5044
Connecting To localhost...Could not open connection to the host, on port 5044: Connect failed

I checked my services and I can see the logstash service is already started.
Not sure what to do now.
I am using a Windows system.
Please help!

Should this not try to connect to port 5044?

The port in the Logstash input needs to be the same one you use in the Filebeat output. Right now your Logstash is waiting for Beats input on port 5043, but your Filebeat is trying to send the data to port 5044.
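In other words, the two sides have to agree on a single port, for example 5044 on both (a sketch; only the port number matters here):

```yaml
# filebeat.yml — send to the same port Logstash listens on
output.logstash:
  hosts: ["localhost:5044"]
```

```
# logstash pipeline config — listen on the port Filebeat sends to
input {
  beats {
    port => "5044"
  }
}
```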

@Christian_Dahlqvist and @leandrojmp thank you very much!! It worked 🙂
