I want to parse a log file (which is on the client side) and display its content in Kibana (which is hosted offshore).
I am planning to use Logstash on the client side, and Elasticsearch and Kibana both locally.
I have also heard about Filebeat.
I want to know:
Should I use Filebeat on the client side and Logstash, Elasticsearch, and Kibana locally? Or
Logstash on the client side and Elasticsearch and Kibana locally? Or
can I use only Filebeat, Elasticsearch, and Kibana and exclude Logstash?
Can I write grok filters and ruby functions in Filebeat and push data directly to Elasticsearch? Or
should I use Filebeat only to read the file and Logstash for parsing the data (grok and ruby)?
If you want to use grok filters and ruby functions you will need to use Logstash; Filebeat does not parse the data. You also do not need Filebeat to read a log file if you are able to install Logstash on the client machine.
By offshore do you mean something like an oil rig? Don't forget to pay attention to your link bandwidth and the volume of data that you are sending; using Logstash with an output to Elasticsearch you can enable HTTP compression, which helps a lot.
I have a similar use case where I send logs to a Logstash instance on my local infrastructure, parse the data with some grok filters, and then output the result to an Elasticsearch instance in a cloud service. I do that mostly to save bandwidth between my local infrastructure and the cloud service.
In your example I would use Elasticsearch and Kibana on your offshore host and Logstash to read the logs on your client side.
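For reference, a minimal Logstash pipeline along those lines could look like the sketch below; the log path, grok pattern, and Elasticsearch address are placeholders you would replace with your own values:

input {
  file {
    path => "/var/log/myapp/app.log"    # hypothetical client-side log file
    start_position => "beginning"
  }
}
filter {
  grok {
    # hypothetical pattern; adjust to match your actual log format
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch {
    hosts => ["https://your-offshore-es:9200"]    # placeholder for the offshore Elasticsearch
    http_compression => true                      # compress requests to save link bandwidth
  }
}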
Thank you Leandro for your answer!
But this is the log of an application running 24*7, and there will be a lot of data to parse and send from the client side to the local (offshore) system every second. Also, once the fresh installation and setup is done, we might not get access to the client directory to update the config file in the future.
In this case, is there a tool which will only read the log data on the client side and send it to Logstash, so that the Logstash manipulation can be done on the local system and we can customize everything, including the grok filters?
Well, in this case you can use Filebeat to read the log and send the data to Logstash, and then parse the data before sending it to Elasticsearch.
So, on the client side you will need Filebeat configured to read the log file or the log directory and send the output to Logstash in your offshore infrastructure; after that you can parse the data, send it to Elasticsearch, and display it in Kibana.
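As a rough sketch of that setup (the paths, host name, and port below are just examples, and the exact config keys can differ between Filebeat versions), the client-side filebeat.yml could contain:

filebeat.inputs:
  - type: log
    paths:
      - C:\myapp\logs\*.log           # hypothetical client-side log directory

output.logstash:
  hosts: ["offshore-host:5044"]       # placeholder for your offshore Logstash

and the offshore Logstash pipeline would listen with a beats input and do all the parsing there:

input {
  beats {
    port => 5044
  }
}
filter {
  # grok / ruby filters go here, so nothing needs to change on the client later
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]       # Elasticsearch on the same offshore host
  }
}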
After deleting the registry data, I ran the below command:
.\filebeat -e -c filebeat.yml -d "publish"
but I am getting this error:
output.go:109: DBG output worker: publish 18 events
single.go:140: ERR Connecting error publishing events
(retrying): dial tcp [::1]:5044: connectex: No connection could be made because
the target machine actively refused it.
metrics.go:34: INFO No non-zero metrics in the last 30s
I tried to telnet to Logstash:
telnet localhost 5044
Connecting To localhost...Could not open connection to the host, on port 5044: Connect failed
I checked my services, and I can see that the Logstash service is already started.
Not sure what to do now.
I am using a Windows system.
Please help!
The port in the Logstash input needs to be the same one you use in the Filebeat output; right now your Logstash is waiting for beats input on port 5043, but your Filebeat is trying to send the data to port 5044.
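In other words, the two configs need to agree on the port; for example (5044 here is just an illustration):

Filebeat side (filebeat.yml):

output.logstash:
  hosts: ["localhost:5044"]

Logstash side (the beats input in your pipeline config):

input {
  beats {
    port => 5044
  }
}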