Kibana, Elasticsearch and Filebeat. Do I need Logstash?


I'm a complete n00b concerning the ELK stack. I'm a network engineer doing some research about logging and analytics.

First, I want to collect logs in three ways:

- Grab logs out of /var/log on the local machine.
- Grab logs via Filebeat on remote machines.
- Collect syslog that is being sent to the ELK stack.

Second, I want to pull data in via a vendor's API (returned as JSON).

Now my question is what to use. I've read a lot about the ELK stack, so the basic knowledge is there. The only thing I'm not sure of is whether I should use Logstash or inject data from Filebeat directly into Elasticsearch.

To get data from the vendor API into ELK, do I need Logstash?




The value Logstash provides is the ability to parse/normalize the incoming log stream. For instance, if you have log data coming in via Filebeat from both Apache and nginx, you could use Logstash to parse both into a "common" format before storing it in Elasticsearch.
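As a sketch of what that parsing step looks like, a minimal Logstash pipeline could receive events from Filebeat and grok Apache-style access logs into structured fields before indexing them. The port and hosts here are assumptions for a local setup; adjust them for your environment:

```conf
input {
  beats {
    port => 5044          # Filebeat ships events to this port
  }
}

filter {
  grok {
    # COMBINEDAPACHELOG is a built-in pattern; it splits the raw
    # message into fields like clientip, verb, request, response.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

You would add a second grok (or a separate pipeline) with an nginx pattern, mapping both log formats onto the same field names so Kibana can query them uniformly.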

Logstash can also provide persistence, which helps you avoid losing data in situations where your Elasticsearch cluster "blips" or isn't able to handle a load peak.
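That persistence is the persistent queue feature, enabled in `logstash.yml`. A minimal sketch (the size cap is an arbitrary example value):

```yaml
# logstash.yml -- buffer events on disk instead of in memory,
# so a restart or an Elasticsearch outage doesn't drop them
queue.type: persisted
queue.max_bytes: 1gb   # upper bound on disk used by the queue
```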

You don't need Logstash to scrape an API and put the data into Elasticsearch; you could, for instance, use an AWS Lambda function or similar for that.
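For example, a small poller can fetch JSON from the vendor API and push it to Elasticsearch's bulk endpoint with nothing but the standard library. This is a hedged sketch, not a production client: the API URL is hypothetical, and there's no auth, retries, or error handling.

```python
import json
import urllib.request

def to_bulk(docs, index="vendor-api"):
    """Convert a list of JSON documents into Elasticsearch _bulk
    NDJSON: an action line followed by a source line per document."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"   # _bulk requires a trailing newline

def ship(api_url, es_url="http://localhost:9200/_bulk"):
    """Pull JSON from the vendor API and POST it to Elasticsearch."""
    with urllib.request.urlopen(api_url) as resp:
        docs = json.loads(resp.read())
    req = urllib.request.Request(
        es_url,
        data=to_bulk(docs).encode("utf-8"),
        headers={"Content-Type": "application/x-ndjson"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Run `ship("https://api.example.com/devices")` from cron, a Lambda, or any scheduler; since the data is already JSON, there is nothing for Logstash to parse.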

Thanks @trondhindenes, that makes sense. I'm running Splunk, but the costs are pretty high. That's why I'm checking out ELK. Thanks for your help.

The first thing is we’ve renamed ELK to the Elastic Stack, otherwise Beats and APM feel left out! :wink:

However, you can do some/most/all of what you want with Beats. Metricbeat can poll APIs, for example, but it is limited in what it can do. Filebeat can ship files, but it cannot pattern match or normalise them.
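To illustrate the API-polling option: Metricbeat has an `http` module whose `json` metricset can periodically fetch a JSON endpoint and index the response. A sketch, assuming a hypothetical vendor endpoint:

```yaml
# metricbeat.yml (fragment) -- poll a JSON API every 60s
metricbeat.modules:
  - module: http
    metricsets: ["json"]
    period: 60s
    hosts: ["https://api.example.com"]   # hypothetical vendor API
    path: "/v1/devices"
    namespace: "vendor_api"              # field the JSON is nested under

output.elasticsearch:
  hosts: ["localhost:9200"]
```

This covers simple polling; if the response needs reshaping before indexing, that's where Logstash (or an ingest pipeline) comes back into the picture.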

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.