Need Direction on How to Proceed with Filebeat, Logstash and Kibana

Hi, I am a newbie on ELK. I am trying to get all my logs into Kibana in a formatted form. For this, I have stored some sample logs in a local directory (C:\logs). I have set up Elasticsearch, Logstash and Kibana on the same machine (since I am in the testing stage), and I have also set up Filebeat on that machine. I am now able to fetch data from the logs in Kibana with filebeat-* as the index, but the default output in Kibana is not what I want; I need some customized output there. I am lost here and need direction on the following questions:
1. I want to create the index and the fields myself (the default is filebeat-*). Where do I need to make the changes: logstash.json or filebeat.yml?
2. I need to run Elasticsearch queries on the data retrieved from Filebeat. What is the best way of doing that?

My log looks like this:
2017-05-07 20:03:31.8752: ZAP (Id = ZAA-22, EP = 1.1.1.1) - Fatal error.
Something Occured

Desired Output:
TimeStamp: 2017-05-07 20:03:31.8752
Type: ZAP
Id: ZAA-22
HostIP: 1.1.1.1
Error: Fatal error
Message: Something Occured

This looks like a generic kind of question, but I think a lot of starters like me will benefit from it.
Thank you in advance.

You should have a look at the Logstash Getting Started Guide.

You'll read the log lines with Filebeat and ship them to Logstash. In Logstash you can then use a series of filters, including grok and date, to parse and enrich the data from your logs and create the desired fields, and finally ship the result to Elasticsearch.
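Since "Something Occured" is on its own line, Filebeat also needs multiline settings so both lines arrive as a single event. Here is a minimal sketch of a filebeat.yml, assuming a Filebeat 5.x install reading from C:\logs and shipping to Logstash on port 5044 (the paths and port are assumptions, adjust to your setup):

filebeat.prospectors:
- input_type: log
  paths:
    - C:\logs\*.log
  # Any line that does not start with a date is appended to the previous event
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'
  multiline.negate: true
  multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]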

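In the Logstash config (a plain-text .conf file rather than logstash.json) you then parse the event and pick the index name yourself in the elasticsearch output, which answers your first question. Below is a rough sketch based on your sample line; the grok pattern and the field names (TimeStamp, Type, Id, HostIP, Error, Message) are guesses taken from your desired output, so test them against your real logs:

input {
  beats {
    port => 5044
  }
}

filter {
  # Split the two-line event into the fields from the desired output
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:TimeStamp}: %{WORD:Type} \(Id = %{DATA:Id}, EP = %{IP:HostIP}\) - %{DATA:Error}\.\n%{GREEDYDATA:Message}"
    }
  }
  # Use the parsed timestamp as the event time; adjust the fractional
  # seconds pattern if your logs differ
  date {
    match => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss.SSSS" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Custom index name instead of the default filebeat-*
    index => "mylogs-%{+YYYY.MM.dd}"
  }
}

With this in place you can use mylogs-* as the index pattern in Kibana instead of filebeat-*.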

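For your second question: once the parsed documents are in Elasticsearch, you can query them over the REST API or, more conveniently, from the Console in Kibana's Dev Tools. A hypothetical example against the custom index above, matching on the Type field:

GET mylogs-*/_search
{
  "query": {
    "match": { "Type": "ZAP" }
  }
}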