Parse the logs with FileBeat-ElasticSearch and Display on Kibana

I'm very new to the Elastic forums. I went through these tools and a few videos some hours ago and set up Kibana, Logstash, and Elasticsearch on my local Windows machine (on a private network).

I managed to parse the log lines from the command line.
The setup used was: Logstash --> Elasticsearch --> Kibana.

I need some help parsing the logs with Filebeat and Elasticsearch and then displaying them in Kibana with different filters and statistics. Currently, I don't want to parse using Logstash; I simply want to display the log file in Kibana after processing it through Filebeat --> Elasticsearch --> Kibana.

Can you please help me with config files for the same?

You need to configure the Elasticsearch output for Filebeat.

Do you have any specific questions about the configuration?
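In the simplest case this is just an `output.elasticsearch` section in filebeat.yml pointing at your node (a minimal sketch for Filebeat 5.x; "localhost:9200" assumes a default single-node local install):

```yaml
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
```

With this in place, Filebeat ships events straight to Elasticsearch without Logstash in between.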



My filebeat.yml (relevant parts, Filebeat 5.x):

filebeat.prospectors:
- input_type: log
  paths:
    - C:\Users\gadeshr\elastic\logs\*.log

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

I am running Filebeat on Windows with: filebeat -e -c filebeat.yml -d "publish"
Elasticsearch is running on the default port 9200.
I can't see the files being processed and displayed in Kibana.

The path C:\Users\gadeshr\elastic\logs\*.log contains log files with extension .log

Sample log content (one entry per line):

- [10/Nov/2017:10:00:36 +0100] "GET /CONTENT/images/recycle.gif HTTP/1.1" 200 617
- [10/Nov/2017:10:00:36 +0100] "GET /CONTENT/images/info.gif HTTP/1.1" 200 953
- [10/Nov/2017:10:00:36 +0100] "GET /CONTENT/de/img/tabButton50.gif HTTP/1.1" 200 6863
- - [10/Nov/2017:10:00:49 +0100] "POST /CONTENT/HTTP/1.1" 200 40554
- - [10/Nov/2017:10:00:50 +0100] "POST /CONTENT/HTTP/1.1" 200 68016
- - [10/Nov/2017:10:00:49 +0100] "POST /CONTENT/HTTP/1.1" 200 316647
- - [10/Nov/2017:10:00:52 +0100] "POST /CONTENT/HTTP/1.1" 200 88765

Also, please help me clean up Kibana. I tried many things from the command line earlier and messed up Kibana with a lot of stray entries.

Filebeat logs:

2017/11/16 08:07:29.812643 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/11/16 08:07:29.840643 log.go:91: INFO Harvester started for file: C:\Users\gadeshr\elastic\logs\Sample.log
2017/11/16 08:07:29.812643 sync.go:41: INFO Start sending events to output
2017/11/16 08:07:34.905643 client.go:667: INFO Connected to Elasticsearch version 5.6.4
2017/11/16 08:07:34.906643 output.go:317: INFO Trying to load template for client: http://localhost:9200
2017/11/16 08:07:34.918643 output.go:341: INFO Template already exists and will not be overwritten.
2017/11/16 08:07:59.746643 metrics.go:39: INFO Non-zero metrics in the last 30s: filebeat.harvester.open_files=2 filebeat.harvester.running=2 filebeat.harvester.started=2
nd_acked_events=1 libbeat.publisher.published_events=1 registrar.states.current=3 registrar.states.update=5 registrar.writes=1
2017/11/16 08:08:29.746643 metrics.go:34: INFO No non-zero metrics in the last 30s

Elastic Search Logs:

[2017-11-16T13:33:11,524][INFO ][o.e.n.Node ] [0oD1ygx] starting ...
[2017-11-16T13:33:12,528][INFO ][o.e.t.TransportService ] [0oD1ygx] publish_address {}, bound_addresses {}, {[::1]:9300}
[2017-11-16T13:33:16,058][INFO ][o.e.c.s.ClusterService ] [0oD1ygx] new_master {0oD1ygx}{0oD1ygxfRS-n_6IuXYbPZg}{EyGm7v2ARlKrr1KiTUhgZg}{}{}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-11-16T13:33:16,492][INFO ][o.e.g.GatewayService ] [0oD1ygx] recovered [4] indices into cluster_state
[2017-11-16T13:33:17,444][INFO ][o.e.h.n.Netty4HttpServerTransport] [0oD1ygx] publish_address {}, bound_addresses {}, {[::1]:9200}
[2017-11-16T13:33:17,444][INFO ][o.e.n.Node ] [0oD1ygx] started
[2017-11-16T13:33:17,500][INFO ][o.e.c.r.a.AllocationService] [0oD1ygx] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[.kibana][0]] ...]).
[2017-11-16T13:37:34,944][INFO ][o.e.c.m.MetaDataCreateIndexService] [0oD1ygx] [filebeat-2017.11.16] creating index, cause [auto(bulk api)], templates [filebeat], shards [5]/[1], mappings [default]
[2017-11-16T13:37:35,295][INFO ][o.e.c.m.MetaDataMappingService] [0oD1ygx] [filebeat-2017.11.16/lDi2N3LNQESM5Wk9AshzeQ] create_mapping [doc]

How can I check the content in Kibana?
I cannot see any log content displayed in Kibana.

Have you checked the Elasticsearch indices?

Please check this and show the output.
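The check referred to here is presumably the _cat indices API; for example, in Kibana's Dev Tools console (or via curl against http://localhost:9200):

```
GET _cat/indices?v
```

This lists every index with its health, document count, and size, which is enough to confirm whether Filebeat has created a filebeat-* index.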


health status index               uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   logstash-2017.11.15 06UitZN_TK6GrIuXIkMQxw   5   1          5            0     37.5kb         37.5kb
yellow open   .kibana             0ykkKHGdRgWI6rQPOzZJog   1   1          3            0     25.4kb         25.4kb
yellow open   logstash-2017.11.14 IqE7GqoYS3uzJ0b3yVIqaw   5   1         18            0     86.3kb         86.3kb
yellow open   filebeat-2017.11.16 lDi2N3LNQESM5Wk9AshzeQ   5   1          1            0      7.4kb          7.4kb
yellow open   filebeat-2017.11.15 bzLMP3KfR1KCVupO7bbTBQ   5   1        107            0     90.5kb         90.5kb

How do I see it in Kibana?

Now I have added the index pattern 'filebeat-*' in Kibana and am able to see the data loaded there.

Thanks for the help !

@Shreyas_Gade You are welcome. Glad you were able to get the data. :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.