ITGuy Chris' Journey into Linux, Elastic and Products

Greetings all,

First, I am very new to this forum and community, and fairly new to Linux. I would appreciate some guidance, as I find the documentation good but end up bouncing all over the place until my brain says enough!

I am on something like my 11th install of the product, starting at 6.7 and now fully on 7.0, with lots of Veeam backups and VMware snapshots along the way.

First off, for anyone who can assist me on this journey, I have shared my files from the server and the one client below for reading.

The index is:
etc folder: contains the Kibana, Logstash, and Elasticsearch files
webserver: the first client I need to get working and understand (the Filebeat client)

There are two images in the root:
The first is a Kibana screenshot of the logs, which don't appear to be coming through consistently from the log file. But the first thing to figure out is why the log timestamps are in the future.

The second image is where my pressure is: it shows what the COO of my company, who says "this will be easy", wants. I'm not even sure if ELK is the answer, but since I am not allowed to look at commercial options... I am getting paid to learn and be frustrated along the way :slight_smile:

So here is where I am:

  • Ubuntu 18.x installed and updated

  • Elasticsearch installed on this virtual machine, no security yet

  • Logstash installed on the same box, no security

  • Kibana installed on the same box, no security yet

These all appear to be communicating, and I have done simple tests along the way.

What I want to accomplish first:

I have two web servers that are load balanced. I am working with Web Server 1 at this time. I am trying to take the IIS logs for one of the sites for now (pscom) and ship them to Elasticsearch so I can see them in Kibana. I have a grok filter working and may add more processing later as I get more comfortable.
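For illustration, a grok filter along these lines is the shape of what I have (the field order here assumes the default IIS W3C layout, not my exact pattern; the #Fields: header in the log file is the thing to match against):

    filter {
      grok {
        # assumes the default W3C field order: date time s-ip cs-method cs-uri-stem
        # cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) sc-status
        # sc-substatus sc-win32-status time-taken
        match => {
          "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IP:server_ip} %{WORD:method} %{URIPATH:uri_stem} %{NOTSPACE:uri_query} %{NUMBER:port} %{NOTSPACE:username} %{IP:client_ip} %{NOTSPACE:user_agent} %{NOTSPACE:referer} %{NUMBER:status:int} %{NUMBER:sub_status:int} %{NUMBER:win32_status:int} %{NUMBER:time_taken:int}"
        }
      }
      date {
        # IIS writes its log timestamps in UTC
        match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss" ]
        timezone => "UTC"
      }
    }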

The COO wants queries and alerts. I'm not sure on what exactly; I presume 5xx server errors, though he keeps saying 404 errors, which is probably not what he really wants. The web guys want it for logging and so forth.
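As an example of what I think he actually means, assuming the status field from the grok sketch above, a Kibana query for server errors would be:

    status >= 500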

So Web Server 1 has Filebeat --> Logstash (grok) --> Elasticsearch.
This is "kinda working", meaning I see one entry, and index management in Kibana shows the index slightly growing. I am not sure; I hope to see a new index tomorrow.
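For context, pscom.conf is shaped roughly like this (the port, host, and index name are placeholders rather than my exact values):

    input {
      beats {
        port => 5044
      }
    }
    filter {
      # grok + date filter as sketched earlier
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        # one index per day, which is why a new index should appear tomorrow
        index => "pscom-%{+YYYY.MM.dd}"
      }
    }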

I am in need of some pointers and help along the way.

Some things I am not sure of so far:

  • how to have Logstash load multiple conf files (my current guess is sketched after this list)
  • Web Server 1 and Web Server 2 are identical; can they use the same pscom.conf file I have? I assume they would create different indices?
  • I have a ton of pre-made visualizations in Kibana, but they point to the other indices created during setup. I am not sure if they can be changed to use the new indices.
  • I want to secure the communications between these clients, but do not need outside access. I am a little confused about what to change in each program to make this work, let alone creating a self-signed certificate for them on Ubuntu.
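On the multiple conf files: from what I have read, Logstash concatenates every *.conf file under /etc/logstash/conf.d/ into one pipeline by default, so events from all inputs flow through all filters unless you tag and branch. To keep configs truly separate, /etc/logstash/pipelines.yml can declare one pipeline per file, something like this (the pipeline IDs and second file name are hypothetical):

    # /etc/logstash/pipelines.yml
    - pipeline.id: pscom
      path.config: "/etc/logstash/conf.d/pscom.conf"
    - pipeline.id: othersite
      path.config: "/etc/logstash/conf.d/othersite.conf"

And my guess on the second bullet is that both web servers could ship to the same pipeline, with the index decided by the index => setting in the output rather than by which client sent the event. Is that right?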

Again, I appreciate the read and recognize I have a tall order. I just started this job, and I sadly come from a Windows world: we are an IIS / .NET web dev shop, but with lots of Linux machines and VMware, two things I am learning on the fly. Just hoping to make it past probation on this job!! If I can nail this stuff I hope to be safe.

Thanks for any help

ITGuy - Chris

Why not use the IIS module in Filebeat? Then you don't need to worry about Logstash, and you can simply deploy the same config to each web server.
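Enabling it looks roughly like this (the paths shown are the IIS defaults and may need adjusting per site):

    filebeat modules enable iis

    # modules.d/iis.yml
    - module: iis
      access:
        enabled: true
        var.paths: ["C:/inetpub/logs/LogFiles/*/*.log"]
      error:
        enabled: true
        var.paths: ["C:/Windows/System32/LogFiles/HTTPERR/*.log"]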

Unfortunately, not easily. You can either rebuild them, or export them and then edit the JSON to give them new names and point them at the new indices.

Secure them how - TLS, authentication and access control?

Thanks for the reply. I tried that at first and was not successful. It was my understanding that you could not add additional fields or do much filtering?

From a true security point of view, yes to all. The last two especially add a layer of protection from snooping employees, at least.
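For the TLS piece between Filebeat and Logstash, here is what I gather a minimal self-signed setup would look like (hostnames and paths are my placeholders, and I understand a production setup would want a proper CA and subject alternative names):

    # on the ELK box: generate a self-signed cert; the CN must match the
    # name the clients use to reach the server
    openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
      -subj "/CN=elk-server" \
      -keyout /etc/logstash/logstash.key -out /etc/logstash/logstash.crt

    # Logstash beats input with TLS (the key must be in PKCS#8 format,
    # which recent openssl versions emit by default)
    input {
      beats {
        port => 5044
        ssl => true
        ssl_certificate => "/etc/logstash/logstash.crt"
        ssl_key => "/etc/logstash/logstash.key"
      }
    }

    # filebeat.yml on each web server: trust the self-signed cert
    output.logstash:
      hosts: ["elk-server:5044"]
      ssl.certificate_authorities: ["C:/filebeat/logstash.crt"]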

Just an update: I went back and enabled the IIS module, shut down Logstash for now, and have Beats going directly to Elasticsearch. I have some logs showing up.
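The filebeat.yml change amounts to roughly this (the host name is a placeholder); running filebeat setup afterwards loads the index template and the module's Kibana dashboards:

    # filebeat.yml — Logstash output commented out, shipping straight to Elasticsearch
    output.elasticsearch:
      hosts: ["http://elk-server:9200"]
    #output.logstash:
    #  hosts: ["elk-server:5044"]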

This is a good start, and I am not sure if things just worked out in my favor this time, versus 6.7 where I tried this a few times without luck.

It looks like it already parses most of the fields I require, but I wanted to know whether custom fields can still be added?

The other challenge, which I feel is something on the Linux box, is timestamps. It looks like UTC is being used, and I am not sure where/how to change this.

Kibana: 2019-05-09 10:17:13.000
Log Event: @timestamp 2019-05-09T13:17:13.000Z

ubuntu: timedatectl
Local time: Thu 2019-05-09 10:19:05 ADT
Universal time: Thu 2019-05-09 13:19:05 UTC
RTC time: Thu 2019-05-09 13:19:05
Time zone: America/Halifax (ADT, -0300)

The entire Elastic Stack uses UTC, and it's Kibana that will translate that to local browser time.

You can add some custom fields via Beats: https://www.elastic.co/guide/en/beats/filebeat/current/add-fields.html
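From that page, the shape is along these lines (the target and field values here are made-up examples):

    # filebeat.yml — each event gets project.name and project.environment
    processors:
      - add_fields:
          target: project
          fields:
            name: pscom
            environment: production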
