How to add network logs from my applications to Elastic Observability

I would like to access my logs and create a dashboard in Elasticsearch/Observability. Please let me know how to integrate the application logs from the network; I would like to know the steps.

Hi @schavaku,
Without any specifics on the types of logs and dashboards you want, all I can give you is a general answer.

  1. Install Logstash.
  2. Configure Logstash to process your log files and send them to Elasticsearch (see the sketch after this list).
  3. Create visualizations for your dashboard in Kibana using Lens, then save them to a new dashboard.
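As a rough illustration of step 2, a minimal Logstash pipeline might look something like the sketch below; the path, host, and index name are placeholders, not values from this thread.

  # Hypothetical Logstash pipeline: read log files and ship them to Elasticsearch.
  # Replace the path, host, and index name with your own values.
  input {
    file {
      path => "/var/log/myapp/*.log"
      start_position => "beginning"
    }
  }
  filter {
    # Optional parsing/enrichment (dissect, grok, etc.) would go here.
  }
  output {
    elasticsearch {
      hosts => ["https://localhost:9200"]
      index => "myapp-logs-%{+YYYY.MM.dd}"
    }
  }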

Thank you so much. I am using Filebeat instead of Logstash to get the logs. I was able to get them and was able to create a dashboard. I can get the log file as a whole, but I am not sure if I can further read the log file fields so that I can build the dashboard in a meaningful way.

The other question I have is: what is the difference between Filebeat and Logstash, and in what circumstances would we use one over the other?

You might want to try using the Filebeat dissect processor to break your file out further into meaningful fields; there is a sketch of one right below.
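For instance, a minimal (untested) sketch of a dissect processor in filebeat.yml might look something like this; the tokenizer and field names are placeholders, not taken from your logs:

  # Hypothetical dissect processor: split a pipe-delimited line into named fields.
  processors:
    - dissect:
        tokenizer: "%{level} | %{timestamp} | %{detail}"
        field: "message"
        target_prefix: "dissect"

With target_prefix set, the extracted values land under that prefix (e.g. dissect.level), so they don't collide with existing fields.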

Filebeat is great for processing files, especially if the files match one of the supported modules.

Logstash is the original data manipulation/ingestion/transform program used by the ELK stack (the L in this case). Think of it as a Swiss Army knife: it can do pretty much everything, but it can be more complicated to use. It really depends on the use case, though.

Thank you.

My file pattern looks like this:

\server\h******\Log\file1
\server\h********\Log\file2

Each file contains codes, failures, successes, etc.

I will use dissect as you suggest, but I am wondering if I can put the dissect settings in filebeat.yml directly.

Also, I have one more question:

Can I fire an email based on a code that I see in my log file? If so, where can I set up the rules for Filebeat?

Thanks for answering my questions

Yes, I think you can just add a processors section to filebeat.yml as needed; for example, something like the sketch below.
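A rough, untested sketch of where that processors section could sit in filebeat.yml, next to the input and output; the paths, tokenizer, and host are placeholders:

  # Hypothetical filebeat.yml layout: input, dissect processor, and output.
  # Replace the paths, tokenizer, and host with your own values.
  filebeat.inputs:
    - type: log
      paths:
        - /var/log/myapp/*.log

  processors:
    - dissect:
        tokenizer: "%{level} | %{timestamp} | %{detail}"
        field: "message"
        target_prefix: "dissect"

  output.elasticsearch:
    hosts: ["https://localhost:9200"]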

I don't think Filebeat can, but Logstash can. Basically, anything Filebeat can do, Logstash can do too, and then some. Logstash has an output plugin that can send email.
Very briefly, Logstash has three main configuration parts: inputs -> filters -> outputs. Filters are optional. There's a rough example of the email output below.
Good luck!
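For illustration only, a minimal sketch of what that could look like; the field name, condition, and addresses are made up, and you would adapt the parsing and condition to your own logs. It relies on the logstash-output-email plugin (install it if it isn't bundled with your Logstash version).

  # Hypothetical Logstash output: send an email only when a parsed field matches a code.
  # Addresses and field names are placeholders.
  output {
    if [exit_code] == "1" {
      email {
        to      => "ops@example.com"
        subject => "Failure code detected in application log"
        body    => "Offending log line: %{message}"
      }
    }
  }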

Thank you.

I have a folder with all the log files coming in every day.

I want to dissect the latest file

I used the following dissect processor:

  processors:
    - dissect:
        tokenizer: '%{log-level} | %{date-time} | %{exit-code} | %{server-hostname} | %{log-path}'

but I am getting a parsing error

dissect_parsing_error

Please let me know what is going wrong on my side. Thank you.

Sure, can you provide a few lines of the file in question so we can see what's going on? Please remember to redact any sensitive or confidential information.

INFO | 03/29/2023_05:19:39 | Exit Code: 0 | | BatchJob.Run
INFO | 03/29/2023_05:19:39 | ********** complete. | | JobEngine.Common.Client.LogToFile
INFO | 03/29/2023_05:19:39 | **********|Run complete. | | Batch.JobEngine.Common.ApiClient.LogToFile
INFO | 03/29/2023_05:19:39 | **********| ***** Job Results ***** | Schedule ID:3657 | Exited with Exit Code: 0. Exit Message: . Update Last Run Date: False. Last Run Date: 3/29/2023 5:19:39 AM | | Batch.JobEngine.Common.Client.LogToFile
INFO | 03/29/2023_05:19:39 | **********|Skipping LastRunDate update as the BatchJobResult.UpdateLastRunDate was set to 'false' and/or an error occurred for ScheduleId: 3657. UpdateLastRunDate: False, Error Occurred: False | | Batch.JobEngine.Common.ApiClient.LogToFile

I'm not sure; the issue could be that there are 6 pipes in some lines and 5 pipes in others. Thanks for looking into this.

I was able to parse it. Thank you


Oh nice, I was just going to take a swing at it. Do you have anything to share in case someone else runs into a similar issue? Not required of course, but it could be your first solution. 🙂

I used the following and it worked. Thank you.

  tokenizer: '"%{log-level} | %{date-time} | %{exit-code} | %{server-hostname} | %{log-path}"'
  field: "message"
  target_prefix: ""

I have the following question now.

How can I create a dashboard using the message of this log file? The message consists of 5 fields, but I would like to pick just one or two of those fields and show them in the dashboard.

Sure thing. Creating a new dashboard is pretty easy, and this documentation should help you get started.
