I am trying to set up an Elasticsearch index to display log contents from an application using Kibana. I set up Filebeat on my application server, and it created an index with 25 fields in Elasticsearch. Please note that the application log is not one of the default log types that Filebeat supports out of the box.
Data flow is like this:
Filebeat >> Elasticsearch >> Kibana.
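For reference, this is roughly what my filebeat.yml looks like (the log path and Elasticsearch host below are placeholders, not my real values):

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /opt/myapp/logs/*.log   # placeholder path, not my real log location

output.elasticsearch:
  hosts: ["localhost:9200"]     # placeholder host
```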
The problem I am having is that the contents of the log files are not included as a field in the Elasticsearch index that was created, so I cannot display them via Kibana. The index does have a "count of records" field, which shows a different number for each log file. So I assume the index is aware of what is inside each log file, at least in terms of the number of rows, but for some reason it has not created a field for the content itself.
It is somewhat similar to the issue below, but not exactly the same.
I tried to edit the original post instead of replying to it.
A couple of things have changed.
If I use the "Discover" option in Kibana and look at the index and its fields, the field with the message details does appear there. As originally posted, that is not the case when I try to create a visualization using the same index, where this field, called "message", is invisible.
The second thing I want to ask is the following.
I have a log file with two prominent fields, as shown below.
####2021-12-21 11:10:10,820 ThreadId:99 INFO irpuyc_Impl - -> Request() <LogContext:Facade> <[ACTIVE] ExecuteThread: '64' for queue: 'weblogic.kernel'> ####2021-12-21 11:10:13,830 ThreadId:99 ERROR subsystem.api.subSystemFuture - error took place <LogContext:none> <[ACTIVE] ExecuteThread: '64' for queue: 'weblogic.kernel.Default (self-tuning)'>
Based on the thread ID, which is ThreadId:99 in this case, the ERROR and INFO lines are related. The number 99 changes from request to request. Is there a way for me to find all the related INFO lines based on ThreadId whenever Filebeat finds an ERROR line in the logs?
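To make the correlation I am after concrete, here is a standalone sketch in plain Python (not Filebeat configuration) that parses the sample entry above: it extracts the timestamp, ThreadId, and level from each "####"-delimited record, then collects the INFO lines whose ThreadId also appears on an ERROR line.

```python
import re

# The sample log contents from above: two "####"-delimited records on one line.
LOG = (
    "####2021-12-21 11:10:10,820 ThreadId:99 INFO irpuyc_Impl - -> Request() "
    "<LogContext:Facade> <[ACTIVE] ExecuteThread: '64' for queue: 'weblogic.kernel'> "
    "####2021-12-21 11:10:13,830 ThreadId:99 ERROR subsystem.api.subSystemFuture - "
    "error took place <LogContext:none> <[ACTIVE] ExecuteThread: '64' for queue: "
    "'weblogic.kernel.Default (self-tuning)'>"
)

# Capture the timestamp, thread id, and level at the start of each record.
RECORD = re.compile(
    r"####(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\s+"
    r"ThreadId:(?P<thread>\d+)\s+(?P<level>\w+)"
)

records = [m.groupdict() for m in RECORD.finditer(LOG)]

# Thread ids that logged at least one ERROR line.
error_threads = {r["thread"] for r in records if r["level"] == "ERROR"}

# All INFO lines belonging to those threads.
related_info = [
    r for r in records
    if r["thread"] in error_threads and r["level"] == "INFO"
]

for r in related_info:
    print(r["ts"], r["thread"], r["level"])
```

This is only an illustration of the grouping logic; in the actual pipeline I would expect the ThreadId to be extracted into its own field (e.g. via an ingest pipeline or processor) so that the correlation can be done with a filter in Kibana instead.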
Thank you for any feedback on this. Much appreciated.