Harvest + analyze customer application logs

Hi,
I am new to Elastic and Filebeat.
I decided to test harvesting and analyzing customer application logs.

  • Installed Elastic + Filebeat + Kibana (on Windows).
  • Configured Filebeat's fields.yml for the customized log fields:

-- Log row example:

0614_15:00:00.831, "SYSTEM pre-ordering starting, lastRun: <20190613>", [INFO], T@105918, T:ActiveMgr.105918, , com.mm.PreMN, PreMN::orderSystemIntoShadow, ,
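
(For context: a line like this still has to be tokenized into separate fields before any mapping applies; fields.yml only describes the index template. Below is a minimal, untested sketch using Filebeat's dissect processor; the token-to-field mapping is guessed from this one sample line.)

-- filebeat.yml dissect sketch:

processors:
  - dissect:
      # Guessed layout: date, "message", [level], thread id, thread name, empty, class, method, empty
      tokenizer: '%{date}, "%{error.message}", [%{event.type}], %{process.title}, %{event.action}, %{?empty1}, %{process.program}, %{event.hash}, %{?empty2},'
      field: "message"        # the raw log line as read by the input
      target_prefix: ""       # write the extracted keys at the event root
      # Note: the date token has no year, so indexing it as type date
      # would additionally need a timestamp processor or ingest pipeline.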

-- Fields.yml:

"date":"id": "date"
"error.message":"id": "string"
"process.title":"id": "string"
"event.action":"id": "string"
"process.program":"id": "string"
"event.type":"id": "string"
"event.hash":"id": "string"

Yet Kibana has created the new filebeat index with only the default fields.

Please assist. Thanks!

Could you please share your debug logs (the output of ./filebeat -e -d "*") and your configuration, formatted using </>?

To me it seems that the format of your fields.yml file is incorrect. This is the fields.yml of the HAProxy module in Filebeat: https://github.com/elastic/beats/blob/master/filebeat/module/haproxy/_meta/fields.yml You need something like that.
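
For the fields listed above, a minimal sketch in that style could look like this (the key, title, and descriptions are placeholders, keyword stands in for "string", and the dotted names are expressed as groups, which is how the Beats fields.yml files structure them):

-- fields.yml sketch:

- key: customerapp            # placeholder key for this log source
  title: "Customer application"
  description: Fields extracted from the customer application logs.
  fields:
    - name: date
      type: date
      description: Timestamp of the log line.
    - name: error
      type: group
      fields:
        - name: message
          type: keyword
    - name: event
      type: group
      fields:
        - name: action
          type: keyword
        - name: type
          type: keyword
        - name: hash
          type: keyword
    - name: process
      type: group
      fields:
        - name: title
          type: keyword
        - name: program
          type: keyword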

Hi Noémi,
Thanks for responding to my query.

  • The Filebeat debug output shows that most of the input files didn't change, yet the data does not show up in Kibana as expected.

  • Filebeat keeps harvesting the Filebeat and Kibana log files themselves (see the input sketch after this list).

  • My fields.yml should be quite short; I just couldn't figure out its structure.
    Can you say how to add just the requested fields, as in this example:
    "date":"id": "date"
    "error.message":"id": "string"
    "process.title":"id": "string"
    "event.action":"id": "string"
    "process.program":"id": "string"
    "event.type":"id": "string"
    "event.hash":"id": "string"

Thanks.

Do you want to add these fields to the existing fields, or add only these fields without the fields provided by Filebeat?

For now, I need to test only those fields.
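
If only those fields should end up in the template, one option (a sketch, assuming the setup.template settings of Filebeat 6.x/7.x and a placeholder file name) is to point the template setup at the short custom fields file and overwrite the template that was already loaded:

-- filebeat.yml template sketch:

setup.template.fields: "custom-fields.yml"   # placeholder path to the short fields file
setup.template.overwrite: true               # replace the previously loaded template

Note that an already-created filebeat-* index keeps its old mapping; the index would need to be deleted or reindexed, and the index pattern refreshed in Kibana, before only the new fields show up.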

Next, I will need to add fields per log file type.

I wish Filebeat could parse such log file patterns on its own.

Thanks.