How to customize Filebeat to parse a log?

I read the official docs and want to build my own Filebeat module to parse my log, but there are few articles out there that could help me.
For example, my log line is:

2020-09-17T15:48:56.998+0800 INFO chain chain/sync.go:705 block validation {"took": 0.111962369, "height": "590", "age": 147656.998959135}
Are there any articles I can read on building my own pipeline.yml or pipeline.json to parse this log?
I'm a rookie dev, please forgive my ignorance.
Thanks.

Hello,

There are three ways to do this:

  1. Use the dissect processor directly within Filebeat (first sketch below): https://www.elastic.co/guide/en/beats/filebeat/current/dissect.html
  2. Install Logstash and use a grok pattern (second sketch below): https://www.elastic.co/guide/en/logstash/master/plugins-filters-grok.html
  3. Use an Elasticsearch ingest pipeline (third sketch below): https://www.elastic.co/guide/en/elasticsearch/reference/master/grok-processor.html
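For option 1, a minimal sketch of a dissect processor in filebeat.yml, assuming the six fields in your log line are tab-separated (common for Go zap-style console output; swap the \t delimiters for spaces if your file differs). The key names and the decode_json_fields step are illustrative, not required:

processors:
  - dissect:
      # <timestamp> <level> <logger> <caller> <msg> <json payload>
      tokenizer: "%{timestamp}\t%{level}\t%{logger}\t%{caller}\t%{msg}\t%{payload}"
      field: "message"
      target_prefix: "dissect"
  - decode_json_fields:
      # Turn the trailing {"took": ..., "height": ..., "age": ...} string into an object
      fields: ["dissect.payload"]
      target: "dissect.details"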
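For option 2, a comparable Logstash filter sketch using grok plus the json filter, under the same tab-separated assumption; the capture names are examples only:

filter {
  grok {
    # TIMESTAMP_ISO8601 matches timestamps like 2020-09-17T15:48:56.998+0800
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\t%{LOGLEVEL:level}\t%{DATA:logger}\t%{NOTSPACE:caller}\t%{DATA:msg}\t%{GREEDYDATA:payload}" }
  }
  json {
    # Decode the trailing JSON object into its own subtree
    source => "payload"
    target => "details"
  }
}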
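For option 3, the same idea expressed as an Elasticsearch ingest pipeline, with a grok processor followed by a json processor; field names are again illustrative:

{
  "description": "Parse timestamp, level, logger, caller, msg and JSON payload",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp}\t%{LOGLEVEL:level}\t%{DATA:logger}\t%{NOTSPACE:caller}\t%{DATA:msg}\t%{GREEDYDATA:payload}"]
      }
    },
    {
      "json": {
        "field": "payload",
        "target_field": "details"
      }
    }
  ]
}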

Best regards
Wolfram

Yeah, I created my own Filebeat module and a fileset. Then, in the ingest directory, I modified the processors to use dissect, but in Kibana Discover I found it didn't work. To rule out a mistake in my own code, I tried a simple 'remove' processor instead, but it still didn't work:
{ "description": "Pipeline for parsing filecoin logdata logs", "processors": [ { "remove": { "field": "ecs" } } ], "on_failure" : [{ "set" : { "field" : "error.message", "value" : "{{ _ingest.on_failure_message }}" } }] }
I also checked manifest.yml: it contains ingest_pipeline: ingest/pipeline.json, which is the right setting. After running make update, I enabled the module in the modules.d directory.
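For comparison, a fileset manifest.yml usually has roughly this shape (the variable name and path below are placeholders, not taken from the original post):

module_version: "1.0"

var:
  - name: paths
    default:
      - /var/log/filecoin/*.log   # placeholder path

ingest_pipeline: ingest/pipeline.json
input: config/log.yml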
Finally, I checked Kibana Discover but found it still didn't work (I tested with freshly ingested data): the 'ecs' field still exists.
Consequently, how can I set my own custom fields from the source 'message' field?
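For reference, a pipeline.json sketch that does what this question asks, mapping the source 'message' into custom fields with a dissect processor; the tab-separated assumption from the earlier sketches applies, and the filecoin.* field names are placeholders:

{
  "description": "Pipeline for parsing filecoin logdata logs",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{filecoin.timestamp}\t%{filecoin.level}\t%{filecoin.logger}\t%{filecoin.caller}\t%{filecoin.msg}\t%{filecoin.payload}"
      }
    },
    {
      "json": {
        "field": "filecoin.payload",
        "target_field": "filecoin.details"
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "field": "error.message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    }
  ]
}

Note that after editing pipeline.json you may need to reload the pipeline into Elasticsearch (for example with filebeat setup --pipelines) and then check only newly ingested documents; existing documents are never reprocessed.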
Best wishes.
