Custom module pipeline is not working on Kibana

I was trying to generate a custom module named 'unixhops' with a fileset named 'access'.

Steps I followed:

  1. Downloaded the GitHub repo
  2. `make create-module MODULE={module}`
  3. Tested the pipeline with `go run main.go -elasticsearch http://10.0.50.100:9200 -pipeline /root/go/src/github.com/elastic/beats/filebeat/module/unixhops/access/ingest/pipeline.json -logfile test.log --simulate.verbose --verbose`; it passed.
  4. `make create-fields MODULE={module} FILESET={fileset}`
  5. `make update`

```
:~/go/src/github.com/elastic/beats/filebeat# make update
mage update
Generated fields.yml for filebeat to /root/go/src/github.com/elastic/beats/filebeat/fields.yml
No fields files for module apache2
Generated fields.yml for filebeat to /root/go/src/github.com/elastic/beats/filebeat/fields.yml

Building filebeat.yml for linux/amd64
Building filebeat.reference.yml for linux/amd64
Building filebeat.docker.yml for linux/amd64
Generated fields.yml for filebeat to /root/go/src/github.com/elastic/beats/filebeat/build/fields/fields.all.yml
```

I have no idea where to put the fields.all.yml file; the documentation seems to assume that anyone generating a custom Filebeat module must be an ex-Elastic employee.

Anyway, I copied the generated module directory (/root/go/src/github.com/elastic/beats/filebeat/module/unixhops) into the installed Filebeat's module directory (/usr/share/filebeat/module) and enabled the module.
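For reference, the copy-and-enable step can be done with Filebeat's own `modules` CLI. This is just a sketch of the commands involved; the paths are the ones mentioned above and may differ on your install:

```
cp -r /root/go/src/github.com/elastic/beats/filebeat/module/unixhops \
      /usr/share/filebeat/module/
filebeat modules enable unixhops
filebeat modules list   # 'unixhops' should now appear under Enabled
```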

Kibana is able to pull the log lines, but it does not detect any of the custom fields: the pipeline's grok patterns are not matching at all.

Here is the pipeline configuration:

```json
{
  "description": "Pipeline for parsing Unixhops HTTP Server access logs. Requires the geoip and user_agent plugins.",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{IP:unixhops.access.ip} \\[%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE} %{INT}\\] %{WORD:unixhops.access.loglevel} %{WORD:unixhops.access.access_id} \"%{WORD:unixhops.access.method} %{DATA:unixhops.access.url_original} HTTP/%{NUMBER:unixhops.access.http_version}\" %{NUMBER:unixhops.access.http_response_status_code:long} %{NUMBER:unixhops.access.byte1} %{NUMBER:unixhops.access.byte2} \"\" \"%{DATA:unixhops.access.user_agent}\" %{GREEDYDATA:unixhops.access.request_start} %{GREEDYDATA:unixhops.access.request_end}",
          "%{IP:unixhops.access.ip} \\[%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE} %{INT}\\] %{LOGLEVEL:unixhops.access.loglevel} %{WORD:unixhops.access.access_id} \"Request received!\"",
          "%{IP:unixhops.access.ip} \\[%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE} %{INT}\\] %{LOGLEVEL:unixhops.access.loglevel} %{WORD:unixhops.access.access_id} \"Execution Starts - %{DATA:unixhops.access.url_original}\"",
          "%{IP:unixhops.access.ip} \\[%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE} %{INT}\\] %{LOGLEVEL:unixhops.access.loglevel} %{WORD:unixhops.access.access_id} \"Execution Ends - %{DATA:unixhops.access.url_original} tooks %{GREEDYDATA:unixhops.access.time_taken}\""
        ],
        "ignore_missing": true
      }
    },
    { "remove": { "field": "message" } },
    { "rename": { "field": "@timestamp", "target_field": "event.created" } },
    {
      "date": {
        "field": "unixhops.access.time",
        "target_field": "@timestamp",
        "formats": ["dd/MMM/yyyy:H:m:s Z"],
        "ignore_failure": true
      }
    },
    { "remove": { "field": "unixhops.access.time", "ignore_failure": true } }
  ],
  "on_failure": [
    { "set": { "field": "error.message", "value": "{{ _ingest.on_failure_message }}" } }
  ]
}
```
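Since the pipeline passed the local `go run ... -simulate` test but fails in production, one way to replay the exact failing document is Elasticsearch's Simulate Pipeline API (for example from Kibana Dev Tools), which with `?verbose=true` reports the result of each processor. A minimal sketch, with the pipeline body abridged to `...` (paste the full pipeline definition from above there):

```
POST _ingest/pipeline/_simulate?verbose=true
{
  "pipeline": { ... },
  "docs": [
    {
      "_source": {
        "message": "203.189.181.129 [27/May/2019:05:26 +0000] ACCESS BE3A4491 \"GET /v1.0/locations/ HTTP/1.1\" 200 0 0 \"\" \"Apache-HttpClient/4.5.6 (Java/1.8.0_201)\" 5.894 5.853"
      }
    }
  ]
}
```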

Here is the Kibana output:

```
@timestamp              May 29, 2019 @ 23:20:11.478
_id                     cmy2BGsB3nqo1iWWvpd_
_index                  filebeat-7.1.0
_score                  -
_type                   _doc
agent.ephemeral_id      70efc0c5-5c19-47d5-849c-07897b90ad83
agent.hostname          elk-filebeat
agent.id                f3af7ad9-6715-4e33-ab92-b26b398086a2
agent.type              filebeat
agent.version           7.1.0
ecs.version             1.0.0
error.message           Provided Grok expressions do not match field value: [203.189.181.129 [27/May/2019:05:26 +0000] ACCESS BE3A4491 \"GET /v1.0/locations/ HTTP/1.1\" 200 0 0 \"\" \"Apache-HttpClient/4.5.6 (Java/1.8.0_201)\" 5.894 5.853]
event.dataset           unixhops.access
event.module            unixhops
fileset.name            access
host.architecture       x86_64
host.containerized      false
host.hostname           elk-filebeat
host.id                 b64bdeba70914c4d95c1b654d55e04a7
host.name               elk-filebeat
host.os.codename        xenial
host.os.family          debian
host.os.kernel          4.4.0-148-generic
host.os.name            Ubuntu
host.os.platform        ubuntu
host.os.version         16.04.6 LTS (Xenial Xerus)
input.type              log
log.file.path           /var/log/ecp_cloud/ecp_cloud.log
log.offset              9,364
message                 203.189.181.129 [27/May/2019:05:26 +0000] ACCESS BE3A4491 "GET /v1.0/locations/ HTTP/1.1" 200 0 0 "" "Apache-HttpClient/4.5.6 (Java/1.8.0_201)" 5.894 5.853
service.type            unixhops
```

Please note

  1. Filebeat is installed on a separate server from the other ELK stack components
  2. While executing the first command, `make create-module MODULE={module}`, the {module}/_meta/kibana directory was never created

Any help will be appreciated.

It looks like it's having trouble because the log lines it's seeing don't match the grok pattern. It's trying to match this pattern:

```
%{IP:unixhops.access.ip} \\[%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE} %{INT}\\] %{WORD:unixhops.access.loglevel} %{WORD:unixhops.access.access_id} \"%{WORD:unixhops.access.method} %{DATA:unixhops.access.url_original} HTTP/%{NUMBER:unixhops.access.http_version}\" %{NUMBER:unixhops.access.http_response_status_code:long} %{NUMBER:unixhops.access.byte1} %{NUMBER:unixhops.access.byte2} \"\" \"%{DATA:unixhops.access.user_agent}\" %{GREEDYDATA:unixhops.access.request_start} %{GREEDYDATA:unixhops.access.request_end}
```

against this input line:

```
203.189.181.129 [27/May/2019:05:26 +0000] ACCESS BE3A4491 "GET /v1.0/locations/ HTTP/1.1" 200 0 0 "" "Apache-HttpClient/4.5.6 (Java/1.8.0_201)" 5.894 5.853
```

and failing, so this may well be an issue with the pattern itself. Perhaps try the failing line against each pattern in Kibana's Grok Debugger (under Dev Tools)?
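To narrow down *where* a grok pattern stops matching, one trick is to replay it locally with simplified regex stand-ins for the grok primitives and grow the pattern one token at a time. This is only a rough sketch: the stand-ins below (e.g. for `%{IP}`) are looser than Elasticsearch's real grok definitions, so a local match does not guarantee grok will match, but a local *failure* points straight at the offending token.

```python
import re

# Simplified regex stand-ins for the grok primitives used in the pipeline.
# Assumption: close enough for locating a mismatch, not for proving a match.
GROK = {
    "IP":         r"\d{1,3}(?:\.\d{1,3}){3}",
    "MONTHDAY":   r"\d{1,2}",
    "MONTH":      r"[A-Za-z]{3}",
    "YEAR":       r"\d{4}",
    "HOUR":       r"\d{1,2}",
    "MINUTE":     r"\d{2}",
    "INT":        r"[+-]?\d+",
    "WORD":       r"\w+",
    "NUMBER":     r"\d+(?:\.\d+)?",
    "DATA":       r".*?",
    "GREEDYDATA": r".*",
}

# The first pipeline pattern, split into tokens so we can see how far it gets
# (field names dropped; only the shape of each token matters here).
TOKENS = [
    r"%{IP}",
    r" \[%{MONTHDAY}/%{MONTH}/%{YEAR}:%{HOUR}:%{MINUTE} %{INT}\]",
    r" %{WORD}",            # loglevel
    r" %{WORD}",            # access_id
    r' "%{WORD} %{DATA} HTTP/%{NUMBER}"',
    r" %{NUMBER} %{NUMBER} %{NUMBER}",
    r' ""',
    r' "%{DATA}"',          # user agent
    r" %{GREEDYDATA} %{GREEDYDATA}",
]

# The failing line from error.message above.
LINE = ('203.189.181.129 [27/May/2019:05:26 +0000] ACCESS BE3A4491 '
        '"GET /v1.0/locations/ HTTP/1.1" 200 0 0 "" '
        '"Apache-HttpClient/4.5.6 (Java/1.8.0_201)" 5.894 5.853')

def expand(pattern: str) -> str:
    """Replace %{NAME} references with their regex stand-ins."""
    return re.sub(r"%\{(\w+)\}", lambda m: GROK[m.group(1)], pattern)

def longest_matching_prefix(line: str) -> int:
    """Return how many leading tokens still match the start of the line."""
    matched = 0
    for i in range(1, len(TOKENS) + 1):
        if re.match(expand("".join(TOKENS[:i])), line):
            matched = i
    return matched

if __name__ == "__main__":
    n = longest_matching_prefix(LINE)
    print(f"{n} of {len(TOKENS)} tokens match; first failing token:",
          TOKENS[n] if n < len(TOKENS) else "none")
```

With these loose stand-ins the sample line happens to match every token, which suggests the production failure comes from somewhere the approximation hides: the stricter real grok definitions, or the escaping in pipeline.json. The Grok Debugger works with the real definitions, so it can confirm either way.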

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.