Tag a message on the Filebeat side to be able to filter in Kibana (HTTP response codes)

Hi,

I have this configuration:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/messages
    - /var/log/secure
    - /var/log/audit/audit.log
    - /var/log/yum.log
    - /root/.bash_history
    - /var/log/neutron/*.log
    - /var/log/nova/*.log
    - /var/log/keystone/keystone.log
    - /var/log/httpd/error_log
    - /var/log/mariadb/mariadb.log
    - /var/log/glance/*.log
    - /var/log/rabbitmq/*.log
  ignore_older: 72h
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
output.logstash:
  hosts: ["sdsds"]

I would like to tag a log line if it contains the following pattern:
message:*INFO*HTTP*200*
I want to create a query in Kibana that filters based on an HTTP response code tag. How can I create this? Can you help me create the condition with tags?
These response codes appear in the nova-api and neutron-server logs.
And I don't actually want to filter the logs out — I want everything in Elasticsearch; I just want to add a tag to these kinds of log lines.
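
For example, in the Kibana search bar I would like to be able to run something roughly like this (httpresponsecode is a field name I just made up — it doesn't exist yet):

```
httpresponsecode:404
```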

I managed to figure out something, but I'm not sure it's the best way to list them, because I have many response codes:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/messages
    - /var/log/secure
    - /var/log/audit/audit.log
    - /var/log/yum.log
    - /root/.bash_history
    - /var/log/neutron/*.log
    - /var/log/keystone/keystone.log
    - /var/log/httpd/error_log
    - /var/log/mariadb/mariadb.log
    - /var/log/glance/*.log
    - /var/log/rabbitmq/*.log
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  include_lines: ["status: 200"]
  fields_under_root: true
  fields:
    httpresponsecode: 200
  ignore_older: 72h
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
output.logstash:

Do I have to repeat these four lines for every response code?

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/messages
    - /var/log/secure
    - /var/log/audit/audit.log
    - /var/log/yum.log
    - /root/.bash_history
    - /var/log/keystone/keystone.log
    - /var/log/neutron/*.log
    - /var/log/httpd/error_log
    - /var/log/mariadb/mariadb.log
    - /var/log/glance/*.log
    - /var/log/rabbitmq/*.log
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 200"]
  fields:
    httpresponsecode: 200
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 202"]
  fields:
    httpresponsecode: 202
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 204"]
  fields:
    httpresponsecode: 204
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 207"]
  fields:
    httpresponsecode: 207
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 403"]
  fields:
    httpresponsecode: 403
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 404"]
  fields:
    httpresponsecode: 404
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["status: 500"]
  fields:
    httpresponsecode: 500
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  fields_under_root: true
  include_lines: ["HTTP 503"]
  fields:
    httpresponsecode: 503
  ignore_older: 72h
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
output.logstash:
  hosts: [

What is the best way to do this across multiple files and multiple response codes?
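
For what it's worth, since the include_lines entries are regular expressions, I suppose a single prospector could at least match all of the codes with one alternation pattern — though it still could not attach a different httpresponsecode field per code, so this is only a partial sketch:

```yaml
- type: log
  enabled: true
  paths:
    - /var/log/nova/*.log
  # include_lines entries are regexps, so one alternation covers all codes
  include_lines: ['status: (200|202|204|207|403|404|500)']
```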

Could you please share a few example logs you would like to tag?
I am afraid the tagging might need to happen on the Logstash or Elasticsearch side.

Also, do not configure the same path in different prospectors.

2018-11-27 08:19:57.587 116784 INFO nova.metadata.wsgi.server 10.118.220.2,10.118.220.228 "GET /metadata/v1/maintenance HTTP/1.1" status: 404 len: 176 time: 0.0033870
2018-11-28 08:21:51.402 116783 INFO nova.metadata.wsgi.server 10.118.220.2,10.118.220.228 "GET /metadata/v1/maintenance HTTP/1.1" status: 404 len: 176 time: 0.0038850
2018-11-28 14:11:55.992 116784 INFO nova.metadata.wsgi.server 10.118.220.2,10.118.220.228 "GET /metadata/v1/maintenance HTTP/1.1" status: 404 len: 176 time: 0.0041270

2018-11-22 04:02:07.951 116783 INFO nova.metadata.wsgi.server 10.118.220.2,10.118.220.228 "GET /latest/user-data/ HTTP/1.1" status: 200 len: 171 time: 0.0090339
2018-11-22 04:02:07.991 116785 INFO nova.metadata.wsgi.server 10.118.220.2,10.118.220.228 "GET /latest/user-data/ HTTP/1.1" status: 200 len: 171 time: 0.0037639
2018-11-22 04:02:08.132 116781 INFO nova.metadata.wsgi.server 10.118.220.2,10.118.220.228 "GET /latest/user-data/ HTTP/1.1" status: 200 len: 171 time: 0.0040569

My solution doesn't work: at the beginning it sends data, and then it stops completely.
When you say not to specify the same path in different prospectors, how can I work with the same files but apply different filters and tags?
I hope you can help me.

Unfortunately, Filebeat is not capable of this kind of filtering and parsing. I suggest you use Logstash instead: it has more advanced filtering, parsing, and data-enrichment capabilities than Filebeat.
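
For the sample lines you posted, a minimal Logstash filter sketch might look like this (the field name httpresponsecode and the tag http_response are just suggestions, not anything standard):

```
filter {
  # Pull the numeric status code out of lines like:
  #   ... "GET /latest/user-data/ HTTP/1.1" status: 200 len: 171 time: 0.0090339
  grok {
    match => { "message" => "status: %{NUMBER:httpresponsecode:int}" }
    tag_on_failure => []   # lines without a status code pass through untouched
  }
  # Tag the event only when a code was found, so nothing is dropped
  if [httpresponsecode] {
    mutate { add_tag => ["http_response"] }
  }
}
```

This keeps every event (nothing is filtered out) and lets you query or aggregate on httpresponsecode in Kibana.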

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.