Filebeat osquery module

ES - 6.2
LS - 6.2
Filebeat - osquery module
OS - Mac

Two questions:

  1. What is the benefit of running a module instead of just a prospector, besides the automatic index creation and visualizations? If those are not needed, why use a module at all?

  2. I am testing the osquery module for Filebeat. I have everything working, but all of my fields are prefixed with `json.`.

Example:

| Field | Value |
|---|---|
| json.columns.cmdline | /Applications/Microsoft Excel.app/Contents/MacOS/Microsoft Excel -psn_0_827594 |
| json.columns.cpu_time_kernel | 4170 |
| json.columns.cpu_time_user | 5420 |

How can I remove the `json.` from the front of all my field names?

modules.d/osquery.yml

- module: osquery
  result:
    enabled: true
    var.paths: ["/var/log/osquery/osqueryd.results.log"]

    # If true, all fields created by this module are prefixed with
    # `osquery.result`. Set to false to copy the fields in the root
    # of the document. The default is true.
    var.use_namespace: false

filebeat.yml

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

tags: ["osquery_filebeat_test"]

processors:
- drop_fields:
    fields: ["beat.name", "beat.hostname", "beat.version", "beat", "host", "input_type", "source", "prospector.type"]

#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s
#================================ Outputs =====================================


#--------------------------- Logstash output -------------------------------
output.logstash:
  hosts: ["DOMAIN.com:443"]
  ssl.certificate_authorities: ["/Applications/Filebeat/certs/cert.crt"]
  loadbalance: true
#  ttl: 300

#================================ Logging =====================================

logging.level: info
logging.to_files: true

Modules will also send your logs to an Elasticsearch ingest pipeline that processes them into structured fields (instead of shipping only the raw log message).
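To illustrate what that pipeline does, here is a minimal sketch of an Elasticsearch ingest pipeline that renames one of the fields shown above. The pipeline name `osquery-example` and the target field are hypothetical, not the module's actual pipeline definition:

```json
PUT _ingest/pipeline/osquery-example
{
  "description": "hypothetical sketch: move a parsed osquery field under a namespace",
  "processors": [
    {
      "rename": {
        "field": "json.columns.cmdline",
        "target_field": "osquery.result.columns.cmdline",
        "ignore_missing": true
      }
    }
  ]
}
```

The module's real pipeline is installed automatically when Filebeat outputs directly to Elasticsearch; when you output to Logstash instead, the ingest pipeline is not applied unless you configure the Elasticsearch output in Logstash to use it.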

Setting `var.use_namespace: true` will remove the `json.` prefix in favor of an `osquery.result` namespace.

So it's either `osquery.` or `json.` on all the fields. There is no way to get them exactly as they appear in the logs; the Filebeat module will always prepend something. Seems counterproductive.
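Since the logs already pass through Logstash, one workaround (a sketch, assuming the event structure shown in the table above) is a `ruby` filter that copies everything under `json` to the root of the event and then drops the `json` wrapper:

```conf
filter {
  ruby {
    code => '
      # Move every sub-field of "json" (e.g. json.columns.*) to the event root,
      # then remove the now-empty "json" field.
      j = event.get("json")
      if j.is_a?(Hash)
        j.each { |k, v| event.set(k, v) }
        event.remove("json")
      end
    '
  }
}
```

With this in place, `json.columns.cmdline` becomes `columns.cmdline`, matching the field names as osquery writes them.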

I am sending my logs to a Logstash node that processes them. What is the benefit of an ingest node? Right now I use a Logstash pipeline and a custom Elasticsearch template. What is the difference between that and a module?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.