Filebeat Logstash Module

Hi guys,

I'm trying to add a custom log to the Logstash module in Filebeat, but I can't get it to work. Can anybody please give me a hint about what's wrong in my config?

Example of Log Entry

2021-02-23T12:44:49.000Z,Warning,1 2021-02-23T12:44:49.000Z ntb81.company.com KES|11.0.0.0 - 0000013a [event@23668 et="0000013a" tdn="File Threat Protection" etdn="Object not processed" hdn="NTB81" hip="192.168.0.15" gn="0 - PEDERSEN KES 11"] Event type:     Object not processed\r\nApplication:     Windows Explorer\r\nApplication\Name:     explorer.exe\r\nApplication\Path:     C:\Windows\\r\nApplication\Process ID:     7988\r\nUser:     Company\user.last (Active user)\r\nComponent:     File Threat Protection\r\nResult\Description:     Untreated\r\nObject:     C:\Users\user.last\AppData\Roaming\Microsoft\Windows\Recent\AutomaticDestinations\a7bd71699cd38d1c.automaticDestinations-ms\r\nObject\Type:     File\r\nObject\Path:     C:\Users\user.last\AppData\Roaming\Microsoft\Windows\Recent\AutomaticDestinations\a7bd71699cd38d1c.automaticDestinations-ms\r\nObject\Name:     a7bd71699cd38d1c.automaticDestinations-ms\r\nReason:     Size\r\n
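For reference, the header fields of that line can be pulled out with a plain regex. Below is a purely illustrative Python sanity check (not part of my config): the field names mirror the grok pattern in my custom pipeline further down, with dots replaced by underscores, and the sample line is truncated after the first `Event type` field for brevity.

```python
import re

# Illustrative only: extract the header fields of the sample log line
# with a plain regex, roughly mirroring the grok pattern used in the
# custom ingest pipeline (dots in field names become underscores here).
line = (
    '2021-02-23T12:44:49.000Z,Warning,1 2021-02-23T12:44:49.000Z '
    'ntb81.company.com KES|11.0.0.0 - 0000013a '
    '[event@23668 et="0000013a" tdn="File Threat Protection" '
    'etdn="Object not processed" hdn="NTB81" hip="192.168.0.15" '
    'gn="0 - PEDERSEN KES 11"] Event type:     Object not processed'
)  # truncated after the first field for brevity

pattern = re.compile(
    r'(?P<eventLogTime>[^,]+),(?P<Severity>\w+),1 \S+ (?P<sourcehost>\S+)'
    r'.*?\btdn="(?P<Module_Name>[^"]*)"'
    r'.*?etdn="(?P<Module_Info>[^"]*)" hdn="(?P<CompName>[^"]*)"'
    r' hip="(?P<sourceip>[^"]*)" gn="(?P<group>[^"]*)"\] '
    r'Event type:\s+(?P<log>.*)'
)

m = pattern.match(line)
for name, value in m.groupdict().items():
    print(f'{name}: {value}')
```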

Filebeat.yml config

>     # ============================== Filebeat inputs ===============================
> 
> filebeat.inputs:
> 
> # Each - is an input. Most options can be set at the input level, so
> # you can use different inputs for various configurations.
> # Below are the input specific configurations.
> 
> - type: log
> 
>   # Change to true to enable this input configuration.
>   enabled: true
> 
>   # Paths that should be crawled and fetched. Glob based paths.
>   paths:
>     - C:\ProgramData\Syslog Watcher\Storage\192.168.1.8\KSC.log
> setup.kibana:
> 
>   # Kibana Host
>   # Scheme and port can be left out and will be set to the default (http and 5601)
>   # In case you specify an additional path, the scheme is required: http://localhost:5601/path
>   # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
>   #host: "localhost:5601"
>   host: "https://192.168.1.70:5601"
>   ssl.enabled: true
>   #ssl.verification_mode: none
>   ssl.verification_mode: certificate
>   ssl.certificate_authorities: C:\Program Files\Winlogbeat\kibana.pem
>   ssl.certificate: C:\Program Files\Winlogbeat\ca\ca.crt
>   ssl.key: C:\Program Files\Winlogbeat\ca\ca.key
> 
> # ---------------------------- Elasticsearch Output ----------------------------
> output.elasticsearch:
>   # Array of hosts to connect to.
>   hosts: ["192.168.1.70:9200"]
> 
>   # Protocol - either `http` (default) or `https`.
>   protocol: "https"
>   ssl.verification_mode: certificate
>   ssl.certificate_authorities: C:\Program Files\Winlogbeat\http2.pem
> 
>   # Authentication credentials - either API key or username/password.
>   #api_key: "id:api_key"
>   #username: "elastic"
>   #password: "changeme"
>   username: "elastic"
>   password: "password!"
> 
> # ------------------------------ Logstash Output -------------------------------
> #output.logstash:
>   # The Logstash hosts
>   #hosts: ["localhost:5044"]
> 
>   # Optional SSL. By default is off.
>   # List of root certificates for HTTPS server verifications
>   #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
> 
>   # Certificate for SSL client authentication
>   #ssl.certificate: "/etc/pki/client/cert.pem"
> 
>   # Client Certificate Key
>   #ssl.key: "/etc/pki/client/cert.key"
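To rule out basic problems, I already verify the config and the Elasticsearch connection with Filebeat's built-in test subcommands (run from the Filebeat install directory on Windows; `-c` just points at the config above):

```shell
# Validate filebeat.yml syntax and settings
.\filebeat.exe test config -c filebeat.yml

# Verify the connection to the configured Elasticsearch output
.\filebeat.exe test output -c filebeat.yml
```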

Logstash module config (modules.d\logstash.yml)

> 
> - module: logstash
>   # logs
>   log:
>     enabled: true
> 
>     # Set custom paths for the log files. If left empty,
>     # Filebeat will choose the paths depending on your OS.
>     var.paths: ['C:\ProgramData\Syslog Watcher\Storage\192.168.1.8\KSC.log']
>     input:
>       pipeline: C:\Program Files\Filebeat\module\logstash\log\ingest\pipeline.yml
> 
>   # Slow logs
>   slowlog:
>     enabled: false
>     # Set custom paths for the log files. If left empty,
>     # Filebeat will choose the paths depending on your OS.
>     #var.paths:
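For completeness, the steps I used to activate the module and load its ingest pipelines into Elasticsearch (assuming the default modules.d layout):

```shell
# Enable the logstash module (renames modules.d\logstash.yml.disabled)
.\filebeat.exe modules enable logstash

# List enabled modules to confirm
.\filebeat.exe modules list

# Load the module's ingest pipelines into Elasticsearch
.\filebeat.exe setup --pipelines --modules logstash
```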

Logstash Pipeline.yml (C:\Program Files\Filebeat\module\logstash\log\ingest\pipeline.yml)

> processors:
> - set:
>     field: event.ingested
>     value: '{{_ingest.timestamp}}'
> - rename:
>     field: '@timestamp'
>     target_field: event.created
> - grok:
>     field: message
>     patterns:
>       - '%{TIMESTAMP_ISO8601:eventLogTime},%{WORD:Severity},(1) %{TIMESTAMP_ISO8601} %{IPORHOST:sourcehost}.*tdn=\"%{DATA:Module.Name}\".*etdn=\"%{DATA:Module.Info}\" hdn=\"%{DATA:CompName}\" hip=\"%{IPORHOST:sourceip}\" gn=\"%{DATA:group}\"] Event type: %{GREEDYDATA:log}'
> - pipeline:
>     if: ctx.first_char != '{'
>     name: '{< IngestPipeline "pipeline-plaintext" >}'
> - pipeline:
>     if: ctx.first_char == '{'
>     name: '{< IngestPipeline "pipeline-json" >}'
> - remove:
>     field:
>       - first_char
> on_failure:
>   - set:
>       field: error.message
>       value: '{{ _ingest.on_failure_message }}'
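One way to check whether the grok pattern itself matches the log line is the Elasticsearch Simulate Pipeline API, which runs processors against a sample document without indexing anything. A minimal sketch (host and user taken from my filebeat.yml above; curl prompts for the password, `-k` skips TLS verification for the test, and the sample message is shortened):

```shell
curl -k -u elastic -X POST "https://192.168.1.70:9200/_ingest/pipeline/_simulate" \
  -H 'Content-Type: application/json' -d'
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{TIMESTAMP_ISO8601:eventLogTime},%{WORD:Severity},(1) %{TIMESTAMP_ISO8601} %{IPORHOST:sourcehost}.*tdn=\"%{DATA:Module.Name}\".*etdn=\"%{DATA:Module.Info}\" hdn=\"%{DATA:CompName}\" hip=\"%{IPORHOST:sourceip}\" gn=\"%{DATA:group}\"] Event type: %{GREEDYDATA:log}"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "2021-02-23T12:44:49.000Z,Warning,1 2021-02-23T12:44:49.000Z ntb81.company.com KES|11.0.0.0 - 0000013a [event@23668 et=\"0000013a\" tdn=\"File Threat Protection\" etdn=\"Object not processed\" hdn=\"NTB81\" hip=\"192.168.0.15\" gn=\"0 - PEDERSEN KES 11\"] Event type:     Object not processed" } }
  ]
}'
```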