Hello @Brandon_Kobel. We are using Filebeat on Windows to forward messages to ELK, and the messages are coming in as-is from the JSON index. I am wondering if we need to change anything in the filebeat.yml file. I am pasting the configuration below; please take a look at it and let me know if I need to change anything in order to break the messages into individual JSON events. I have also put a rough sketch of what I think might be missing after the config.
filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    - C:\ProgramData\Qlik\Sense\Log\Script*.log
    - C:\ProgramData\Qlik\Sense\Log\BrokerService*.log
    - C:\ProgramData\Qlik\Sense\Log\HubService*.log
    - C:\ProgramData\Qlik\Sense\Log\DataProfiling*.log
    - C:\ProgramData\Qlik\Sense\Log\CapabilityService*.log
    - C:\ProgramData\Qlik\Sense\Log\AppMigration*.log
    - C:\ProgramData\Qlik\Sense\Log\Engine\Audit*.txt
    - C:\ProgramData\Qlik\Sense\Log\Engine\Engine\Audit*.log
    - C:\ProgramData\Qlik\Sense\Log\Engine\Engine\System*.log
    - C:\ProgramData\Qlik\Sense\Log\Engine\Engine*.log
    - C:\ProgramData\Qlik\Sense\Log\Engine\System*.txt
    - C:\ProgramData\Qlik\Sense\Log\Engine\Trace*.txt
    - C:\ProgramData\Qlik\Sense\Log\Scheduler\Audit*.txt
    - C:\ProgramData\Qlik\Sense\Log\Scheduler\System*.txt
    - C:\ProgramData\Qlik\Sense\Log\Scheduler\Trace*.txt
    - C:\ProgramData\Qlik\Sense\Log\AboutService*.log
    - C:\ProgramData\Qlik\Sense\Log\Proxy\Audit*.txt
    - C:\ProgramData\Qlik\Sense\Log\Proxy\System*.txt
    - C:\ProgramData\Qlik\Sense\Log\Proxy\Trace*.txt
    - C:\ProgramData\Qlik\Sense\Log\Repository\Audit*.txt
    - C:\ProgramData\Qlik\Sense\Log\Repository\System*.txt
    - C:\ProgramData\Qlik\Sense\Log\Repository\Trace*.txt
    - C:\ProgramData\Qlik\Sense\Log\Printing\System*.txt
    - C:\ProgramData\Qlik\Sense\Log\Printing\Trace*.txt
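  # (Note: /var/log/*.log above is the Linux default left over from the sample
  # config; on Windows it will never match anything and could probably be removed.)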
  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ["^DBG"]

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ["^ERR", "^WARN"]

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: [".gz$"]
  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering.
  fields:
    component: crs
    logtype: qlklog
  fields_under_root: true
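  # (With fields_under_root enabled, these custom fields are stored as top-level
  # fields in the output document instead of being grouped under a "fields"
  # sub-dictionary, and they overwrite any same-named fields added by Filebeat.)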
  # Type to be published in the 'type' field. For Elasticsearch output,
  # the type defines the document type these entries should be stored
  # in. Default: log
  document_type: qlklog
  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java stack traces or C-style line continuation.

  # The regexp pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
  # that was (not) matched before or after, or as long as a pattern is not matched based on negate.
  # Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
  #multiline.match: after
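  # As a worked example (not enabled here), joining the indented continuation
  # lines of a Java stack trace onto the preceding event would look like:
  #   multiline.pattern: '^[[:space:]]'
  #   multiline.negate: false
  #   multiline.match: after
  # i.e. consecutive lines that start with whitespace are appended to the last
  # line that did not.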
- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    - E:\goPII\logs\pii.log
    - E:\goPII\logs\metrics.log
  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering.
  fields:
    component: crs
    logtype: gopii
  fields_under_root: true
processors:
  - drop_fields:
      fields: ["host"]
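# (drop_fields removes the listed fields from every exported event before it is
# published; the "@timestamp" and "type" fields cannot be dropped.)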
#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging
#================================ Outputs =====================================

# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.

#------------------------- Kafka output --------------------------------
output.kafka:
  hosts: ["10.29.42.141:9092"]

  # Message topic selection + partitioning.
  topic: 'waplab_logstash_crs_pda_clp'
  partition.round_robin:
    reachable_only: false

  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000
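  # (required_acks: 1 waits for a leader-only acknowledgement; 0 sends without
  # waiting for a broker response, and -1 waits for all replicas. Events whose
  # JSON-encoded size exceeds max_message_bytes are dropped.)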
#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"
#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging.level: info

# Send all logging output to files.
logging.to_files: true
logging.files:
  # Configure the path where the logs are written. The default is the logs directory
  # under the home path (the binary location).
  #path: /var/log/winlogbeat
  path: C:\QlikShare\Filebeats\log\

  # The name of the files where the logs are written to.
  name: filebeat

  # Configure log file size limit. If the limit is reached, the log file will be
  # automatically rotated.
  rotateeverybytes: 10485760 # = 10MB

  # Number of rotated log files to keep. Oldest files will be deleted first.
  keepfiles: 7

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]
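From reading the Filebeat docs, my current guess is that we are missing the prospector-level JSON decoding options, since without them each line ships as plain text in the message field (and the decoding only works when there is one JSON object per line). Here is a minimal, untested sketch of what I would add under each prospector ("message" as json.message_key is an assumption about our log format, and it only matters if we also want line filtering or multiline applied to that key):

  # Decode each log line as a JSON object and lift its keys to the top level.
  json.keys_under_root: true
  # Add an error key to the event if a line cannot be decoded as JSON.
  json.add_error_key: true
  # Key to apply line filtering and multiline to (assumed name, optional).
  #json.message_key: message

Does this look like the right direction, or does something else in the config above need to change?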