The former question arose because my Filebeat config did not create the JSON pipeline in Elasticsearch. I worked around it by modifying the default format.
Now I have new questions, as described in this issue:
I cannot create the JSON pipeline with the config shown in the documentation.
Q1:
Can anyone tell me where the default pipeline comes from,
or how I can use a custom pipeline for a specific module on the Filebeat side?
Q2:
I cloned the source code, but I cannot find where the module config is parsed.
update at
2018-12-06 09:34:09
===================
former question
===================
How to configure JSON decoding for the Filebeat 6.5.1 logstash module?
filebeat version: 6.5.1
elk : 6.5.1
I use the logstash module to transfer my application's logs to ELK,
but the logs are not parsed there.
How can I parse the JSON on the Filebeat side?
In Kibana, the message body is not parsed; instead it is treated as a plain string.
config :
// spring-logback.xml
<appender name="STASH" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>logback/center.log</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>logback/center.%d{yyyy-MM-dd}.log</fileNamePattern>
<maxHistory>7</maxHistory>
</rollingPolicy>
<encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>
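For reference, with this appender each log event is written as a single JSON object per line. The output looks roughly like this (the field values below are made up for illustration; the exact field set depends on the LogstashEncoder version):

```json
{"@timestamp":"2018-12-06T09:34:09.000+08:00","@version":"1","message":"user logged in","logger_name":"com.example.Center","thread_name":"main","level":"INFO","level_value":20000}
```

It is this whole JSON object that shows up unparsed as the `message` string in Kibana.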
// ./modules.d/logstash.yml
- module: logstash
  # logs
  log:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/Users/youzipi/projects/0/ace-security/logback/*.log"]
    # format: json  # I tried this; it did not work.
I also tried configuring JSON decoding with a processor:
processors:
  - decode_json_fields:
      fields: ["logstash"]
      target: ""
      overwrite_keys: true
but it throws an error:
Exiting: error unpacking config data: required 'object', but found 'string' in field 'processors.0.target' (source:'filebeat.yml')
The example in the documentation uses an empty string there, too.
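For comparison, this is the shape I would expect the processor config to take in `filebeat.yml` (an untested sketch — it assumes the raw JSON line lands in the `message` field, and that each option must be nested one level under `decode_json_fields`; `max_depth` is optional):

```yaml
# filebeat.yml — untested sketch, see assumptions above
processors:
  - decode_json_fields:
      fields: ["message"]   # field holding the raw JSON string
      target: ""            # decode into the event root
      overwrite_keys: true  # let decoded keys replace existing ones
      max_depth: 1
```

With this indentation, `processors.0` should be parsed as an object rather than a string, which is what the error message complains about.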