Hi, I am trying to make my own module, but I have a problem right at the beginning. I wrote my own pipeline.json file, but when I execute the following command I get this error:
filebeat -e --modules test -setup
2017/10/09 07:43:43.391721 beat.go:297: INFO Home path: [C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64] Config path: [C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64] Data path: [C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64\data] Logs path: [C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64\logs]
2017/10/09 07:43:43.391721 metrics.go:23: INFO Metrics logging every 30s
2017/10/09 07:43:43.392722 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.2
2017/10/09 07:43:43.393725 output.go:258: INFO Loading template enabled. Reading template file: C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64\filebeat.template.json
2017/10/09 07:43:43.395723 output.go:269: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64\filebeat.template-es2x.json
2017/10/09 07:43:43.398722 output.go:281: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: C:\Users\miroslav.kudlac\Desktop\filebeat-5.6.2-windows-x86_64\filebeat.template-es6x.json
2017/10/09 07:43:43.400721 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/09 07:43:43.401720 outputs.go:108: INFO Activated elasticsearch as output plugin.
2017/10/09 07:43:43.402720 publish.go:300: INFO Publisher name: FESK-LTP0089
2017/10/09 07:43:43.418726 async.go:63: INFO Flush Interval set to: 1s
2017/10/09 07:43:43.418726 async.go:64: INFO Max Bulk Size set to: 50
2017/10/09 07:43:43.457247 filebeat.go:46: INFO Enabled modules/filesets: test (error)
2017/10/09 07:43:43.491243 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/09 07:43:43.519241 client.go:667: INFO Connected to Elasticsearch version 5.6.2
2017/10/09 07:44:13.395145 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.es.publish.read_bytes=433 libbeat.es.publish.write_bytes=321
2017/10/09 07:44:43.408291 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/10/09 07:44:58.425241 beat.go:331: INFO Kibana dashboards successfully loaded.
2017/10/09 07:44:58.426238 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/09 07:44:58.443242 client.go:667: INFO Connected to Elasticsearch version 5.6.2
2017/10/09 07:44:58.446742 beat.go:233: INFO filebeat start running.
2017/10/09 07:44:58.447243 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/09 07:44:58.466247 client.go:667: INFO Connected to Elasticsearch version 5.6.2
2017/10/09 07:44:58.474251 metrics.go:51: INFO Total non-zero values: libbeat.es.publish.read_bytes=28358 libbeat.es.publish.write_bytes=141868
2017/10/09 07:44:58.474251 metrics.go:52: INFO Uptime: 1m15.0875224s
2017/10/09 07:44:58.475264 beat.go:237: INFO filebeat stopped.
2017/10/09 07:44:58.475264 beat.go:346: CRIT Exiting: Error loading pipeline for fileset test/error: couldn't load pipeline: couldn't load json. Error: 400 Bad Request. Response body: {"error":{"root_cause":[{"type":"parse_exception","reason":"[processors] required property is missing","header":{"property_name":"processors"}}],"type":"parse_exception","reason":"[processors] required property is missing","header":{"property_name":"processors"}},"status":400}
Exiting: Error loading pipeline for fileset test/error: couldn't load pipeline: couldn't load json. Error: 400 Bad Request. Response body: {"error":{"root_cause":[{"type":"parse_exception","reason":"[processors] required property is missing","header":{"property_name":"processors"}}],"type":"parse_exception","reason":"[processors] required property is missing","header":{"property_name":"processors"}},"status":400}
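If I understand the error correctly, Elasticsearch expects "description" and "processors" as top-level keys of the body sent to PUT _ingest/pipeline/<id>, and the parse_exception means it did not find "processors" there. A minimal valid body would look like this (the "set" processor here is just a placeholder example, not my actual pipeline):

{
  "description": "Minimal example pipeline",
  "processors": [
    {
      "set": {
        "field": "parsed",
        "value": true
      }
    }
  ]
}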
which is truly strange, because when I run the same pipeline through a REST client, everything works fine:
[POST] http://localhost:9200/_ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "Pipeline for parsing MySQL slow logs.",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "%{DATA:time} %{NUMBER:thread} \\[%{DATA:log_level}\\] %{GREEDYDATA:log_message}"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2017-09-26T13:10:01.850659Z 0 [Note] Plugin mysqlx reported: 'Server starts handling incoming connections'"
      }
    }
  ]
}
RESPONSE:
{"docs":[{"doc":{"_index":"_index","_type":"_type","_id":"_id","_source":{"log_level":"Note","log_message":"Plugin mysqlx reported: 'Server starts handling incoming connections'","time":"2017-09-26T13:10:01.850659Z","thread":"0","message":"2017-09-26T13:10:01.850659Z 0 [Note] Plugin mysqlx reported: 'Server starts handling incoming connections'"},"_ingest":{"timestamp":"2017-10-09T07:37:36.353Z"}}}]}
My pipeline.json file looks like this:
{
  "description": "Pipeline for parsing MySQL error logs.",
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "%{DATA:time} %{NUMBER:thread} \\[%{DATA:log_level}\\] %{GREEDYDATA:log_message}"
          ],
          "ignore_missing": false
        }
      }
    ]
  }
}
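Comparing the two, I now suspect that Filebeat sends the contents of pipeline.json directly to PUT _ingest/pipeline, without the "pipeline" wrapper that only the _simulate API needs, which would explain the missing-processors error. Under that assumption (just my guess, not verified), the file would have to look like this instead:

{
  "description": "Pipeline for parsing MySQL error logs.",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{DATA:time} %{NUMBER:thread} \\[%{DATA:log_level}\\] %{GREEDYDATA:log_message}"
        ],
        "ignore_missing": false
      }
    }
  ]
}

Can someone confirm whether that is the expected format for a fileset's pipeline.json?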
I have downloaded filebeat-5.6.2-windows-x86_64. As a next step, how do I configure which indices and dashboards/searches/visualizations should be created when I run my test module? I have been reading about import_dashboards; I see there is an import_dashboards.exe file in the package, and
https://www.elastic.co/guide/en/beats/libbeat/current/import-dashboards.html has a manual for importing dashboards. The question is how to link the module so that it knows which index should be created and which files should be imported via import_dashboards.
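From looking at the modules that ship with Filebeat 5.6 (this is my reading of their source, so treat the names below as assumptions), each fileset is described by a manifest.yml, and module-level dashboards live under _meta/kibana, roughly like this:

module/test/
  _meta/
    kibana/            dashboards, searches and visualizations loaded by -setup
  error/
    _meta/
      fields.yml       field definitions that end up in the index template
    config/
      error.yml        prospector configuration template
    ingest/
      pipeline.json
    manifest.yml

with a manifest.yml along these lines (the paths value is a placeholder for my log locations):

module_version: 1.0

var:
  - name: paths
    default:
      - c:/test/logs/error.log*

ingest_pipeline: ingest/pipeline.json
prospector: config/error.yml

And if I read the import-dashboards manual correctly, the dashboards directory could be loaded with something like:

import_dashboards.exe -dir C:\path\to\module\test\_meta\kibana -es http://localhost:9200

Is that the intended way to tie the index template and dashboards to a module?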