I am getting the following error whenever I try to ingest data into Elasticsearch using Filebeat:
2019-02-21T07:57:02.190Z INFO elasticsearch/client.go:721 Connected to Elasticsearch version 6.5.4
2019-02-21T07:57:02.193Z INFO template/load.go:130 Template already exists and will not be overwritten.
2019-02-21T07:57:02.193Z INFO instance/beat.go:894 Template successfully loaded.
2019-02-21T07:57:02.193Z INFO pipeline/output.go:105 Connection to backoff(elasticsearch(http://localhost:9200)) established
2019-02-21T07:57:04.513Z ERROR pipeline/output.go:121 Failed to publish events: temporary bulk send failure
2019-02-21T07:57:04.513Z INFO pipeline/output.go:95 Connecting to backoff(elasticsearch(http://localhost:9200))
2019-02-21T07:57:04.515Z INFO elasticsearch/client.go:721 Connected to Elasticsearch version 6.5.4
2019-02-21T07:57:04.519Z INFO template/load.go:130 Template already exists and will not be overwritten.
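For reference, the "temporary bulk send failure" message does not show the per-document error that Elasticsearch returned. One way to surface it (not part of the original setup, just a debugging sketch assuming the filebeat.yml shown below is in the current directory) is to run Filebeat in the foreground with debug selectors, which logs the full bulk response; -d "*" enables all selectors if the elasticsearch selector alone is not verbose enough:

filebeat -e -d "elasticsearch" -c filebeat.yml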
This is the pipeline.json used to define the ingest pipeline:
{
"description": "Pipeline for ingest node",
"processors": [
{
"grok": {
"field": "message",
"patterns": [
"%{IP:source_ip} %{GREEDYDATA} \\[%{HTTPDATE:request_date}\\] \\\"%{WORD:http_method} %{URIPROTO:http_proto}://%{URIHOST:uri_host}%{URIPATH:uri_path}%{GREEDYDATA:uri_query} http/%{NUMBER:http_version}\\\" %{NUMBER:response_code} %{NUMBER:bytes_sent:int} %{NUMBER:origin_response_code} %{NUMBER:origin_bytes_sent} %{NUMBER:client_req_content_length} %{NUMBER:proxy_req_length} %{NUMBER:client_req_header_length} %{NUMBER:proxy_resp_header_length} %{NUMBER:proxy_req_header_length} %{NUMBER:origin_header_resp_length} %{NUMBER:time_to_serve:} %{NUMBER:origin_time_to_serve:} %{WORD:proxy_hierarchy_route} %{WORD:finish_status_client} %{WORD:finish_status_origin} %{WORD:cache_result_code} \\\"%{GREEDYDATA:user_agent}\\\" %{GREEDYDATA:x_play_back_session_id}",
"%{IP:source_ip} %{GREEDYDATA} \\[%{HTTPDATE:request_date}\\] \\\"%{WORD:http_method} %{URIPROTO:http_proto}://%{URIHOST:uri_host}%{URIPATH:uri_path}%{GREEDYDATA:uri_query} http/%{NUMBER:http_version}\\\" %{NUMBER:response_code} %{NUMBER:bytes_sent:int} %{NUMBER:origin_response_code} %{NUMBER:origin_bytes_sent:int} %{NUMBER:client_req_content_length} %{NUMBER:proxy_req_length} %{NUMBER:client_req_header_length} %{NUMBER:proxy_resp_header_length} %{NUMBER:proxy_req_header_length} %{NUMBER:origin_header_resp_length} %{NUMBER:time_to_serve:} %{NUMBER:origin_time_to_serve:} %{WORD:proxy_hierarchy_route} %{WORD:finish_status_client} %{WORD:finish_status_origin} %{WORD:cache_result_code} %{GREEDYDATA:user_agent}"
],
"on_failure": [
{
"grok": {
"field": "message",
"patterns": ["%{IP:source_ip} %{GREEDYDATA} \\[%{HTTPDATE:request_date}\\] \\\"%{WORD:http_method} %{URIPROTO:http_proto}://%{URIHOST:uri_host}%{URIPATH:uri_path}%{GREEDYDATA:uri_query} http/%{NUMBER:http_version}\\\" %{NUMBER:response_code} %{NUMBER:bytes_sent} %{NUMBER:origin_response_code} %{NUMBER:origin_bytes_sent} %{NUMBER:client_req_content_length} %{NUMBER:proxy_req_length} %{NUMBER:client_req_header_length} %{NUMBER:proxy_resp_header_length} %{NUMBER:proxy_req_header_length} %{NUMBER:origin_header_resp_length} %{NUMBER:time_to_serve:} %{NUMBER:origin_time_to_serve:} %{WORD:proxy_hierarchy_route} %{WORD:finish_status_client} %{WORD:finish_status_origin} %{WORD:cache_result_code} %{GREEDYDATA:user_agent}"]
}
}
]
}
},
{
"convert": {
"field": "bytes_sent",
"type": "integer"
}
},
{
"dissect": {
"field": "uri_path",
"if": "(ctx.uri_path.contains(\"hls5\") && ctx.uri_path.contains(\"live\") && (ctx.uri_path.contains(\"m3u8\") || ctx.uri_path.contains(\"ts\"))) || (ctx.uri_path.contains(\"dash\") && ctx.uri_path.contains(\"live\") && ctx.uri_path.contains(\"m4s\"))",
"pattern": "/%{a}/%{protocol}/%{stream_type}/%{backend_channel_id}/%{e}/%{variant}/%{g}.%{h}"
}
},
{
"remove": {
"field": [
"a",
"e",
"g",
"h"
]
}
}
]
}
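Once the pipeline has been created (see the curl command below), the Simulate Pipeline API can be used to check whether the pipeline itself rejects the documents. The message value here is only a placeholder to be replaced with one raw line from the log files:

curl -H 'Content-Type: application/json' -X POST 'localhost:9200/_ingest/pipeline/test-pipeline/_simulate?pretty' -d '
{
  "docs": [
    { "_source": { "message": "<one raw log line goes here>" } }
  ]
}'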
This is the curl command that loads pipeline.json via the Put Pipeline API:
curl -H 'Content-Type: application/json' -X PUT 'localhost:9200/_ingest/pipeline/test-pipeline' -d@pipeline.json
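The pipeline definition that Elasticsearch actually stored can be checked afterwards (a sanity check, not part of the original setup):

curl 'localhost:9200/_ingest/pipeline/test-pipeline?pretty'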
This is the filebeat.yml file:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/dump/log/*
  exclude_lines:
    - thumbnail
    - pictures
    - health
    - stats
    - alerts
    - url_template
    - resource
    - config
  include_lines:
    - live
    - vod
    - data

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "test-pipeline-%{+yyyy.MM.dd}"
  pipeline: "test-pipeline"

setup.template.name: "test-pipeline"
setup.template.pattern: "test-pipeline*"

setup:
  kibana:
    host: "localhost:5601"
  dashboards:
    index: "test-pipeline*"