I see what you're saying here, but I don't really know how to actually see the messages coming into Elasticsearch.
I'm going to try setting up a Wireshark capture on the loopback interface to see the messages.
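Something like this is what I have in mind for the capture (the interface name and port are assumptions from my setup, adjust as needed):

```shell
# Capture loopback traffic to Elasticsearch on port 9200 and write it to a file.
# -s 0 keeps full packets so the HTTP bodies are not truncated.
tcpdump -i lo -s 0 -w es-bulk.pcap 'tcp port 9200'
# Then open es-bulk.pcap in Wireshark and use "Follow TCP Stream"
# to read the bulk request and the response side by side.
```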
EDIT: Here is a raw capture from my server, taken on the loopback interface (I'm only pasting one line of the request and the response):
POST /_bulk/?pipeline=geoip HTTP/1.1
Host: 9.8.7.6:9200
Content-Length: 18063
User-Agent: Fluent-Bit
Content-Type: application/x-ndjson
{"index":{"_index":"test-2019-03-25","_type":"flb_type"}}
{"@timestamp":"2019-03-25T15:14:14.038Z", "ID_Firewall":"MY-FIREWALL", "timestamp":"2019-03-25 16:14:14", "IP_Firewall":"1.1.1.1", "Niveau":"6", "Description":"Connection Opened", "IP_Source":"2.2.2.2", "Port_Source":"45724", "INT_Source":"X1", "NAT_Source":"3.3.3.3", "NAT_Port_Source":"45724", "IP_Destination":"4.4.4.4", "Port_Destination":"443", "INT_Destination":"X0", "NAT_Destination":"5.5.5.5", "NAT_Port_Destination":"443", "Protocole":"tcp/https"}
HTTP/1.1 200 OK
content-type: application/json; charset=UTF-8
content-length: 6901
{"took":41,"ingest_took":46,"errors":false,"items":[{"index":{"_index":"test-2019-03-25","_type":"flb_type","_id":"rFxqtWkBJGtfi-AKlbyT","_version":1,"result":"created","_shards":{"total":1,"successful":1,"failed":0},"_seq_no":41887,"_primary_term":1,"status":201}},{"index":{"_index":"test-2019-03-25","_type":"flb_type","_id":"rVxqtWkBJGtfi-AKlbyT","_version":1,"result":"created","_shards":{"total":1,"successful":1,"failed":0},"_seq_no":42084,"_primary_term":1,"status":201}}, ...
I also tested this with the pipeline configured, and I still got the exact same error:
{"index":{"_index":"test-2019-03-25","_type":"flb_type","_id":"X2V5tWkBJGtfi-AKnY0l","status":400,"error":{"type":"mapper_parsing_exception","reason":"failed to parse field [geolocalisation] of type [geo_point]","caused_by":{"type":"array_index_out_of_bounds_exception","reason":"0"}}}},
Sometimes the value is even out of range for the geo_point field!
,{"index":{"_index":"test-2019-03-25","_type":"flb_type","_id":"dmV5tWkBJGtfi-AKnY0l","status":400,"error":{"type":"mapper_parsing_exception","reason":"failed to parse field [geolocalisation] of type [geo_point]","caused_by":{"type":"illegal_argument_exception","reason":"illegal latitude value [-122.3321] for geolocalisation"}}}},
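To see what the pipeline produces on a single document without Fluent Bit in the middle, the simulate API can be used directly against the pipeline (a sketch; the pipeline name "geoip" and the field name IP_Source are from my setup):

```shell
# Run one sample document through the ingest pipeline and show each
# processor's output step by step (verbose=true).
curl -s -X POST 'http://9.8.7.6:9200/_ingest/pipeline/geoip/_simulate?verbose=true' \
  -H 'Content-Type: application/json' -d '
{
  "docs": [
    { "_source": { "IP_Source": "2.2.2.2" } }
  ]
}'
```

This shows the document after the geoip processor and again after the set processor, which should reveal exactly what ends up in the geolocalisation field.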
Does this mean the processor is working, but can't correctly write the field with the "set" processor when the messages come from a service other than Elasticsearch itself?
It's as if it can't find the "lat" and "lon" values when the document is processed this way.
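One thing I notice: -122.3321 looks like a longitude (latitude must be between -90 and 90), so the "set" processor may be assembling the value in the wrong order. For a geo_point, the string form is "lat,lon" while the array form is [lon, lat]. A sketch of how I'd define the pipeline to build the string form explicitly, guarded so it is skipped when the GeoIP lookup returns nothing (processor names and fields are assumptions from my setup; the "if" condition needs ES 6.5+):

```shell
# Define the pipeline so geolocalisation is written as "lat,lon",
# and only when the geoip processor actually produced a location.
curl -s -X PUT 'http://9.8.7.6:9200/_ingest/pipeline/geoip' \
  -H 'Content-Type: application/json' -d '
{
  "description": "GeoIP enrichment on IP_Source",
  "processors": [
    { "geoip": { "field": "IP_Source", "target_field": "geoip", "ignore_missing": true } },
    { "set": {
        "if": "ctx.geoip != null && ctx.geoip.location != null",
        "field": "geolocalisation",
        "value": "{{geoip.location.lat}},{{geoip.location.lon}}"
    } }
  ]
}'
```

That would also explain the array_index_out_of_bounds_exception: when the lookup finds nothing, the template renders an empty value that can't be split into coordinates.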