I monitor my application with Filebeat, which watches the application logs. To parse my custom log lines, I have an ingest pipeline in Elasticsearch (_ingest).
Some of the lines contain an IP address, others do not. I want to run geoip on the IP address, but the pipeline seems to fail on the lines that have none. How can I run geoip only when there is an ip field? I have tried to:
- create 2 prospectors and add a field with each. But the 2 prospectors would watch the same log file, which makes Filebeat fail (https://www.elastic.co/guide/en/beats/filebeat/current/multiple-prospectors.html); see the sketch after this list
- create a Filebeat processor, but I did not find a way to add a field with it
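For reference, this is roughly what the two-prospector attempt looked like (the paths and the has_ip field name below are placeholders, not my real config):

filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/myapp/app.log   # placeholder path
    fields:
      has_ip: true
  - input_type: log
    paths:
      - /var/log/myapp/app.log   # the same file again, which is what makes Filebeat fail
    fields:
      has_ip: false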
Any solution? I cannot imagine I have to use Logstash for this.
Here is my pipeline:
"processors": [
{
"grok": {
"field": "message",
"patterns": [
"%{TIMESTAMP_ISO8601:logdate},... %{LOGLEVEL:level} +~ %{EMAILLOCALPART:who}@%{HOSTNAME:company} ..%{IPORHOST:ip}..: .....%{URIPATHPARAM:route} performed in %{INT:req_ms:int} ms",
"%{TIMESTAMP_ISO8601:logdate},... %{LOGLEVEL:level} +~ perfutils layout#%{INT:decode_layout:int} msg#%{INT:decode_msg:int} decode:%{INT:decode_ms:int}ms IFTs:%{INT:ifts_ms:int}ms",
"%{TIMESTAMP_ISO8601:logdate},... %{LOGLEVEL:level} +~ %{GREEDYDATA:text}"
]
}
},
{
"geoip" : {
"field" : "ip"
}
}
]
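To make the question concrete, this is the kind of behaviour I am after; the ignore_missing option below is only a guess on my part, I have not found whether the geoip processor actually accepts something like it:

{
  "geoip": {
    "field": "ip",
    "ignore_missing": true
  }
}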