I am trying to import a CSV into Elasticsearch using Logstash. I have a field in the CSV which is an IP:Port pair. I would like to split the IP and port and assign them to two new keys through the Logstash config file.
I looked into the kv filter, but kv seems to require key=value pairs in the string; mine is only values.
Sample csv data:
hostname1,172.20.28.11:8080,proxy_web,false
hostname2,172.20.28.20:443,web_https,false
In the above case, I have already mapped the second field as "IP_Port" (using the csv filter's columns option). Now I want to split 172.20.28.11:8080 into IP=172.20.28.11 and port=8080 (key=IP, value=172.20.28.11, and likewise for the port), and then convert the IP field to the 'ip' datatype.
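This is roughly the kind of filter I have in mind for the split, though I have not tested it (the field name IP_Port comes from my csv columns mapping, and the output key names IP and port are my own choices):

```
filter {
  csv {
    # the second column holds the combined value
    columns => [ "hostname", "IP_Port", "service", "flag" ]
  }
  grok {
    # split "172.20.28.11:8080" into IP and port using the
    # built-in IP and POSINT grok patterns
    match => { "IP_Port" => "%{IP:IP}:%{POSINT:port}" }
  }
  mutate {
    # store the port as an integer rather than a string
    convert => { "port" => "integer" }
  }
}
```

I am not sure whether grok is the idiomatic choice here or whether a mutate split would be preferred.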
After that, if I want to use the extracted IP for geomapping, I can feed it into a geoip filter like the following snippet.
geoip {
  source => "IP"
  target => "geoip"
  database => "/etc/logstash/GeoLiteCity.dat"
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float" ]
}
Any suggestions on this?