Logstash help with geoip mapping from csv file

Hi, sorry for asking, but I went through the code, help, and FAQ and I still can't make it work. I have a CSV file with two fields that contain IP addresses. When importing directly from Kibana I can get these fields recognized as the IP type, but no latitude or longitude is added, even though I have the proper GeoIP database loaded into Elasticsearch (version 6.x). I tried going with this Logstash config file:
input {
  file {
    path => "c:/users/maki/downloads/dump.txt"
    start_position => "beginning"
    sincedb_path => "C:/users/maki/downloads/brojac.txt"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Flgs","RunTime","IdleTime","Proto","Sport","Dport","sTos","dTos","sDSb","dDSb","sTtl","dTtl","TotPkts","SrcPkts","DstPkts","TotBytes","SrcBytes","DstBytes","Load","SrcLoad","DstLoad","Loss","SrcLoss","DstLoss","pLoss","SrcGap","DstGap","Rate","SrcRate","DstRate","Dir","State","SrcWin","DstWin","SrcTCPBase","DstTCPBase","TcpRtt","SynAck","AckDat","sMeanPktSz","dMeanPktSz","SrcAddr","DstAddr","StartTime","LastTime"]
  }
  geoip {
    source => "SrcAddr","DstAddr"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "security4"
  }
  stdout {}
}
I was using the recommendation from this page:
https://www.elastic.co/blog/geoip-in-the-elastic-stack
where they said that a mutate filter or adding a new field is no longer needed, since the geoip filter now does this automatically. But when trying to import, I receive this error:
ConfigurationError", :message=>"Expected one of #, {, } at line 14, column 20 (byte 666) after filter
So if anyone is willing to help and has some spare time, it would be great.
Thank you in advance.

Try

geoip{ source => "SrcAddr" }

That will add a field called geoip which should show up in Kibana as a geo_point. Then use a second filter for the second field:

geoip { source => "DstAddr" target => "someField" }

That will not be a geo_point unless you add a mapping template. This thread might help with that. Then you can change the target of the first filter as well 🙂
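Putting the two suggestions together, a minimal sketch of the filter block might look like this (the target names src_geoip and dst_geoip are just illustrative, not required names):

filter {
  geoip {
    source => "SrcAddr"
    target => "src_geoip"
  }
  geoip {
    source => "DstAddr"
    target => "dst_geoip"
  }
}

Both target fields would then need geo_point mappings in an index template for Kibana's map visualizations to pick them up.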

Hi, thank you for the reply, I appreciate it!
I did:
geoip { source => "SrcAddr" }
and it worked. Then I tried to add below it:
geoip { source => "DstAddr" }
but that didn't work, so it would be nice to know how I can do geoip for two fields, SrcAddr and DstAddr.
Also, I didn't get what I expected: while I see results in Elasticsearch, they don't appear to be of the geoip type, and I can't do the mapping; it says there is no geohash field available in my new index. I have fields like:
geoip.country_name, geoip.longitude
but they have .keyword at the end. Also, when importing from Kibana, I can specify that the fields SrcAddr and DstAddr are of the IP type. When I did the mapping for my index, I could not edit or alter fields; I only had the option to create a new index pattern, and that was it.
Let me know if I should open a new thread, or whether we can continue here.
Best regards

If you want two geo_point fields you have to have a mapping template. The thread I linked to before should help you with that.
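As a rough sketch of such a template (assuming Elasticsearch 6.x, an index name starting with "security", and illustrative target names src_geoip and dst_geoip for the two geoip filters; the mapping type name, "doc" here, must match whatever type Logstash actually writes):

PUT _template/security_geoip
{
  "index_patterns": ["security*"],
  "mappings": {
    "doc": {
      "properties": {
        "src_geoip": {
          "properties": { "location": { "type": "geo_point" } }
        },
        "dst_geoip": {
          "properties": { "location": { "type": "geo_point" } }
        }
      }
    }
  }
}

The geoip filter writes the coordinates into a "location" sub-field, which is why that is the field mapped as geo_point. The template only applies to newly created indices, so the index has to be recreated after installing it.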

Thank you for your answer. Now I am investigating how the data is imported into Elasticsearch. At the moment, after each field I have an extra one with .keyword at the end. When I import from Kibana, I get just the 50 fields I should have. Also, there are no geoip fields available for the geohash, so let me know whether I should move this to a new question and solve one problem at a time.
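For what it's worth, the .keyword sub-fields come from Elasticsearch's default dynamic mapping for strings, so inspecting the actual mapping shows what types each field ended up with (index name taken from the config earlier in the thread):

GET security4/_mapping

If SrcAddr and DstAddr should be stored as the ip type, that also has to go into the index template rather than being set per import, e.g. (a sketch, added alongside the other properties in the template's mappings):

"SrcAddr": { "type": "ip" },
"DstAddr": { "type": "ip" }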

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.