Extract IP from multiple lines


#1

Using ELK 5.6
Filebeat ships a log to Logstash that looks like this:
1.1.1.1
2.2.2.2
3.3.3.3
etc.
What would be the best way to extract each IP address and put it into its own field, like peer1, peer2, peer3, and so on? (The fields would have to be created dynamically, since the number of lines will change.)
If I use a %{IPV4:peer} grok pattern, it only captures the first line:
{
  "peer": [
    [
      "1.1.1.1"
    ]
  ]
}
Thank you for any tips.


(Magnus Bäck) #2

You want all IP addresses in the same document in ES? Then you need to use Filebeat's multiline feature to join the lines in the file into a single event before shipping to Logstash.
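For a file that contains nothing but one IP address per line, a Filebeat 5.x prospector configuration along these lines could join consecutive matching lines into a single event. This is a sketch, not a drop-in config: the path is hypothetical, and the right pattern/negate/match combination depends on what actually delimits each batch of addresses in your log.

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/peers.log          # hypothetical path, adjust to your file
  # Treat consecutive lines that look like IPv4 addresses as one event.
  multiline.pattern: '^\d{1,3}(\.\d{1,3}){3}$'
  multiline.negate: false
  multiline.match: after
```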


#3

So, if I have a log file (yes, same document in ES) with one line like this:
1.1.1.1,2.2.2.2,3.3.3.3
What would the grok filter look like, considering that the fields would have to be created dynamically?
Thanks Magnus.


(Magnus Bäck) #4

If you want to use a grok filter you'd have to enumerate all fields you want to capture,

%{IPV4:peer1},%{IPV4:peer2},...

or the equivalent with a csv filter (you don't need grok in this simple case). If that's not desirable, you can use a mutate filter's split option to turn the input string into an array, which a ruby filter can then read and dynamically turn into any number of fields.
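As a sketch of that second approach (the source field name message and the peer&lt;N&gt; naming are assumptions to adapt), the split-plus-ruby combination on Logstash 5.x, where the ruby filter uses the event.get/event.set API, might look like:

```
filter {
  # Split the comma-separated line into an array, e.g.
  # "1.1.1.1,2.2.2.2,3.3.3.3" -> ["1.1.1.1", "2.2.2.2", "3.3.3.3"].
  mutate {
    split => { "message" => "," }
  }
  # Turn each array element into its own peer<N> field:
  # peer1 => "1.1.1.1", peer2 => "2.2.2.2", and so on.
  ruby {
    code => "
      event.get('message').each_with_index do |ip, i|
        event.set('peer' + (i + 1).to_s, ip)
      end
    "
  }
}
```

Note that dynamically named fields like these are hard to query consistently in Elasticsearch; keeping the addresses in a single array field is often the more practical mapping.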


#5

Got it. Thank you very much for your help!


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.