Extract IP from multiple lines

Using ELK 5.6
Filebeat ships a log to Logstash that looks like this:
1.1.1.1
2.2.2.2
3.3.3.3
etc.
What would be the best way to extract each IP address and put it into its own field, like peer1, peer2, peer3, and so on? (The fields would have to be created dynamically, since the number of lines will change.)
If I use a %{IPV4:peer} grok pattern, it only matches the first line:
{
  "peer": [
    [
      "1.1.1.1"
    ]
  ]
}
Thank you for any tips.

You want all IP addresses in the same document in ES? Then you need to use Filebeat's multiline feature to join the lines in the file to a single event before shipping to Logstash.
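
As a rough sketch, something along these lines in filebeat.yml (Filebeat 5.x prospector syntax) could join the lines, assuming the file really contains nothing but one IP address per line; the path is a placeholder, and the pattern/negate/match combination would need adjusting if the file contains anything else:

filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/peers.log              # placeholder path to the file with one IP per line
    # Treat consecutive lines that look like an IPv4 address as one event
    multiline.pattern: '^\d{1,3}(\.\d{1,3}){3}'
    multiline.negate: false
    multiline.match: after              # append matching lines to the previous line
    multiline.max_lines: 500            # raise if a file can contain more IPs
    multiline.timeout: 5s               # flush the combined event after 5s of inactivity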

So, if I have a log file (yes, same document in ES) with one line like this:
1.1.1.1,2.2.2.2,3.3.3.3
What would the grok filter look like, given that the fields would have to be created dynamically?
Thanks Magnus.

If you want to use a grok filter, you'd have to enumerate all the fields you want to capture,

%{IPV4:peer1},%{IPV4:peer2},...

or the equivalent with a csv filter (you don't need grok in this simple case). If that's not desirable, you could use a mutate filter's split option to turn the input string into an array, then have a ruby filter read that array and dynamically turn it into any number of fields, as sketched below.
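
A minimal sketch of that last approach, assuming the joined line arrives in the standard message field and that peer1, peer2, ... are the field names you want (both are assumptions you can change):

filter {
  # Turn "1.1.1.1,2.2.2.2,3.3.3.3" into the array ["1.1.1.1", "2.2.2.2", "3.3.3.3"]
  mutate {
    split => { "message" => "," }
  }
  # Create one field per array element: peer1, peer2, peer3, ...
  ruby {
    code => '
      ips = event.get("message")
      if ips.is_a?(Array)
        ips.each_with_index do |ip, i|
          event.set("peer#{i + 1}", ip.strip)
        end
      end
    '
  }
}

One thing to keep in mind with dynamically named fields like peerN is that each new field adds to the index mapping in Elasticsearch; keeping the IPs in a single array field (which the mutate split already gives you) can be easier to query.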

Got it. Thank you very much for your help!
