Hey All
So I have a log file that has dynamic content depending on the specific machine it is running on.
For example:
0170517 00:07:20 name 00 0 2 3 6 0 5 name1 00 0 2 3 6 0 5 name2 00 0 2 3 6 0 5
0170517 00:07:30 name 00 0 2 3 6 0 5 name1 00 0 2 3 6 0 5 name2 00 0 2 3 6 0 5 name3 00 0 2 3 6 0 5
Is there a way for Logstash to dynamically figure out the new fields, reusing the same timestamp? Or should I write a script to change the log format first? (That would be extra overhead.)
Something like this:
0170517 00:07:20 name 00 0 2 3 6 0 5
0170517 00:07:20 name1 00 0 2 3 6 0 5
0170517 00:07:20 name2 00 0 2 3 6 0 5
0170517 00:07:30 name 00 0 2 3 6 0 5
0170517 00:07:30 name1 00 0 2 3 6 0 5
0170517 00:07:30 name2 00 0 2 3 6 0 5
0170517 00:07:30 name3 00 0 2 3 6 0 5
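Since you mentioned a preprocessing script: here's a minimal Python sketch of that approach, assuming each record is a name followed by exactly seven numeric fields, as in the samples above. It repeats the timestamp for every record so the output matches the second format:

```python
import re

# One record = a name (starts with a letter) followed by exactly 7 numbers.
# This is an assumption based on the sample lines above.
RECORD = re.compile(r'([A-Za-z]\w*(?: \d+){7})')

def explode(line):
    """Split one multi-record log line into one line per name."""
    parts = line.split(None, 2)  # date, time, rest of line
    if len(parts) < 3:
        return []
    date, time, rest = parts
    return [f"{date} {time} {rec}" for rec in RECORD.findall(rest)]

for out in explode("0170517 00:07:30 name 00 0 2 3 6 0 5 name1 00 0 2 3 6 0 5"):
    print(out)
```

You could run this over the file before Logstash picks it up, or leave the log alone and do the same splitting inside Logstash with a ruby filter; the regex idea is the same either way.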
For example, this config should work with the second format:
input {
  file {
    path => "/var/log/name.log"
    start_position => "beginning"
  }
}
filter {
  if [message] =~ /^#/ {
    drop {}
  }
  grok {
    match => { "message" => "(?<timestamp>%{YEAR:year}%{MONTHNUM:month}%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second}) %{WORD:name} %{NUMBER:number1:int} %{NUMBER:number2:int} %{NUMBER:number3:int} %{NUMBER:number4:int} %{NUMBER:number5:int} %{NUMBER:number6:int} %{NUMBER:number7:int}" }
  }
  date {
    match => ["timestamp", "yyyyMMdd HH:mm:ss"]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-naming"
  }
}
Any ideas or suggestions would be great. Thanks!