Hi,
My sample log file looks like this:
100,bbb,a,,,,{REGISTER,,0,0,68e805.ucc,kjj,0,-3,IEEE-802,,,0,[],}
300,p19w,c,,,,{INVITE,,[sip:U-f1,2.0+13,2.01,20,2.0,IEEE-802,,0,[[NA,1,59:15.4,982,88,0,0,215,93,0],[NA,1,59:15.7,47:00.4,0,113,0,0,1051,93,0]],xyz}
Based on the value of the 7th field ({REGISTER or {INVITE), the fields that follow should change.
As you suggested, I have created this config file:
input {
  file {
    path => [ "/xyz/sample.csv" ]
    start_position => "beginning"
    type => "abc"
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}
filter {
  if [message] =~ /^{REGISTER,/ {
    csv {
      columns => ["x1","x2","x3","x4","x5","x6"]
    }
  }
  else if [message] =~ /^{INVITE,/ {
    csv {
      columns => ["x1","x2","x3","x4","x5","x6","x7","x8","x9","x10"]
    }
  }
}
output {
  elasticsearch {
    hosts => ["x.x.x.x:9200"]
    index => "test-%{+YYYY.MM}"
  }
  stdout { codec => rubydebug }
}
I am not getting any errors, but the logs are not being parsed. It has been running for a while with no output:
[2017-06-27T14:04:33,294][INFO ][logstash.pipeline ] Pipeline main started
[2017-06-27T14:04:33,326][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
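
Would a quick test like the one below be a reasonable way to rule out the file input settings? This is just a sketch that keeps the same conditional and csv columns as above, but reads a pasted line from stdin instead of the file:

input {
  stdin { }
}
filter {
  # same conditional and csv filter as in the main config above
  if [message] =~ /^{REGISTER,/ {
    csv {
      columns => ["x1","x2","x3","x4","x5","x6"]
    }
  }
}
output {
  stdout { codec => rubydebug }
}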
Please suggest...
Thanks!