Lumberjack losing grokked fields?


(Siddharth Trikha) #1

I am using logstash 1.4.1 on both my client machine (where the logs are present) and
my server machine (where logstash parses the events).

On the client I read the logs:

input {
  file {
    path => "/root/Desktop/Logstash-Input/**/*_log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => ["path", "/root/Desktop/Logstash-Input/(?<server>[^/]+)/(?<logtype>[^/]+)/(?<logdate>[\d]+.[\d]+.[\d]+)/(?<logfilename>.*)_log"]
  }
}

output {
  lumberjack {
    hosts => ["192.168.105.71"]
    port => 4545
    ssl_certificate => "./logstash.pub"
  }

  stdout { codec => rubydebug }
}

Console:

filter received {:event=>{"message"=>"2014-05-26T00:00:01+05:30 bxas1 crond[268]: (roooot) CMD (2014/05/31/server2/cron/log)", "@version"=>"1", "@timestamp"=>"2014-07-11T09:14:28.740Z", "host"=>"cmd", "path"=>"/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log"}, :level=>:debug, :file=>"(eval)", :line=>"18"}

{
        "message" => "2014-05-26T00:00:01+05:30 bxas1 crond[268]: (roooot) CMD (2014/05/31/server2/cron/log)",
       "@version" => "1",
     "@timestamp" => "2014-07-11T09:14:28.735Z",
           "host" => "cmd",
           "path" => "/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log",
         "server" => "Server2",
        "logtype" => "CronLog",
        "logdate" => "2014.05.31",
    "logfilename" => "cron"
}
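As an aside, the client-side path grok can be sanity-checked outside Logstash with an equivalent Python regex (a sketch, not Logstash's grok engine; the field names are taken from the rubydebug output above, since grok's named captures map directly to Python named groups):

```python
import re

# Python equivalent of the path grok; the unescaped dots between the
# [\d]+ runs mirror the original pattern (\. would be stricter).
PATH_RE = re.compile(
    r"/root/Desktop/Logstash-Input/"
    r"(?P<server>[^/]+)/(?P<logtype>[^/]+)/"
    r"(?P<logdate>[\d]+.[\d]+.[\d]+)/(?P<logfilename>.*)_log"
)

# Sample path from the rubydebug output above.
path = "/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log"
m = PATH_RE.match(path)
print(m.groupdict())
# {'server': 'Server2', 'logtype': 'CronLog', 'logdate': '2014.05.31', 'logfilename': 'cron'}
```

This confirms the pattern itself is fine: the fields really are extracted on the client, as the console output shows.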

On the server:

input {
  lumberjack {
    port => 4545
    ssl_certificate => "/etc/ssl/logstash.pub"
    ssl_key => "/etc/ssl/logstash.key"
    codec => "json"
  }
}

filter {
  if [server] == "Server2" and [logtype] == "CronLog" {
    grok {
      match => ["message", "........Pattern......"]
      add_tag => "server2-cronlog"
    }
  } else if [server] == "Server2" and [logtype] == "AuthLog" {
    grok {
      match => ["message", ".......Pattern......"]
    }
  }
}

Server-Console:

filter received {:event=>{"message"=>"2014-07-11T09:29:59.730+0000 cmd 2014-05-26T00:00:01+05:30 bx920as1 crond[268]: (rorit) CMD (2014/05/31/server2/cron/log)", "@version"=>"1", "@timestamp"=>"2014-07-11T09:30:41.772Z"}, :level=>:debug, :file=>"(eval)", :line=>"30"}

output received {:event=>{"message"=>"2014-07-11T09:29:59.730+0000 cmd 2014-05-26T00:00:01+05:30 bx920as1 crond[26388]: (rorit) CMD (2014/05/31/server2/cron/log)", "@version"=>"1", "@timestamp"=>"2014-07-11T09:30:41.772Z"}, :level=>:debug, :file=>"(eval)", :line=>"100"}

{
       "message" => "2014-07-11T09:29:59.730+0000 cmd 2014-05-26T00:00:01+05:30 bx920as1 crond[268]: (rorit) CMD (2014/05/31/server2/cron/log)",
      "@version" => "1",
    "@timestamp" => "2014-07-11T09:30:41.772Z"
}

So, as one can see, the fields grokked on the client machine are lost after
shipping via lumberjack: the event that arrives on the server carries only
message, @version, and @timestamp. Is this a bug?
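One thing that may be worth trying (an assumption on my part, not something I have verified against 1.4.1): the lumberjack output may be shipping only the message line rather than the whole event. If the output plugin accepts a codec option in this version, serializing the full event as JSON on the sending side, to match the json codec already set on the receiving lumberjack input, might preserve the grokked fields. A sketch:

```
output {
  lumberjack {
    hosts => ["192.168.105.71"]
    port => 4545
    ssl_certificate => "./logstash.pub"
    # Assumption: ship the full event (including the grokked fields)
    # as JSON instead of just the message line.
    codec => "json"
  }
}
```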


