Hi all,
I'm evaluating Logstash to replace our river plugin. Currently, I'm testing the topology below:
- Create JSON data containing the changed data and write it to a file using log4j.
- Use the '%m%n' pattern so that only the JSON value is written (see the appender sketch after this list).
- Read the file and send it to Elasticsearch using Logstash.
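For reference, the log4j appender is set up roughly like this (the logger and appender names here are placeholders; the pattern and file path match what I actually use):

log4j.logger.searchLog=INFO, SEARCH_FILE
log4j.appender.SEARCH_FILE=org.apache.log4j.FileAppender
log4j.appender.SEARCH_FILE.File=/Users/dkim/search_log/data
log4j.appender.SEARCH_FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.SEARCH_FILE.layout.ConversionPattern=%m%n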
In Elasticsearch we use the index to classify users and the type for the elements.
Below is my JSON data in the log file:
{"Index_Id":"ABCD", "Type":"Type1","_id":"199040",...}
When I test with stdin using the same data, the index, document_id, and type are extracted from the JSON successfully.
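For that test I used the same output section as in the configuration below; only the input block differed, roughly:

input {
  stdin { codec => "json" }
}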
But when I use the file written by log4j, no fields are extracted. Below is the Logstash output:
{
       "message" => "{\"Index_Id\":\"ABCD\", \"Type\":\"ABCD\",\"_id\":\"199040\",...}",
      "@version" => "1",
    "@timestamp" => "2015-06-01T18:49:20.024Z",
          "type" => "%{Type}",
          "host" => "dkim",
          "path" => "/Users/dkim/search_log/data"
}
Below is my Logstash configuration:
input {
  file {
    codec => "json"
    type  => "%{Type}"    # intended to take the type from the Type field in the JSON
    path  => "/Users/dkim/search_log/*"
  }
}
output {
  elasticsearch {
    host        => "127.0.0.1"
    index       => "%{Index_Id}"    # index name from the Index_Id field
    document_id => "%{_id}"         # document id from the _id field
    protocol    => "http"
    port        => 9200
  }
  stdout { codec => rubydebug }
}
Could anybody let me know what the problem is and how I could solve it?
Thanks
Ducheol