Thanks Constantin & warkolm.
Now I can set the index, type, and id using the config below.
input {
  stdin {
    codec => "json"
    type => "%{type}"
  }
}
output {
  elasticsearch {
    host => "127.0.0.1"
    index => "%{index}"
    document_id => "%{_id}"
    protocol => "http"
    port => 9200
  }
  stdout { codec => rubydebug }
}
My main problem was that I hadn't set codec => "json" in the input clause, so my whole JSON document was treated as one string instead of individual fields.
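To illustrate the difference (a Python sketch, not Logstash internals — the field names here just mirror my test data):

```python
import json

line = '{"_id":"TEST","index":"TEST_INDEX","type":"TEST2","name":"UPDATE"}'

# Without codec => "json", the whole line ends up as one opaque string
# (roughly, a single "message" field on the event).
as_plain = {"message": line}

# With codec => "json", the line is parsed into individual fields,
# so "%{index}" and "%{_id}" in the output can resolve.
as_json = json.loads(line)

print(as_plain["message"])   # one string
print(as_json["index"])      # individual field: TEST_INDEX
```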
I have one more question.
It is the same issue I noticed with the river.
Is there any way to update only the fields that are included in the input?
Below is my test.
1st input : {"_id":"TEST","index":"TEST_INDEX","type":"TEST2","name":"UPDATE"}
2nd input : {"_id":"TEST","index":"TEST_INDEX","type":"TEST2","Description":"TEST123"}
I expected the result below:
"_source":{"_id":"TEST","index":"TEST_INDEX","type":"TEST2","name":"UPDATE","Description":"TEST123","@version":"1","@timestamp":"2015-05-21T14:16:58.890Z","host":"dkim.local"}
But the actual response is:
{"_id":"TEST","index":"TEST_INDEX","type":"TEST2","Description":"TEST123","@version":"1","@timestamp":"2015-05-21T14:16:58.890Z","host":"dkim.local"}
As you can see, the whole document is replaced by the next input, but I want to keep the fields that aren't included in it.
Is there any way to keep the old fields and update only the fields that are included in the input?
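In other words, I want merge semantics rather than replace semantics. A Python sketch of the two behaviors, using the two test inputs above (this is just to show the desired outcome, not Logstash code):

```python
# First and second inputs from my test above.
first  = {"_id": "TEST", "index": "TEST_INDEX", "type": "TEST2", "name": "UPDATE"}
second = {"_id": "TEST", "index": "TEST_INDEX", "type": "TEST2", "Description": "TEST123"}

# What I observe (index/replace): the second doc fully replaces the first.
replaced = dict(second)

# What I want (partial update): new fields merged onto the old document,
# so "name" survives and "Description" is added.
merged = {**first, **second}

print("name" in replaced)     # False -> old field lost
print(merged["name"])          # UPDATE -> old field kept
print(merged["Description"])   # TEST123 -> new field added
```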
Regards