I have a Logstash conf file that takes data from Kafka as input and, if the event contains a particular field, is supposed to pipe the event data to a script (Unix shell or Python).
For example:
input {
  kafka {
    zk_connect => '112.16.40.126:2181'
    topic_id => 'READY_FOR_INDEX'
  }
  kafka {
    zk_connect => '112.16.40.126:2181'
    topic_id => 'INDEX_CSV'
  }
}
output {
  if ([csvLocation]) {
    pipe {
      command => "/home/test.sh"
    }
  }
  stdout { codec => rubydebug }
}
The script reads the input from stdin, I suppose:
#!/bin/bash
while read LINE; do
  echo "${LINE}"   # do something with it here
done
exit 0
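Run by hand, I would expect the script to echo back whatever gets piped into its stdin, roughly like this (the JSON line below is just a made-up stand-in for a real event, not actual output):
# Manual check of the read loop: feed one fake "event" line on stdin, which is
# how I understand the pipe output feeds the command. The JSON content is invented.
echo '{"csvLocation":"abc.csv","tableName":"xxx"}' | /home/test.sh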
But for some reason the script does not get any data from Logstash. How do I get this working so that the script receives all the input data?
I was referring to https://www.elastic.co/guide/en/logstash/current/plugins-outputs-pipe.html
Thanks
Please show an example event (i.e. output from your stdout {codec => rubydebug} output). Is Logstash forking a child process that runs /home/test.sh? Is there anything in Logstash's log? You might want to increase the log level to see everything that's interesting.
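For example, something along these lines should do it (the config path is just a placeholder, and if I remember right --debug is the 2.x-style flag; newer versions use --log.level debug instead):
# Start Logstash with debug-level logging so the pipe output's activity shows up.
# Adjust the config path to match your installation.
bin/logstash -f /etc/logstash/conf.d/kafka-pipe.conf --debug
It can also help to make the script append whatever it reads on stdin to a file (e.g. somewhere under /tmp) so you can tell whether the pipe output delivers anything at all.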
@magnusbaeck Here is a sample stdout event.
{
    "index" => "xxx",
    "dimension_name" => "xxx",
    "source" => "xxx",
    "type" => "xxx",
    "abc" => {
        "test1" => {
            "display_name" => "xxx",
            "testid" => "5xxx"
        },
        "test2" => {
            "display_name" => "xxx2",
            "testid" => "6xxx"
        }
    },
    "PROJECT_ID" => "xxx",
    "tableName" => "xxx",
    "csvLocation" => "abc.csv",
    "@version" => "1",
    "@timestamp" => "2017-01-19T21:24:21.775Z"
}
This is what I get when I run Logstash in debug mode:
Opening pipe {:command=>"/home/test.sh", :level=>:info}
Starting stale pipes cleanup cycle {:pipes=>{"python /home/test.sh"=>#<PipeWrapper:0x7150cc3 @pipe=#<IO:fd 454>, @active=true>}, :level=>:info}
But that's not the only output, right? Off the top of my head I don't know what's up here.
Sorry, I didn't get you. I didn't post the actual output for privacy reasons, but it's mostly structured that way.
Well, seeing more of the logs could be helpful. You can obfuscate the sensitive parts if you like.