Hello,
Is there a way to configure the csv output to include all fields in the index by default, so that each one does not have to be typed out manually?
Thanks,
Eric
fields is a required option of the csv output. It does not support %{[fieldname]} sprintf references, but it does support environment variable substitution. However, a substituted environment variable is treated as a string, even if it looks like an array. So the answer is no.
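For reference, that means spelling every column out by hand in the csv output, something like this (the path and field names below are placeholders, not taken from your index):

output {
    csv {
        path => "/tmp/elastiflow.csv"
        fields => ["@timestamp", "host", "message"]
    }
}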
However, you can have Logstash tell you what to type, which reduces the job to copy and paste. Run a line of your data through this...
filter {
    ruby {
        code => '
            # Collect the name of every top-level field on the event
            csvFields = []
            event.to_hash.each { |k, v|
                csvFields << k
            }
            # Store the list on the event so the output can print it
            event.set("[csvFields]", csvFields.to_s)
        '
    }
}
output {
    stdout {
        codec => plain {
            format => "%{csvFields}
"
        }
    }
}
Hi Badger,
Thanks for the tip! I tried running what you gave me, but I got a result I was not expecting. Here is my conf file, followed by a screenshot of the output:
input {
    elasticsearch {
        hosts => ["10.100.1.2:9200"]
        index => "elastiflow-*"
        query => '{"query": { "bool": { "must": {"match_all": {}}}}}'
    }
}
filter {
    ruby {
        code => '
            csvFields = []
            event.to_hash.each { |k, v|
                csvFields << k
            }
            even.set("[csvFields]", csvFields.to_s)
        '
    }
}
output {
    stdout {
        codec => plain {
            format => "%{csvFields}"
        }
    }
}
Result: [screenshot of the unexpected output omitted]
Do you know what went wrong?
Thanks,
Eric
That should be event.set, not even.set. And you do want to use the newline that I embedded in the format of the codec; without it every event is printed on the same line.
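In other words, only two lines in your conf need to change (a minimal sketch; everything else stays as you have it):

            event.set("[csvFields]", csvFields.to_s)

and

            format => "%{csvFields}
"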
Hi Badger,
I just made those corrections and it worked! Thanks for all your help.
Eric