I'm currently using Logstash 2.1.1 to retrieve data from Elasticsearch 1.4.4 and write it to a CSV file to be imported into a database. My problem is that some lines contain only commas (like this: ",,,,,,,,,,").
When I try to import the file into PostgreSQL, these lines obviously cause an error. Is there any way for Logstash to prevent this from happening?
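For reference, the pipeline is roughly shaped like this (the host, index, path, and field names below are placeholders, not my actual configuration):

input {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder host
    index => "my_index"           # placeholder index name
  }
}
output {
  csv {
    path => "/tmp/export.csv"            # placeholder output path
    fields => ["field_a", "field_b"]     # placeholder field list
  }
}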
I'm currently using the first suggestion @atira made, without quotes as you said. It didn't return any errors. I'll let it run, and when it finishes I'll let you know whether this configuration worked.
Ah, okay. If your message field contains more than just commas, or no commas at all, then the conditional won't work.
All we know is that the output produces lines containing only commas. So does the input produce empty events? Though I wonder how that would happen.
If that's the case, modify the filter to this:
filter {
  # drop events whose message field is completely empty
  if [message] =~ /^$/ {
    drop { }
  }
}
This should drop all empty messages coming from the input before they reach the output.
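If the offending events arrive as comma-only strings rather than truly empty ones, a slightly broader conditional (just a sketch, untested against your data) could cover both cases at once:

filter {
  # drop events that are empty or consist solely of commas
  if [message] =~ /^,*$/ {
    drop { }
  }
}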
This is uncharted territory for me; maybe someone else will take a look here too.
Brainstorming mode.
It'd be good to know exactly what events the input plugin creates.
Can you configure, e.g., a file output? I presume that would simply write the events out to a file without any transformation. Then we would know what the csv output receives.
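Something along these lines should work for a quick look (the path is just a placeholder; a stdout output with the rubydebug codec would do the job as well):

output {
  # dump each event as one JSON line so we can inspect the raw fields
  file {
    path => "/tmp/logstash-debug.log"   # placeholder path
    codec => json_lines
  }
}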
Right now I'm extracting everything I have in the index, and then I'll try to find the positions where I have problems. Once I have any news, I'll let you guys know.