I have just taken my first steps with Filebeat and Logstash.
I have some servers with log files containing lines like:
value1|value2|value3|.....
Now I want them to be sent via (1) Filebeat to (2) Logstash and then on to (3) Elasticsearch.
(1) Working: Filebeat sends the files to Logstash.
(2) Trying to parse the lines...
First I tried the "csv" filter, but it doesn't work because of some escaped " characters in the content.
Which way would be the best to parse these lines? (See the sketch after this list.)
(3) Some lines are parsed correctly, but they were not sent to ES.
I get no errors, logs, or anything like that for this.
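For (2) and (3), here is a minimal sketch, assuming the lines really are plain pipe-separated values (the column names col1/col2/col3 are placeholders). The dissect filter splits on literal delimiters and ignores quoting entirely, so embedded " characters are not a problem, and a temporary stdout output with the rubydebug codec shows exactly which events Logstash produces, which helps when events silently never reach ES:

filter {
  # dissect splits on the literal "|" and does not interpret quotes
  dissect {
    mapping => { "message" => "%{col1}|%{col2}|%{col3}" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  # temporary debug output: prints every event to the Logstash console
  stdout { codec => rubydebug }
}

Note that dissect needs a fixed column layout; if the number of fields varies from line to line, a mutate split filter is more flexible.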
I strip out certain characters such as ", , and ; before I send the data to Logstash. I have found that hidden non-UTF-8 characters can be removed like this (at least on Mac and Linux):
iconv -f utf-8 -t utf-8 -c dirty.csv > Clean.csv
Replacing is possible, but in my opinion it is not the solution.
Isn't there another way to split on the separator "|"?
The csv filter is not useful here because it struggles with " in the file. Or is there a way to bypass this problem?
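One way to bypass the quote problem, assuming there is some character that never occurs in your data: the csv filter's quote_char option (default ") can be pointed at such a character, which effectively disables quote handling, so embedded " are kept as plain text. The column names here are placeholders:

csv {
  separator => "|"
  # a character that never appears in the data; this effectively
  # disables quote handling, so embedded " stay as plain text
  quote_char => "'"
  columns => ["col1", "col2", "col3"]
}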
Hi Jordan!
Many thanks for your answer!
This works except for one problem.
The second mutate "add_field" does not work as expected. It creates the field "shop ID", but in ES its content is only "[meassage[0]]" and not the value from that field.
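If the value in add_field is not written as a %{...} sprintf reference, Logstash stores it as a literal string, and "[meassage[0]]" also misspells "message". The correct reference to the first element of a split array is %{[message][0]}. A sketch of the two mutate filters, assuming Jordan's answer split message on "|" (the field name "shop ID" is taken from the post above):

mutate {
  split => { "message" => "|" }
}
mutate {
  # %{[message][0]} is the sprintf reference to the first element
  # of the array produced by the split above
  add_field => { "shop ID" => "%{[message][0]}" }
}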
Another question:
In our file we have a timestamp in this format: 20170724073612 => YYYYMMDDHHIISS. Is it possible to convert this in Logstash as well, so we can use this field as the timestamp in ES?
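Yes, the date filter can do this. In Joda-style date patterns the layout YYYYMMDDHHIISS corresponds to yyyyMMddHHmmss. A sketch, assuming the value has already been extracted into a field named shop_timestamp (that field name is a placeholder):

date {
  # yyyyMMddHHmmss matches values like 20170724073612;
  # on success the parsed time is written to @timestamp
  match => ["shop_timestamp", "yyyyMMddHHmmss"]
  target => "@timestamp"
}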