Q1: How do I configure Logstash to output three values from a CSV file to Elasticsearch:

- The time to match @timestamp
- Make the values searchable
- To send the last two values from the first line as text, e.g. PhysicalDisk(0 C:) and Disk Read Bytes/sec
Q2: How do I use a distinct filter per CSV file, so that test1.csv gets a different filter than test2.csv?
My current Logstash beats.conf is configured like this:
> The time to match @timestamp

Use a date filter to parse the timestamp field into @timestamp.
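A minimal sketch of such a date filter; the field name `timestamp` and the date pattern are assumptions, so adjust both to match your CSV:

```
filter {
  date {
    # Assumption: an earlier csv filter has already put the time into a
    # field called "timestamp" in a "yyyy-MM-dd HH:mm:ss" layout.
    match  => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}
```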
> Make the values searchable
They will be.
> To send the last two values from the first line as text.
As part of every event picked up from that file? Sorry, that's not possible. You can send the header row as one event, but it won't remember those fields for the subsequent events.
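Because the header fields aren't remembered, the usual workaround is to hard-code the column names in the csv filter instead. A sketch, with field names loosely derived from the header example above (the exact names are your choice):

```
filter {
  csv {
    separator => ","
    # Static column names stand in for the header row; every data row
    # is then parsed into these named, searchable fields.
    columns => ["timestamp", "physical_disk_0_c", "disk_read_bytes_sec"]
  }
}
```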
> How do I use a distinct filter per CSV file, so that test1.csv gets a different filter than test2.csv?
You can, for example, set a custom field on the Filebeat end to indicate what kind of file an event comes from, then use conditionals in your Logstash configuration to choose between different filters, as sketched below.
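A sketch of both ends, assuming hypothetical paths and a custom field named `csv_type` (Filebeat places custom fields under `[fields]` unless `fields_under_root` is enabled):

```
# filebeat.yml (excerpt) -- tag each file with a custom field
filebeat:
  prospectors:
    - paths: ["/path/to/test1.csv"]
      fields: { csv_type: "test1" }
    - paths: ["/path/to/test2.csv"]
      fields: { csv_type: "test2" }
```

```
# Logstash filter section -- branch on the custom field
filter {
  if [fields][csv_type] == "test1" {
    csv { columns => ["timestamp", "colA", "colB"] }   # placeholder columns
  } else if [fields][csv_type] == "test2" {
    csv { columns => ["timestamp", "colX", "colY"] }   # placeholder columns
  }
}
```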
Use a grok filter to extract the timestamp and the "1.7" string into their own fields, then use a date filter to parse the timestamp field and store the result in @timestamp.
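A sketch of that combination; the grok pattern is hypothetical, since the actual line format isn't shown in this excerpt:

```
filter {
  grok {
    # Hypothetical layout: an ISO8601 timestamp followed by a number
    # such as "1.7" -- adjust the pattern to the real line format.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp},%{NUMBER:value}" }
  }
  date {
    # Parse the extracted timestamp into @timestamp.
    match => ["timestamp", "ISO8601"]
  }
}
```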