I am developing a JavaScript application (backend in Node.js, frontend in Vue.js) in which I want to send CSV data to Logstash from the front end, i.e., the user chooses any CSV file from their local system and uploads it; the data goes to Logstash, which then sends it to Elasticsearch for analysis.
I am not aware of a way to do this. Is there a mechanism to achieve it? If so, can someone please help me out? I would really appreciate it. Thanks in advance!
My Node.js server is running on localhost port 3000. I am assuming I don't need a local file path, since I will be posting from the server, but I am unable to figure out how the Logstash config file should look. Could you please help me configure it so that my Node.js server can make a POST request to Logstash?
Just add an http input. I think the port is the only option you need to configure.
I strongly suggest you replace the elasticsearch output with stdout { codec => rubydebug }. Get the input and filters working as you want, and only then worry about getting the data into ES.
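A minimal pipeline along those lines might look like this (8080 is the http input's default port):

```
input {
  http {
    port => 8080
  }
}
output {
  stdout { codec => rubydebug }
}
```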
I am trying to send the CSV files from the file system. The Logstash configuration monitors the directory for any new files. Here is the Logstash configuration:
As I said, comment out the elasticsearch output and make sure you're getting Logstash to process input files.
I think you need to change the filename pattern to use forward slashes instead of backslashes, i.e. `E:/Downloads/Server/public/uploads/*.csv` instead of `E:\Downloads\Server\public\uploads\*.csv`.
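For example, assuming a standard file input, the corrected block might look like this (`sincedb_path => "NUL"` is optional; on Windows it just stops Logstash from persisting read positions, so files are re-read on restart):

```
input {
  file {
    path => "E:/Downloads/Server/public/uploads/*.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
```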
Thanks a lot! It is working now. How do I parse the "Date" field so that it serves as @timestamp for displaying visualizations in Kibana with the date on the X-axis?
```
PS E:\elk\logstash\bin> .\logstash -f csv_data.conf --config.test_and_exit
Sending Logstash's logs to E:/elk/logstash/logs which is now configured via log4j2.properties
[2018-08-27T12:34:58,327][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-08-27T12:34:58,769][FATAL][logstash.runner          ] The given configuration is invalid. Reason: Expected one of #, } at line 14, column 34 (byte 294) after filter {
    csv {
        separator => ","
        columns => ["Date","Blood Glucose","Blood Pressure","Weight","Height","BMI"]
    }
    date {
        match => ["Date", "MM/dd/yy"]
[2018-08-27T12:34:58,779][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
```
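For anyone hitting the same error: the "Expected one of #, }" message points at the date block, which is missing its closing braces in the pasted config. A corrected filter section would be:

```
filter {
  csv {
    separator => ","
    columns => ["Date","Blood Glucose","Blood Pressure","Weight","Height","BMI"]
  }
  date {
    match => ["Date", "MM/dd/yy"]
  }
}
```

The date filter writes to @timestamp by default, which also covers the Kibana X-axis question above.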
Convert the fields to numbers, I guess? The csv filter's convert option can assist with that. Keep in mind that the existing mapping of a field in an ES index can't be changed, so after adjusting the configuration you need to reindex your data (e.g. by deleting the existing index).
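A sketch of that convert option (the field-to-type mappings here are assumptions based on the column names):

```
filter {
  csv {
    separator => ","
    columns => ["Date","Blood Glucose","Blood Pressure","Weight","Height","BMI"]
    convert => {
      "Blood Glucose" => "float"
      "Blood Pressure" => "float"
      "Weight" => "float"
      "Height" => "float"
      "BMI" => "float"
    }
  }
}
```

After deleting the old index and re-sending the data, the fields should be mapped as numbers.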