Feed Data to Elasticsearch via Logstash Using CSV

Hi team,

I am using Logstash to read data from a CSV file and output it to Elasticsearch.

Below is the conf file:

input {
  file {
    path => "/opt/app/data/TC_Export_CSV.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["ColumnA", "ColumnB", "ColumnC", "ColumnD"]
    separator => ","
  }
}

output {
  elasticsearch {
    index => "tc_export_csv"
  }
  stdout {}
}

Running command:

bin/logstash -f /opt/app/data/TC_Export.conf
Settings: Default filter workers: 1
Logstash startup completed

After that, nothing happens.

Versions:
logstash-2.1.0
elasticsearch-2.1.1
kibana-4.3.1-linux-x64

Sample CSV file:

Core,Run,123456,987654
NonCore,No Run,225588,996633

Can you please help?

The file input is waiting for more data to be appended to TC_Export_CSV.csv. Because the file has been seen before, start_position => "beginning" isn't effective. To make sure you start reading the file from the beginning, set sincedb_path => "/dev/null" as well. See the file input documentation for more information about sincedb.
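
For example, the input section would look something like this (a sketch based on the conf above; pointing sincedb_path at /dev/null makes Logstash forget its recorded read position between runs, so the file is reread from the beginning every time):

input {
  file {
    path => "/opt/app/data/TC_Export_CSV.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}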

Thanks Mag,

I included the statement in the input.
Now Logstash is not shutting down.

It displays the records from the CSV and then nothing happens.

By running:
bin/logstash -f /opt/app/data/TC_Export.conf

Again, the file input is waiting for more data to be appended to TC_Export_CSV.csv. Logstash does not shut down just because it's done with the file. For batch processing of files, you might find the stdin input more useful. Alternatively, use the method described in http://stackoverflow.com/a/33146636/414355 to detect when a file has been fully processed.
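
For example, with a stdin input Logstash shuts down on its own once it reaches the end of its input, which makes it a good fit for one-off imports. A sketch (the filter and output sections stay the same as in the original conf; TC_Export_Stdin.conf is a hypothetical name for the modified conf file):

input {
  stdin {}
}

Then pipe the CSV file into Logstash:

bin/logstash -f /opt/app/data/TC_Export_Stdin.conf < /opt/app/data/TC_Export_CSV.csv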