Struggling to export data from CSV to Elasticsearch

I have a CSV file. It contains 13 columns and more than 150,000 rows.

When I use the following config, there is no output from Logstash:

input {
  file {
    path => "/salaries.csv"
    start_position => "beginning"
    type => "data"
  }
}

filter {
  csv {
    separator => ","
  }
}

output {
  stdout { 
    codec => rubydebug 
  }
}

Would you please suggest a basic solution for this case?

Logstash is tailing the file. Set sincedb_path to "/dev/null" if you want Logstash to unconditionally process a file from the top. The file input documentation explains how this works.
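
For example, a minimal variant of the input block above (same path and start position, only sincedb_path added) should make Logstash reread the file from the top on each run:

input {
  file {
    path => "/salaries.csv"
    start_position => "beginning"
    # don't persist the read position, so the whole file is reprocessed every time Logstash starts
    sincedb_path => "/dev/null"
    type => "data"
  }
}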

Thanks a lot, Magnus. Now it works.
My next question: when we import data from a CSV file, how do we map a schema? Right now all the fields are stored as strings in Elasticsearch.
For comparison, the jdbc input plugin automatically picks up the database column types.
Is there a predefined method for this case, or is there somewhere in the config file where we can specify the data type of a field?

The keyword here is "mappings". Either set the mappings when you create the index, or use an index template to automatically apply a set of mappings to all new indexes whose names match a particular pattern.

Absent explicit mappings, ES will attempt to guess what data type each field has.
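
As a sketch, and assuming the csv filter has been given column names, a legacy index template could look like the following. The template name, the field names, and the 2.x-style "string" type are assumptions (newer Elasticsearch versions use "text"/"keyword" and a slightly different template API); "logstash-*" is the default index pattern used by the Logstash elasticsearch output.

PUT /_template/salaries
{
  "template": "logstash-*",
  "mappings": {
    "data": {
      "properties": {
        "employee_name": { "type": "string" },
        "salary":        { "type": "integer" },
        "hire_date":     { "type": "date", "format": "yyyy-MM-dd" }
      }
    }
  }
}

Any new index whose name matches the pattern picks up these mappings automatically; fields not listed still fall back to dynamic mapping.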

OK, thank you.