Logstash is only indexing the last line of my CSV file in Elasticsearch

Hello,

I'm new to Elasticsearch,
and today I'm trying to index data from a CSV file.
I learned online that I can use Logstash to do this.

This is my conf file:

input {  
      file {
          path => "C:\plus\csv_16548461567_vp.csv"
          type => "Sample"
          start_position => "beginning"
      }
}

filter {  
    csv {
        columns => ["Numero de commande","Type revendeur","Enseigne","Chef de secteur","compte","Nom du Revendeur","Campagne","Type de client","Profil de client","Nom du client","Date","Mois","Total Avant RemiseHt","Total Avant RemiseTTC","Remise Ht","Remise TTC","Remise Remboursee Ht","Remise Remboursee Nikon TTC","TVA"]
        separator => ";"
    }
}

output {  
    elasticsearch {
        action => "index"
        host => "localhost"
        index => "financier-v2"
        document_id => "%{property_id}"
    }
}

I can't figure out why I have only one document in Elasticsearch.
I used Kibana to visualise the data, but I only see the last line of my CSV file.

How can I fix this?

Thanks

It looks like you are trying to set the document_id based on a field that does not exist ('property_id'). When a sprintf reference can't be resolved, Logstash keeps the literal string '%{property_id}', so the same ID is used for every line in the file, and the same document gets updated over and over. What is the _id of the document in Kibana?
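
If you don't need to control the ID yourself, the simplest fix is to drop the document_id line so that Elasticsearch auto-generates a unique _id for each event. A minimal sketch of the output block, reusing the host and index from your config:

output {
    elasticsearch {
        action => "index"
        host => "localhost"
        index => "financier-v2"
        # no document_id: Elasticsearch generates a unique _id per event
    }
}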

Thanks for the quick reply,
I added this line:

document_id => "%{property_id}"

I thought it would help structure my data, because when I remove it, Logstash indexes my data without respecting the columns that I mention in the conf file...
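
Just to clarify: it is the csv filter's columns setting that structures each line into named fields; document_id only controls the Elasticsearch _id, so you can remove it without losing the column mapping. If you still want a stable ID per row, one option is to reference a column that actually exists in your data, for example the order number. A sketch, assuming 'Numero de commande' is unique per line:

output {
    elasticsearch {
        action => "index"
        host => "localhost"
        index => "financier-v2"
        # assumes "Numero de commande" uniquely identifies each row
        document_id => "%{Numero de commande}"
    }
}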