Logstash beats input issue

I'm running into an issue between Filebeat and Logstash that I can't understand, and I need some help with it.
Let me tell you what is going on...
The Filebeat configuration is the following:

//
filebeat.inputs:
  - type: log
    enabled: true
    tags:
      - Vendas
    paths:
      - /home/vendas.csv
//
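
In case it helps with the diagnosis, the Filebeat side can be sanity-checked like this (the paths assume a default package install, so adjust if yours differ):

//
# validate the Filebeat config and the connection to the Logstash output
filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml
//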

The logstash.conf is the following:

//
input {
  beats {
    port => 5044
    #codec => plain { charset => "ISO-8859-1" }
    #workers => 3
    #queue_size => 72000
    #receive_buffer_bytes => 31457280
  }
}

filter {
  if "Vendas" in [tags] {
    csv {
      separator => ","
      skip_empty_columns => true
      columns => ["nome","unidade","tipo","quantidade","valor","event3","event4"]
    }
    mutate {
      convert => {
        #"epoch_timestamplisted_action" => "integer"
        #"uniqueid" => "string"
        #"queue_name" => "integer"
        #"bridged_channel" => "string"
        #"event" => "string"
        "quantidade" => "integer"
        "valor" => "integer"
        "event3" => "integer"
      }
    }
  }
}

output {
  if "Vendas" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-vendas"
      #document_type => "%{[@metadata][type]}"
    }
  }
}
# to debug the pipeline, run: ./logstash -f /etc/logstash/conf.d/logstash.conf
//
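
As the comment above suggests, the pipeline can also be checked for syntax errors without actually starting it; a minimal sketch, assuming the default package layout:

//
# parse the pipeline configuration and exit, without processing any events
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf --config.test_and_exit
//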

Also, the pipeline.yml is set up as follows:

//
- pipeline.id: main
  pipeline.workers: 8
  pipeline.batch.size: 1000
  pipeline.batch.delay: 120
  path.config: "/etc/logstash/conf.d/*.conf"
  #codec: plain { charset => "ISO-8859-1" }
  # workers: 3
  #queue_size: 72000
  #receive_buffer_bytes: 31457280
  queue.type: persisted
//
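
In case the persisted queue is relevant here: the queue can also be bounded explicitly per pipeline. The values below are purely illustrative, not from my actual setup:

//
- pipeline.id: main
  queue.type: persisted
  # illustrative bounds for the persisted queue (not my actual values)
  queue.max_bytes: 1024mb
  queue.page_capacity: 64mb
//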

Here is the issue: the first content that arrives from the file looks like this...

,*
11:40:04.423 Vanderleia_Souza,Unidade-1_1100,Portabilidade,1,37066,0,*
11:40:04.423 Rebeca_Xavier,Unidade-1_2100,Portabilidade,1,11706,0,*
11:40:04.423 Rayssa_Lima_Fernandes_de_Souza,Unidade-1_2100,Novos,1,4791,0,*
11:40:04.423 Rayssa_Lima_Fernandes_de_Souza,Unidade-1_2100,Portabilidade,1,9052,0,*
11:40:04.423 Victoria_Christina_Batista�,Unidade-1_2100,Novos,1,10071,0,*
11:40:04.424 Vitoria_Eberhardt_Machado_dos_Passos,Unidade-2_6100,Novos,1,2738,0,*
11:40:04.424 Vitoria_Eberhardt_Machado_dos_Passos,Unidade-2_6100,Novos,1,2965,0,*
11:40:04.424 Orestes_Novaes,Unidade-2_6100,Portabilidade,1,12724,0,*
11:40:04.424 Izadora_Elizabeth_Gama_dos_Santos,Unidade-2_5100,Portabilidade,1,13052,0,*
11:40:04.424 Lucas_Catania_Marques_De_Oliveira,Unidade-3_7100,Novos,1,9494,0,*
11:40:04.424 Lucas_Catania_Marques_De_Oliveira,Unidade-3_7100,Novos,1,9790,0,*
11:40:04.424 Lucas_Catania_Marques_De_Oliveira,Unidade-3_7100,Cartao,1,1822,0,*
11:40:04.425 Gabrielly_de_Lima_Barbosa,Unidade-3_7100,Novos,1,6662,0,*
11:40:04.425 Igor_De_Andrade_B._Mathias,Unidade-3_7100,Novos,1,11490,0,*
11:40:04.425 Igor_De_Andrade_B._Mathias,Unidade-3_7100,Novos,1,11478,0,*
11:40:04.425 Igor_De_Andrade_B._Mathias,Unidade-3_7100,Cartao,1,2139,0,*

Before that first line, the following content should have come through:

11:47:34.439 Quesia_Farias_da_Silva,Unidade-2_3100,Portabilidade,1,11946,0,*
11:47:34.439 Andre_Tavares_do_Nascimento,Unidade-2_3100,Portabilidade,1,11994,0,*
11:47:34.439 Tayna_Donofrio_Barbosa,Unidade-2_3100,Novos,1,735,0,*
11:47:34.439 Manoela_Lopes,Unidade-1_1100,Portabilidade,1,34839,0,*
11:47:34.440 Thaillyn_Tamires_da_Silva,Unidade-1_1100,Portabilidade,1,12594,0,*
11:47:34.440 Nayara_Brandao,Unidade-1_1100,Portabilidade,1,13787,0,*
11:47:34.440 Nayara_Brandao,Unidade-1_1100,Portabilidade,1,8274,0,*
11:47:34.440 Vanderleia_Souza,Unidade-1_1100,Portabilidade,1,14143,0,*

I don't know why the data is being truncated...

If I go to the same CSV file on the Filebeat host, edit it, and leave only the data that did not make it to the Logstash host, then after saving the file all the content reaches Logstash perfectly.
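
To narrow down whether the truncation happens before or after the beats input, one option is to temporarily point Filebeat at the console instead of Logstash and see exactly which lines it ships; a minimal sketch (only one output can be enabled at a time in filebeat.yml):

//
# temporary debug output in filebeat.yml: print every harvested event
# to stdout instead of sending it to Logstash
output.console:
  pretty: true
//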

Has anyone run into this situation before?
