Missing fields in Kibana

Hello,
After importing my CSV data into ES, I searched for my fields in Dev Tools but couldn't find them all.
This is my conf file:

input {
  file {
    path => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => [ "Message",
                 "Time",
                 "Distance",
                 "Longitude",
                 "Latitude",
                 "NemoEvent_GPRS_DataConnectionSuccess_DAC",
                 "NemoEvent_GPRS_DataConnectionAttempt_DAA",
                 "NemoEvent_GPRS_DataDisconnect_DAD"
    ]
    separator => ","
  }
  mutate { convert => ["Longitude", "float"] }
  mutate { convert => ["Latitude", "float"] }
  mutate { convert => ["NemoEvent_GPRS_DataConnectionSuccess_DAC", "integer"] }
  mutate { convert => ["NemoEvent_GPRS_DataConnectionAttempt_DAA", "integer"] }
  mutate { convert => ["NemoEvent_GPRS_DataDisconnect_DAD", "integer"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["http://localhost:9200/"]
    index => "data-index-1"
    document_type => "data"
  }
  stdout { }
}

This is what I typed in Dev Tools:

GET /data-index-1
{
  "query": {
        "match_all": {}
    }
}

And I only get ("Message", "Time", "Distance", "Longitude"):

{
  "data-index-1" : {
    "aliases" : { },
    "mappings" : {
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "Distance" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "Longitude" : {
          "type" : "float"
        },
        "Message" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "Time" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "host" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "message" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "path" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        }
      }
    },
    "settings" : {
      "index" : {
        "creation_date" : "1586821638734",
        "number_of_shards" : "1",
        "number_of_replicas" : "1",
        "uuid" : "H4ZccR5BSFWyeD-aKzCgTA",
        "version" : {
          "created" : "7060299"
        },
        "provided_name" : "data-index-1"
      }
    }
  }
}

The other fields do not exist.
Could anyone tell me where the problem is and how I can solve it, please?

Hello @inchirah

I would suggest taking a look at the stdout output (even better, use stdout { codec => rubydebug }) to check if the fields are correctly extracted from the csv file.
One or 2 samples would help us understand the problem.
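
For example, a minimal sketch of the output section with rubydebug next to your existing elasticsearch output (settings copied from your conf file):

output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "data-index-1"
  }
  # rubydebug prints every event with all its fields to the console
  stdout { codec => rubydebug }
}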

The mappings do not contain the NemoEvent* fields.

I don't know if it is a typo, but GET /data-index-1 returns the index mapping and settings, not documents. To search the documents you should use:

GET /data-index-1/_search
{
  "query": {
        "match_all": {}
    }
}

As a side note: if you're interested in using the Latitude and Longitude fields as geo points, please use an Index Template and prepare them in the correct format to be indexed as such (see documentation).
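
For example, a minimal sketch (the template name and the location field are hypothetical, adapt the index pattern to yours). First create a legacy index template in Dev Tools that maps a location field as geo_point; it must exist before the index is created:

PUT _template/data-geo-template
{
  "index_patterns": ["data-index-*"],
  "mappings": {
    "properties": {
      "location": { "type": "geo_point" }
    }
  }
}

Then, in the Logstash filter, build that field from your converted coordinates:

mutate {
  # "location" is a hypothetical field name matching the template above
  rename => {
    "Latitude"  => "[location][lat]"
    "Longitude" => "[location][lon]"
  }
}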

Thank you for your answer @Luca_Belluccini,
I did as you suggested to get all my fields in Kibana, but I still have the same problem. Even when I use stdout { codec => rubydebug } I get this:

C:\elastic_stack\logstash-7.6.2>.\bin\logstash -f C:\Users\Asus\Dropbox\PFE_part2\data_logstash_configuration.conf
Sending Logstash logs to C:/elastic_stack/logstash-7.6.2/logs which is now configured via log4j2.properties
[2020-04-15T18:38:43,893][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-04-15T18:38:44,240][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.2"}
[2020-04-15T18:38:47,121][INFO ][org.reflections.Reflections] Reflections took 55 ms to scan 1 urls, producing 20 keys and 40 values
[2020-04-15T18:38:48,908][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch action=>"index", index=>"data-index-1", id=>"41476708e9ae7313737707a531bc921b19e348e444131550b9d0c5b20b53d35a", hosts=>[http://localhost:9200/], document_type=>"data", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_cfab4b7f-093a-45a6-b9df-8431d118db22", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-04-15T18:38:51,832][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-04-15T18:38:52,073][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-04-15T18:38:52,243][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-04-15T18:38:52,250][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-04-15T18:38:52,398][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200/"]}
[2020-04-15T18:38:52,502][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-04-15T18:38:52,556][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-04-15T18:38:52,563][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/Users/Asus/Dropbox/PFE_part2/data_logstash_configuration.conf"], :thread=>"#<Thread:0x52e202a9 run>"}
[2020-04-15T18:38:52,693][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-04-15T18:38:55,106][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/elastic_stack/logstash-7.6.2/data/plugins/inputs/file/.sincedb_0f3dac23b6abeb82f67df9f70c4a785f", :path=>["C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv"]}
[2020-04-15T18:38:55,146][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-04-15T18:38:55,204][INFO ][filewatch.observingtail  ][main] START, creating Discoverer, Watch with file and sincedb collections
[2020-04-15T18:38:55,236][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-04-15T18:38:55,827][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

and when I type:

GET /data-index-1/_search
{
  "query": {
        "match_all": {}
    }
}

I only get (Message, Time, Distance, Longitude).
Could you please help me solve this problem?

I cannot see any event in the logs containing your CSV data and I cannot see any major problem in the logs you've shared.

Could you please test the following pipeline and share the logs?

input {
  file {
    path => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv"
    start_position => "beginning"
    sincedb_path => "NUL" # this will force the file to be read
  }
}
filter {
  csv {
    columns => [ "Message",
                 "Time",
                 "Distance",
                 "Longitude",
                 "Latitude",
                 "NemoEvent_GPRS_DataConnectionSuccess_DAC",
                 "NemoEvent_GPRS_DataConnectionAttempt_DAA",
                 "NemoEvent_GPRS_DataDisconnect_DAD"
    ]
    separator => ","
    convert => {
      "Longitude" => "float"
      "Latitude" => "float"
      "NemoEvent_GPRS_DataConnectionSuccess_DAC" => "integer"
      "NemoEvent_GPRS_DataConnectionAttempt_DAA" => "integer"
      "NemoEvent_GPRS_DataDisconnect_DAD" => "integer"
    }
  }
}
output {
  stdout { codec => rubydebug }
}

You can run it with:

.\bin\logstash -f C:\Users\Asus\Dropbox\PFE_part2\data_logstash_configuration.conf --log.level=debug

A sample of the first 3 lines of the CSV file might help.

Thank you for the reply @Luca_Belluccini.
I get this when I run your command:

      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;1\\r",
       "message" => "76710;20/f\\x82vr/19 18:31:26;0,665122;-3,66247;5,49975;0;0;1\\r",
    "@timestamp" => 2020-04-17T02:09:30.233Z
}
[2020-04-17T03:09:32,290][DEBUG][logstash.filters.csv     ][main] Event after csv filter {:event=>#<LogStash::Event:0x5b45191a>}
{
       "Message" => "76737;20/f\\x82vr/19 18:31:29;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "76737;20/f\\x82vr/19 18:31:29;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.240Z
}
[2020-04-17T03:09:32,297][DEBUG][logstash.filters.csv     ][main] Running csv filter {:event=>#<LogStash::Event:0x4c76300f>}
{
       "Message" => "76752;20/f\\x82vr/19 18:31:32;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "76752;20/f\\x82vr/19 18:31:32;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.253Z
}
[2020-04-17T03:09:32,297][WARN ][logstash.codecs.plain    ][main] Received an event that has a different character encoding than you configured. {:text=>"84434;20/f\\x82vr/19 18:35:39;1,442463;-3,66247;5,49975;0;0;0\\r", :expected_charset=>"UTF-8"}
{
       "Message" => "76876;20/f\\x82vr/19 18:31:35;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "76876;20/f\\x82vr/19 18:31:35;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.267Z
}
{
       "Message" => "77022;20/f\\x82vr/19 18:31:38;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77022;20/f\\x82vr/19 18:31:38;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.277Z
}
[2020-04-17T03:09:32,299][DEBUG][logstash.inputs.file     ][main] Received line {:path=>"C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv", :text=>"84466;20/f\x82vr/19 18:35:40;1,442463;-3,66247;5,49975;0;0;0\r"}
{
       "Message" => "77125;20/f\\x82vr/19 18:31:41;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77125;20/f\\x82vr/19 18:31:41;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.292Z
}
[2020-04-17T03:09:32,299][DEBUG][logstash.filters.csv     ][main] Event after csv filter {:event=>#<LogStash::Event:0x4c76300f>}
{
       "Message" => "77135;20/f\\x82vr/19 18:31:44;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77135;20/f\\x82vr/19 18:31:44;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.318Z
}
[2020-04-17T03:09:32,300][WARN ][logstash.codecs.plain    ][main] Received an event that has a different character encoding than you configured. {:text=>"84466;20/f\\x82vr/19 18:35:40;1,442463;-3,66247;5,49975;0;0;0\\r", :expected_charset=>"UTF-8"}
{
       "Message" => "77212;20/f\\x82vr/19 18:31:47;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;1;0\\r",
       "message" => "77212;20/f\\x82vr/19 18:31:47;0,665122;-3,66247;5,49975;0;1;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.332Z
}
[2020-04-17T03:09:32,300][DEBUG][logstash.filters.csv     ][main] Running csv filter {:event=>#<LogStash::Event:0x7a413a09>}
{
       "Message" => "77380;20/f\\x82vr/19 18:31:50;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77380;20/f\\x82vr/19 18:31:50;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.344Z
}
[2020-04-17T03:09:32,301][DEBUG][logstash.inputs.file     ][main] Received line {:path=>"C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv", :text=>"84514;20/f\x82vr/19 18:35:41;1,442463;-3,66247;5,49975;0;0;0\r"}
{
       "Message" => "77427;20/f\\x82vr/19 18:31:53;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77427;20/f\\x82vr/19 18:31:53;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.369Z
}
[2020-04-17T03:09:32,303][DEBUG][logstash.filters.csv     ][main] Event after csv filter {:event=>#<LogStash::Event:0x7a413a09>}
[2020-04-17T03:09:32,303][WARN ][logstash.codecs.plain    ][main] Received an event that has a different character encoding than you configured. {:text=>"84514;20/f\\x82vr/19 18:35:41;1,442463;-3,66247;5,49975;0;0;0\\r", :expected_charset=>"UTF-8"}
{
       "Message" => "77434;20/f\\x82vr/19 18:31:56;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77434;20/f\\x82vr/19 18:31:56;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.378Z
}
[2020-04-17T03:09:32,309][DEBUG][logstash.filters.csv     ][main] Running csv filter {:event=>#<LogStash::Event:0x2aefdd50>}
{
       "Message" => "77558;20/f\\x82vr/19 18:31:59;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77558;20/f\\x82vr/19 18:31:59;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.389Z
}
[2020-04-17T03:09:32,315][DEBUG][logstash.inputs.file     ][main] Received line {:path=>"C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv", :text=>"84572;20/f\x82vr/19 18:35:42;1,442463;-3,66247;5,49975;0;0;0\r"}
{
       "Message" => "77665;20/f\\x82vr/19 18:32:02;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77665;20/f\\x82vr/19 18:32:02;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.400Z
}
[2020-04-17T03:09:32,317][DEBUG][logstash.filters.csv     ][main] Event after csv filter {:event=>#<LogStash::Event:0x2aefdd50>}
{
       "Message" => "77779;20/f\\x82vr/19 18:32:05;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77779;20/f\\x82vr/19 18:32:05;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.414Z
}
{
       "Message" => "77931;20/f\\x82vr/19 18:32:08;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "77931;20/f\\x82vr/19 18:32:08;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.419Z
}
{
       "Message" => "78078;20/f\\x82vr/19 18:32:11;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "78078;20/f\\x82vr/19 18:32:11;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.451Z
}
[2020-04-17T03:09:32,317][WARN ][logstash.codecs.plain    ][main] Received an event that has a different character encoding than you configured. {:text=>"84572;20/f\\x82vr/19 18:35:42;1,442463;-3,66247;5,49975;0;0;0\\r", :expected_charset=>"UTF-8"}
[2020-04-17T03:09:32,317][DEBUG][logstash.filters.csv     ][main] Running csv filter {:event=>#<LogStash::Event:0x55e5fae7>}
{
       "Message" => "78211;20/f\\x82vr/19 18:32:14;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "78211;20/f\\x82vr/19 18:32:14;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.469Z
}
[2020-04-17T03:09:32,321][DEBUG][logstash.inputs.file     ][main] Received line {:path=>"C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv", :text=>"84622;20/f\x82vr/19 18:35:43;1,442463;-3,66247;5,49975;0;0;0\r"}

{
       "Message" => "78432;20/f\\x82vr/19 18:32:23;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "78432;20/f\\x82vr/19 18:32:23;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.538Z
}
{
       "Message" => "78475;20/f\\x82vr/19 18:32:26;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "78475;20/f\\x82vr/19 18:32:26;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.562Z
}
[2020-04-17T03:09:32,333][DEBUG][logstash.inputs.file     ][main] Received line {:path=>"C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv", :text=>"84642;20/f\x82vr/19 18:35:44;1,664853;-3,66247;5,49975;0;0;0\r"}
{
       "Message" => "78491;20/f\\x82vr/19 18:32:29;0",
          "host" => "DESKTOP-34IP2OB",
          "Time" => "665122;-3",
          "path" => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv",
      "@version" => "1",
      "Distance" => "66247;5",
     "Longitude" => "49975;0;0;0\\r",
       "message" => "78491;20/f\\x82vr/19 18:32:29;0,665122;-3,66247;5,49975;0;0;0\\r",
    "@timestamp" => 2020-04-17T02:09:30.574Z
}


[2020-04-17T03:09:32,334][DEBUG][logstash.filters.csv     ][main] Running csv filter {:event=>#<LogStash::Event:0x2379db92>}
[2020-04-17T03:09:32,336][DEBUG][logstash.inputs.file     ][main] Received line {:path=>"C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv", :text=>"84646;20/f\x82vr/19 18:35:45;1,664853;-3,66247;5,49975;0;0;0\r"}
....

I think we still have the same problem: again, not all the fields are loaded. I also have a question, if I may: you didn't specify the name of the index in the config file, so how will I find the data in Kibana without the index name?

Hello,
My pipeline doesn't write anything to Elasticsearch.
It just writes the events to the logs so we can check what is being parsed by the pipeline and troubleshoot your issue.

I see some problems:

  1. Logstash expects the input file to be encoded in UTF-8, but the csv file is not UTF-8 (see the Time field: it contains 20/f\\x82vr/19 18:31:32). What is the encoding of the csv file?
    If you know it, you need to specify it in the file input.
    For example codec => plain { charset => "CP1252" } to specify the CP1252 character set.
    For the list of supported charsets, see here.
  2. The separator in the csv filter should be ; and not ,

Retry with the following and share the logs again, plus, if possible, 1 or 2 lines of the original csv file.

input {
  file {
    path => "C:/Users/Asus/Dropbox/PFE_part2/MOOV_ALEPE_Data.csv"
    start_position => "beginning"
    sincedb_path => "NUL" # this will force the file to be read
    codec => plain { charset => "CP1252" }
  }
}
filter {
  csv {
    columns => [ "Message",
                 "Time",
                 "Distance",
                 "Longitude",
                 "Latitude",
                 "NemoEvent_GPRS_DataConnectionSuccess_DAC",
                 "NemoEvent_GPRS_DataConnectionAttempt_DAA",
                 "NemoEvent_GPRS_DataDisconnect_DAD"
    ]
    separator => ";"
    convert => {
      "Longitude" => "float"
      "Latitude" => "float"
      "NemoEvent_GPRS_DataConnectionSuccess_DAC" => "integer"
      "NemoEvent_GPRS_DataConnectionAttempt_DAA" => "integer"
      "NemoEvent_GPRS_DataDisconnect_DAD" => "integer"
    }
  }
}
output {
    stdout { codec => rubydebug }
}
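
Once the events look correct in the console, you can put your elasticsearch output back next to stdout (a sketch reusing the index name from your original conf file), and the data will then be searchable in Kibana under that index:

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    # same index name as in your original configuration
    index => "data-index-1"
  }
}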

Hello @Luca_Belluccini.
With your modifications, every field gets its value correctly.
Thank you so much. 🙂
