Data cannot be indexed

I am trying to play with the toy data example from the book, but I cannot load the data because of the following problem.

[2018-06-26T11:58:16,824][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"accidents-2013", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2db31fdc>], :response=>{"index"=>{"_index"=>"accidents-2013", "_type"=>"doc", "_id"=>"yWLRPGQBGV7OARY3Aw30", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [accidents-2013] as the final mapping would have more than 1 type: [doc, accident]"}}}}
[2018-06-26T11:58:16,824][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"accidents-2012", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x6b51130b>], :response=>{"index"=>{"_index"=>"accidents-2012", "_type"=>"doc", "_id"=>"ymLRPGQBGV7OARY3Aw30", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [accidents-2012] as the final mapping would have more than 1 type: [doc, accident]"}}}}
[2018-06-26T11:58:16,824][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"accidents-2012", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0xa63a0ee>], :response=>{"index"=>{"_index"=>"accidents-2012", "_type"=>"doc", "_id"=>"y2LRPGQBGV7OARY3Aw30", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [accidents-2012] as the final mapping would have more than 1 type: [doc, accident]"}}}}

See the link describing this issue:

Indices created in 6.x are only allowed to have one document type per index. Your logstash configuration is trying to create two types: doc and accident. You need to change it so there is only one type for both data sources, or put each data source in their own index.
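A quick way to see where the second type comes from is to ask Elasticsearch for the mapping of one of the rejected indices. A sketch, using an index name from the log above, runnable from Kibana Dev Tools:

GET accidents-2013/_mapping

If the response already lists a type such as accident while Logstash is writing with _type doc (as in the warnings above), a 6.x index will reject the mapping update.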

More details here:

I do not think my data has two types.

Could you kindly check my filter and my input/output methods?
The first picture is my data and the second is the JSON form of the data.
The third picture is my input method and the last is the filter.

Did you print it and then take a picture? Unbelievable!

Please don't post images of text, as they are hard to read and not searchable.

Instead, paste the text and format it with the </> icon. Check the preview window.


Hello, here is the code.
Actually, the example is from the book Learning Kibana 5.0 by Bahaaldine Azarmi, from Elastic.
Maybe the version difference (5.0 for the book vs. 6.3 on my local machine) prevents me from following the example. However, please help me do the right thing for 6.3 with this example. Thank you!!

20/04/2012 16:05,20/04/2012,16:05,75,111,"172, RUE DE LA ROQUETTE",,1_75111_10314,,"172, RUE DE LA ROQUETTE, 75011 Paris",RUE AWAY,,,Cond,Injured,RESPONSIBLE,,,,,,,,,,"172, RUE DE LA ROQUETTE, 75011

{
 "Address": "172, RUE DE LA ROQUETTE",
 "Zip code": null,
 "Dept": "75",
 "Person 2 Tag": null,
 "Segment": null,
 "Corner": "1_75111_10314",
 "Person 1 Category": "Cond",
 "involvedCount": "2",
 "Person 4 Cat": null,
 "season": "spring",
 "periodOfDay": "afternoon",
 "Person 3 Tag": null,
 "timestamp": "20/04/2012 16:05",
 "Com": "111",
 "Person 2 Category": null,
 "Person Tag": "RESPONSIBLE",
 "Vehicle 2 Description": "Car",
 "Hour": "16:05",
 "Vehicle 3 Description": null,
 "Person 3 Cat": null,
 "Address2": "RUE MERLIN",
 "Address1": "172, RUE DE LA ROQUETTE, 75011 Paris",
 "Person 4 Tag": null,
 "Date": "20/04/2012",
 "Vehicle 2": "RUN AWAY",
 "Vehicle 3": null,
 "Vehicle 1": "RESPONSIBLE",
 "Vehicle 1 description": "Motor Scooter",
 "fullAddress": "172, RUE DE LA ROQUETTE, 75011 Paris",
 "Person 2 Status": null,
 "location": {
  "lon": "2.3862735",
  "lat": "48.8591106"
 },
 "Person 4 Status": null,
 "Person 1 Status": "Injured",
 "Person 3 Status": null
}

#input method

input {
  file {
    path => "/path/to/accidents/files/directory/accident*"
    type => "accident"
    start_position => "beginning"
  }
}
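One thing to keep in mind with the file input (not shown above): Logstash records how far it has read each file in a sincedb, so re-running the pipeline will not re-read a file from the beginning. A sketch for forcing a full re-read while testing, assuming a Windows machine as in the logs below ("NUL" is the Windows equivalent of /dev/null; use "/dev/null" on Linux/macOS):

input {
  file {
    path => "/path/to/accidents/files/directory/accident*"
    type => "accident"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}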

#output method

output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "accidents-%{+YYYY}"
    user => "elastic"
    password => "changeme"
    template => "/path_to_template/template.json"
    template_overwrite => true
  }
}

Finally, the filter:

filter {
  csv {
    separator => ","
    columns => ["timestamp","Date","Hour","Dept","Com","Address","Zip code","Corner","Segment","Address1","Address2","Vehicle 1 description","Vehicle 1","Vehicle 2 Description","Vehicle 2","Vehicle 3 Description","Vehicle 3","Person 1 Category","Person 1 Status","Person Tag","Person 2 Category","Person 2 Status","Person 2 Tag","Person 3 Cat","Person 3 Status","Person 3 Tag","Person 4 Cat","Person 4 Status","Person 4 Tag"]
  }
  if ([Corner] == "Corner") {
    drop { }
  }
  date {
    match => [ "timestamp", "dd/MM/YYYY HH:mm" ]
    target => "@timestamp"
    locale => "fr"
    timezone => "Europe/Paris"
  }
  mutate {
    convert => [ "latitude", "float" ]
    convert => [ "longitude", "float" ]
    rename => [ "longitude", "[location][lon]", "latitude", "[location][lat]" ]
  }
}
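As a side note, the dd/MM/YYYY HH:mm pattern in the date filter matches timestamps like 20/04/2012 16:05 from the data. If date parsing ever fails, a quick way to sanity-check the pattern outside Logstash is the equivalent strptime format (a sketch in Python; %d/%m/%Y %H:%M mirrors the Joda pattern used by the date filter):

```python
from datetime import datetime

# Sample timestamp from the accident CSV shown above
raw = "20/04/2012 16:05"

# Logstash's "dd/MM/YYYY HH:mm" corresponds to strptime "%d/%m/%Y %H:%M"
parsed = datetime.strptime(raw, "%d/%m/%Y %H:%M")
print(parsed.isoformat())  # → 2012-04-20T16:05:00
```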

Please format your code, logs, or configuration files using the </> icon as explained in this guide, and not the citation button. It will make your post more readable.

thank you.

Let me know if you need any more information to get this topic solved.

Great. It would be even more readable if you correctly indented your code. The preview window is also helpful.

I tidied it up. Can you see the problem here regarding the type issue?

You need to change the doc type in the elasticsearch output. See

I'm moving your post to #logstash

I deleted the type for Kibana 6.0 and tried again.
Now I have this result and am still not able to import data from the CSV file.
Could you help me?

input {
  file {
    path => "/path/to/accidents/files/directory/accident*"

**Deleted this row: type => "accident"**

    start_position => "beginning"
  }
}

And this is the new log:

 C:\Program Files\logstash-6.3.0\bin>logstash -f csv_to_es.conf
    Sending Logstash's logs to C:/Program Files/logstash-6.3.0/logs which is now configured via
    [2018-07-12T18:10:46,610][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2018-07-12T18:10:47,109][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.3.0"}
    [2018-07-12T18:10:50,689][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
    [2018-07-12T18:10:51,067][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@localhost:9200/]}}
    [2018-07-12T18:10:51,085][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@localhost:9200/, :path=>"/"}
    [2018-07-12T18:10:51,301][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@localhost:9200/"}
    [2018-07-12T18:10:51,363][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
    [2018-07-12T18:10:51,367][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
    [2018-07-12T18:10:51,379][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>"/Program Files/logstash-6.3.0/config/template.json"}
    [2018-07-12T18:10:51,393][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"accident*", "mappings"=>{"accident"=>{"properties"=>{"location"=>{"type"=>"geo_point"}, "involvedCount"=>{"type"=>"double"}}}}}}
    [2018-07-12T18:10:51,423][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
    [2018-07-12T18:10:51,496][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
    [2018-07-12T18:10:52,674][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x31ac7ad0 run>"}
    [2018-07-12T18:10:52,750][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
    [2018-07-12T18:10:53,112][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

And this is a Dev Tools command in Kibana.
The other settings are the same as in the previous thread. Thank you.

GET accident*/_count

{
  "count": 0,
  "_shards": {
    "total": 0,
    "successful": 0,
    "skipped": 0,
    "failed": 0
  }
}
@Yong_Rhee, the other "type" is being set in the template

template => "/path_to_template/template.json"

I was getting the same error in the logs

{
  "template" : "accident*",
  "mappings" : {
    "accident": {
      "properties": {
        "location": { "type": "geo_point" },            <---------
        "involvedCount": { "type": "double" }           <---------
      }
    }
  }
}

Comment them out and try to import the data again.

Then delete the indices and try again.
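For reference, with those typed mappings removed, a minimal template file would look something like this (a sketch only; note that without the mapping, location will no longer be indexed as a geo_point, so the book's map visualizations may need the mapping added back under a single type):

{
  "template" : "accident*",
  "mappings" : { }
}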

Output after:

GET accident*/_count

{
  "count": 13629,  <----
  "_shards": {
    "total": 10,
    "successful": 10,
    "skipped": 0,
    "failed": 0
  }
}

I am on the latest 6.x version. Good luck with the tutorial!


Please refer to:

{
  "index_patterns" : "accident*",
  "mappings" : {
    "doc": {         <------- rename the 'accident' type to 'doc'
      "properties" : {
        "location": { "type": "geo_point" },
        "involvedCount": { "type": "double" }
      }
    }
  }
}
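After editing the file, you can check what Logstash actually installed; by default it registers the template under the name logstash, as the startup log above ("Installing elasticsearch template to _template/logstash") shows. A sketch, from Kibana Dev Tools:

GET _template/logstash

The response should now show doc as the single mapping type.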

Thank you. Very helpful!!!

Thank you too.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.