How do I upload a JSON file via Logstash into Elasticsearch? What did I do wrong?

The JSON file:


[
  {
    "sku": "00000290",
    "categories": "1_brands,1_restaurant_equipment,2_brand_rm_gastro,2_food_holding_and_warming_equipment,3_steam_heaters_and_buffets,4_bain_marie_heaters,categoryE7E7163",
    "family": "TV",
    "box_height_estimate_boolean": "1"
  },
  {
    "sku": "00003165",
    "categories": "1_brands,1_spare_parts,2_brand_rm_gastro,2_product_type,3_planetary_mixers_spare_parts_and_accessories,4_panetary_mixer_bowls,categoryE7E7163",
    "family": "Blender",
    "box_height_estimate_boolean": "1"
  }
]
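Before pointing Logstash at the file, it can help to confirm the data parses as valid JSON. A minimal standalone check in Python (the array is embedded here with the long "categories" values trimmed; in practice you would `json.load()` the file at C:/json-file.json):

```python
import json

# The same array posted above, embedded for a quick standalone check
# (the "categories" fields are omitted here for brevity).
raw = """
[
  {"sku": "00000290", "family": "TV", "box_height_estimate_boolean": "1"},
  {"sku": "00003165", "family": "Blender", "box_height_estimate_boolean": "1"}
]
"""

docs = json.loads(raw)          # raises ValueError if the file is malformed
assert isinstance(docs, list)   # expect a top-level array of documents
print(f"parsed {len(docs)} documents")
for doc in docs:
    print(doc["sku"], doc["family"])
```

If this fails, the problem is in the file itself rather than in the Logstash pipeline.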

Logstash config:

input {
  file {
    type => "json"
    path => "C:/json-file.json"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "original-data"
    user => "elastic"
    password => "b==KITDnO6dxMapRJ7BH"
  }
  stdout {}
}

ERROR MESSAGE:
Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"original-data"}

Starting with version 8.x, the elasticsearch output defaults the data_stream parameter to auto. It sounds like it thinks this data should go to a data stream, but original-data is not set up that way in the cluster. Try setting data_stream to false (which was the default in 7.x). For example:

elasticsearch {
  hosts => "http://localhost:9200"
  index => "original-data"
  user => "elastic"
  password => "password_redacted"
  data_stream => false
}

Also, if you want to try another option, the "Upload File" integration in Kibana supports files up to 100 MB by default and can handle both JSON and CSV file types.
