Error: "type" : "json_parse_exception"


(Nasos) #1

Hi all,

I am new to Elastic and I want to visualize data in Kibana from a CSV file.

The data (kibana_test1.csv) are in the following form:

user_id,item_id,event_type,user_location,user_favorite_brand,item_brand
278673,234,purchase,City,Nokia,Samsung
278674,235,purchase,City,Nokia,Samsung
278675,236,purchase,City,Nokia,Samsung
278676,237,purchase,City,Nokia,Samsung
278677,238,purchase,City,Nokia,Samsung
278678,239,purchase,City,Nokia,Samsung
278679,240,purchase,City,Nokia,Samsung
278680,241,purchase,City,Nokia,Samsung
278681,242,purchase,City,Nokia,Samsung
278682,243,purchase,City,Nokia,Samsung
278683,244,purchase,City,Nokia,Samsung
278684,245,purchase,City,Nokia,Samsung

My Logstash configuration file (logstash-simple.conf) is:

input {
  file {
    path => "/home/nasos/Documents/kibana_test1.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["user_id","item_id","event_type","user_location","user_favorite_brand","item_brand"]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "data"
    user => "elastic"
    password => "My_password"
  }
  stdout { codec => rubydebug }
}
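One gotcha with this input worth noting: the file input records how far it has read in a sincedb file, so a CSV that was already read once is not re-ingested on later runs, even with start_position => "beginning". While testing, pointing sincedb_path at /dev/null forces a re-read on every run (a sketch of the input block only, using the same path as above):

```
input {
  file {
    path => "/home/nasos/Documents/kibana_test1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # testing only: forget the read position between runs
  }
}
```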

I have used the password for the elastic user that was generated by the command:
bin/x-pack/setup-passwords auto

I have created an index in Kibana- Dev Tools:

PUT /kibana_test1
{
  "mappings": {
    "doc": {
      "properties": {
        "user_id": {"type": "integer"},
        "item_id": {"type": "integer"},
        "event_type": {"type": "keyword"},
        "user_location": {"type": "keyword"},
        "user_favorite_brand": {"type": "keyword"},
        "item_brand": {"type": "keyword"}
      }
    }
  }
}

and I ran:

sudo ./logstash -f logstash-simple.conf --path.settings=/etc/logstash

and I got the following message:

Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties

It seems to work fine so far, doesn't it?

When I used the Elasticsearch bulk API to load the data set:

curl -H 'Content-Type: application/x-ndjson' -XPOST -u elastic 'localhost:9200/kibana_test1/doc/_bulk?pretty' --data-binary @kibana_test1.csv

I got the following error:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "json_parse_exception",
        "reason" : "Unrecognized token 'user_id': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@4df98756; line: 1, column: 9]"
      }
    ],
    "type" : "json_parse_exception",
    "reason" : "Unrecognized token 'user_id': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@4df98756; line: 1, column: 9]"
  },
  "status" : 500
}

Could you please help?
Thx


(Bhavya R M) #2

Hi,

A couple of things are happening here:

  1. This command works only for uploading a .json file, and you are using it to load a .csv file:
curl -H 'Content-Type: application/x-ndjson' -XPOST -u elastic 'localhost:9200/kibana_test1/doc/_bulk?pretty' --data-binary @kibana_test1.csv
  2. You can ingest data into Elasticsearch either with Logstash or with a bulk upload like the command above. You don't need to do both. If your Logstash config is successful, then that should be loading your data into Elasticsearch.
    Can you check http://localhost:9200/_cat/indices (you have to use your ES IP) and then check whether you have the data? I also think you can just do a bulk upload of your data by making sure your file is in .json format.
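The conversion in point 2 can be sketched concretely. The _bulk endpoint expects NDJSON: for each document, one action line (here a bare index action) followed by one source line. A minimal Python sketch, assuming the field names from the CSV header in the thread (`csv_rows_to_bulk` is a hypothetical helper name, not an Elastic API):

```python
import csv
import io
import json

def csv_rows_to_bulk(lines):
    """Yield _bulk body lines: an action line, then a JSON source line, per CSV row."""
    for row in csv.DictReader(lines):
        # Match the integer mapping from the PUT /kibana_test1 request.
        row["user_id"] = int(row["user_id"])
        row["item_id"] = int(row["item_id"])
        yield '{"index":{}}'   # index action; index/type come from the request URL
        yield json.dumps(row)

# Demo on one row of the data from the thread.
sample = io.StringIO(
    "user_id,item_id,event_type,user_location,user_favorite_brand,item_brand\n"
    "278673,234,purchase,City,Nokia,Samsung\n"
)
body = "\n".join(csv_rows_to_bulk(sample)) + "\n"  # _bulk requires a trailing newline
print(body)
```

The resulting file can then be sent with the same curl command as above (the Content-Type: application/x-ndjson header it already uses is the right one); note that the body must end with a newline or _bulk rejects the last document.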

Thanks,
Bhavya


(Nasos) #3

Hello Bhavya,

Thank you for your prompt response. I have some questions about your comments:

  1. Shall I use JSON format to load data using the Elasticsearch bulk API (curl -H .....)?
  2. Also, I used Logstash to ingest data into Elasticsearch. I see the index in the indices list (_cat/indices):

yellow open kibana_test1 ySFDAm3UR0ymEyUXxEaBRQ 5 1 0 0 1.2kb 1.2kb

but I cannot see it under "Set Up Index Patterns" in Kibana. Shall I set up a mapping for the data in the CSV file?

Please find my logstash-plain.log below:

[2018-06-05T10:48:46,353][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-06-05T10:48:46,353][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-06-05T10:48:46,354][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-06-05T10:48:46,355][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "map$
[2018-06-05T10:48:46,357][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-06-05T10:48:46,386][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x77ebf0fb @metric_events_out=org.jruby.proxy.org.logstash.instr$
[2018-06-05T10:48:46,387][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{COSMOBILECOMMON} not defined>, :backtrace=>["/usr/sha$
[2018-06-05T10:48:46,390][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAct$
[2018-06-05T10:48:46,396][INFO ][logstash.inputs.metrics  ] Monitoring License OK
[2018-06-05T10:48:47,728][INFO ][logstash.pipeline        ] Pipeline has terminated {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x41591aa@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb$

Thank you.

BR,
Nasos


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.