Hello!
This topic may be a duplicate, but I couldn't find anything appropriate, so please bear with me on this one.
So anyway, how do I import a JSON file into Elasticsearch from the command line?
I was trying something along the lines of:
curl -X POST 'localhost:9200/sample_data/data/1?pretty' -H 'Content-type: application/json' --data-binary @sample.json
But I get a weird parsing error somehow :x
"type" : "mapper_parsing_exception",
"reason" : "failed to parse",
"caused_by" : {
"type" : "json_parse_exception",
"reason" : "Unexpected character (',' (code 44)): expected a value\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@5a0f3cad; line: 6, column: 3]"
}
Anyway, here's the corresponding JSON, if it helps:
{
"type":"train",
"name":"T01",
"state":"inactive",
"time":"2019-12-20 08:48:12"
},
{
"type":"train",
"name":"T02",
"state":"active",
"time":"2019-12-20 08:48:12"
}
It's very simple, and the format looks good to me... So yeah, any help would be greatly appreciated.
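For what it's worth, a quick Python check suggests the payload above isn't actually valid JSON on its own (two top-level objects separated by a comma, with no enclosing array), which would line up with the json_parse_exception:

```python
import json

# Same shape as my file: two comma-separated top-level objects, no [ ] around them
payload = '''
{
"type":"train",
"name":"T01",
"state":"inactive",
"time":"2019-12-20 08:48:12"
},
{
"type":"train",
"name":"T02",
"state":"active",
"time":"2019-12-20 08:48:12"
}
'''

try:
    json.loads(payload)
    print("valid JSON")
except json.JSONDecodeError as e:
    # The parser accepts the first object, then chokes at the comma
    print("invalid JSON:", e.msg)
```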
EDIT: I also tried the following format (the previous one was actually invalid JSON, silly me; I was trying to copy an NDJSON example):
[
{
"type":"train",
"name":"T01",
"state":"inactive",
"time":"2019-12-20 08:48:12"
},
{
"type":"train",
"name":"T02",
"state":"active",
"time":"2019-12-20 08:48:12"
}
]
but to no avail:
"type" : "mapper_parsing_exception",
"reason" : "failed to parse",
"caused_by" : {
"type" : "not_x_content_exception",
"reason" : "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"
}
Cordially,
Mox