I'm trying to import a large JSON document into Elasticsearch 5.1. A small section of the data looks like this:
[
  {
    "id": 1,
    "region": "ca-central-1",
    "eventName": "CreateRole",
    "eventTime": "2016-02-04T03:41:19.000Z",
    "userName": "email@group.com"
  },
  {
    "id": 2,
    "region": "ca-central-1",
    "eventName": "AddRoleToInstanceProfile",
    "eventTime": "2016-02-04T03:41:19.000Z",
    "userName": "email@group.com"
  },
  {
    "id": 3,
    "region": "ca-central-1",
    "eventName": "CreateInstanceProfile",
    "eventTime": "2016-02-04T03:41:19.000Z",
    "userName": "email@group.com"
  },
  {
    "id": 4,
    "region": "ca-central-1",
    "eventName": "AttachGroupPolicy",
    "eventTime": "2016-02-04T01:42:36.000Z",
    "userName": "email@group.com"
  },
  {
    "id": 5,
    "region": "ca-central-1",
    "eventName": "AttachGroupPolicy",
    "eventTime": "2016-02-04T01:39:20.000Z",
    "userName": "email@group.com"
  }
]
I'd like to import the data without making any changes to the source file if possible, which I believe rules out the _bulk API, since I'd need to add an action metadata line before each entry.
I've tried several different methods but have not had any luck. Am I wasting my time trying to import this document as-is?
I've tried:
curl -XPOST 'demo.ap-southeast-2.es.amazonaws.com/rea/test' --data-binary @Records.json
But that fails with the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "failed to parse"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "failed to parse",
    "caused_by": {
      "type": "not_x_content_exception",
      "reason": "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"
    }
  },
  "status": 400
}
Thanks!