Good day!
I'm a newbie to ELK and got stuck on a task that seemed simple at first sight, but that is not how it turned out.
What I am trying to do:
I have a table with test data in MS SQL Server, and I want to set up a data transfer from it into Elasticsearch via Logstash. I have already done this successfully, and the data reached the desired index in ES. But the more you get, the more you want: I then decided to define explicit mappings for the columns, to make sure the data from the source table arrives in the ES index in the desired format. To do this, I created a companies_list.mapping file with the following content:
{
  "companies_list": {
    "properties": {
      "id": {
        "index": "not_analyzed",
        "type": "string"
      },
      "number_of_employees": {
        "type": "long"
      },
      "name": {
        "index": "analyzed",
        "type": "string"
      },
      "description": {
        "index": "analyzed",
        "type": "string"
      },
      "homepage": {
        "index": "not_analyzed",
        "type": "string"
      },
      "location": {
        "geohash": true,
        "type": "geo_point"
      }
    }
  }
}
Then I try to upload the mappings from this file into ES using cURL. The sequence of my actions is:
a. Delete all indices in ES with the command:
curl -X DELETE "http://localhost:9220/_all"
Result:
{"acknowledged":true}
b. Create the index in ES:
curl -X PUT http://localhost:9220/companies_list
Result:
{"acknowledged":true}
c. Try to PUT the mappings from the file:
curl -X PUT "http://localhost:9220/companies_list/_mapping/companies_list" -d "D:\ELK\mappings\companies_list.mapping"
Result:
{"error":{"root_cause":[{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}],"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"},"status":500}
I have googled this error but have not found an answer. Any suggestions?