Hi,
I'm very new to Elastic (and Ansible) and am currently trying to store Ansible facts in Elasticsearch by POST-ing the JSON results from Ansible to Elastic. I keep running into 'mapper ... of different type' exceptions.
I've made sure I'm starting from a clean slate:
curl -XDELETE https://example.tld/facts
Posting the Ansible facts to this empty(!) ES index gives me:
illegal_argument_exception
mapper [ansible_python.version_info] of different type, current_type [text], merged_type [long]
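If it helps, I think I see where the mixed types come from: ansible_python.version_info mirrors Python's sys.version_info, which is a list whose elements are not all the same type. A minimal illustration (the values are just examples from one of my hosts, not anything special):

```python
import json

# ansible_python.version_info mirrors sys.version_info: (major, minor,
# micro, releaselevel, serial) -- releaselevel is a string like "final",
# the rest are numbers, so the list mixes types.
version_info = [2, 7, 5, "final", 0]

print(json.dumps(version_info))
# Collect the distinct JSON-level types in the list:
print({type(v).__name__ for v in version_info})
# -> {'int', 'str'}
```

My guess is that ES's dynamic mapping picks a type from the first value it sees ([long]) and then refuses the later string element ([text]), which would match the 'current_type [text], merged_type [long]' wording.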
I've read about dynamic templates and enabled a mapping from [long] to [string], like so:
{
"mappings": {
"_default_": {
"dynamic_templates": [
{
"rule1": {
"mapping": {
"type": "string"
},
"match_mapping_type": "long"
}
}
]
}
}
}
This works; I just don't understand why it is necessary.
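The only alternative I could think of was doing the same conversion client-side before POST-ing, i.e. turning every numeric leaf into a string so the dynamic mapping never sees a [long] at all. A rough sketch of that idea (my own code, not anything Ansible or ES provides):

```python
import json

def stringify_numbers(value):
    """Recursively convert numeric leaves to strings, mirroring the
    long -> string dynamic template on the client side."""
    if isinstance(value, bool):
        # bool is a subclass of int in Python; keep real booleans as-is
        return value
    if isinstance(value, (int, float)):
        return str(value)
    if isinstance(value, dict):
        return {k: stringify_numbers(v) for k, v in value.items()}
    if isinstance(value, list):
        return [stringify_numbers(v) for v in value]
    return value

facts = {"ansible_python": {"version_info": [2, 7, 5, "final", 0]}}
print(json.dumps(stringify_numbers(facts)))
# -> {"ansible_python": {"version_info": ["2", "7", "5", "final", "0"]}}
```

That also works, but it feels wrong to mangle the facts just to store them, which is why I went with the template.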
If I then add my 'custom facts' to the JSON, those need specific formatting too.
For example, I include the versions of installed packages in my custom facts; the JSON looks like this:
"ansible_local": {
"updates_facter": {
"Packages": {
"Installed": {
"acl": "2.2.52-3"
},
ES does not like that: it complains about the 'acl' key, saying it expected an Object, not a 'Concrete Value'.
If I rewrite that JSON to look like this, it works fine:
"ansible_local": {
"updates_facter": {
"Packages": {
"Installed": {
"acl": {
"Version": "2.2.52-3"
},
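Since I didn't want to hand-edit the fact files, I do that rewrite with a small helper that wraps every leaf value in an object before POST-ing. A sketch of it (the 'Version' key name is just my own choice, nothing ES requires):

```python
def wrap_leaves(tree, key="Version"):
    """Wrap every non-dict leaf in an object, so every field ES sees
    at the 'package name' level is an Object rather than a bare value."""
    if isinstance(tree, dict):
        return {k: wrap_leaves(v, key) for k, v in tree.items()}
    return {key: tree}

installed = {"acl": "2.2.52-3"}
print(wrap_leaves(installed))
# -> {'acl': {'Version': '2.2.52-3'}}
```

It works, but it doubles the nesting depth of my documents, which seems like a lot of ceremony just to store a version string.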
Can someone explain to me why this is so hard, why there seems to be an automatic assumption about what a JSON value is, and how I can work around it? Am I doing something wrong? Am I the only one trying to store Ansible facts in ES? Is this a dumb thing to do? Please share your views.