Elasticsearch mapping - cannot index a field having name starting with a dot

Hi,

I am trying to send some JSON data into Elasticsearch using Logstash.

My Logstash configuration (test.config) looks like this:

input{
	file{
		path => "/Users/bob/data/test.json"
		codec => json
		sincedb_path => "/dev/null"
		start_position => "beginning"
	}
}

filter{
	json{
		source => "student"
		target => "student"
	}

	mutate{
		convert => { 
			"name" => "string"
			"score" => "float"
			"address" => "string"
		}
	}
}

output{
	elasticsearch{
		hosts => "localhost:9200"
		index => "test"
		document_type => "student"
		manage_template => true
		template => "/Users/bob/data/index_templates/test_template.json"
		template_name => "test_template"
		template_overwrite => true
	}
	stdout { 
		codec => rubydebug 
	}
}

The mapping template being used here (test_template.json) is:

{
    "index_patterns": "test",
    "settings" : {
        "number_of_shards" : 1,
        "number_of_replicas" : 0,
        "index" : {
            "query" : { "default_field" : "@words" }
        }
    },
    "mappings": {
        "student": { 
            "_source": { "enabled": true },
            "dynamic_templates": [
                {
                    "string_template" : { 
                        "match" : "*",
                        "mapping": { "type": "keyword", "index": true },
                        "match_mapping_type" : "string"
                     } 
                 }
             ],
             "properties" : {
                "name": {"type":"keyword", "index": true},
                "score": {"type": "float"},
                "address": {"type":"keyword", "index": true},
                "lastUpdated":{"type": "date", "format": "epoch_millis"},
                "firstUpdated": {"type": "date", "format": "epoch_millis"},
                "official":{
                    "type": "nested",
                    "properties": {
                        "suid": {"type": "keyword", "index": true},
                        "uploader": {
                            "type": "nested",
                            "properties": {
                                "AGS": {"type": "date", "format": "epoch_millis"},
                                "AGM": {"type": "date", "format": "epoch_millis"}
                            }
                        },
                        "rank": {"type": "integer"}
                    }
                },
                "files":{
                	"type": "nested",
                	"properties": {
                		"../bob/filename1": {
                			"type": "nested",
                			"properties":{
                				"name": {"type":"keyword", "index": true},
                				"signed": {"type":"boolean"},
                				"failure": {"type":"keyword", "index": true},
                				"version": {"type":"keyword", "index": true},
                				"checksum": {"type":"keyword", "index": true},
                				"signer": {"type":"keyword", "index": true},
                			}
                		},
                		"../bob/filename2": {
                			"type": "nested",
                			"properties":{
                				"name": {"type":"keyword", "index": true},
                				"signed": {"type":"boolean"},
                				"failure": {"type":"keyword", "index": true},
                				"version": {"type":"keyword", "index": true},
                				"checksum": {"type":"keyword", "index": true},
                				"signer": {"type":"keyword", "index": true},
                			}
                		}
                	}
                }
            }
        }
    }
}

Here, the field names ../bob/filename1 & ../bob/filename2 start with dots (..). This causes trouble when indexing my JSON data into Elasticsearch. My JSON file is:

{"name":"Jonathan James","score":"9.9","address":"NewDelhi","lastUpdated":null,"firstUpdated":"86400","official":[{"suid":"c0c85dc9-e13c-41d1-88a0-6db76ded6a41","uploader":{"AGS":1544817662070,"AGM":1544817662070},"rank":1},{"suid":"c0c85dc9-e13c-41d1-88a0-6db76ded6a1f","uploader":{"AGS":1544817662070,"AGM":1544817662070},"rank":2}],"files":{"../bob/filename1":{"name":"filename1", "signed":true, "failure":null, "version":"13.0.0", "checksum":null, "signer":"Developer ID Application: ABCD"}, "../bob/filename2":{ "name":"filename2", "signed":false, "failure":"InvalidCodeSignature(-67061)", "version":"6.0.0.75", "checksum":null, "signer":"ABCD"}}}
{"name":"Sam Durram Singh","score":"8.9","address":"NewYork","lastUpdated":"1545078074640","firstUpdated":"86400","official":[{"suid":"c0c85dc9-d1d6-42bb-89b0-900e5f1e066d","uploader":{"AGS":1544817662070,"AGM":1544817662070},"rank":3},{"suid":"c0c85dc9-1d1e-4bf4-9596-e6c93d3d3dd0","uploader":{"AGS":1544817662070,"AGM":1544817662070},"rank":4}],"files":{"../bob/filename1":{"name":"filename1", "signed":true, "failureReason":null, "version":"13.0.0", "checksum":null, "signer":"Developer ID Application: ABCD"}, "../bob/filename2":{ "name":"filename2", "signed":false, "failure":"InvalidCodeSignature(-67061)", "version":"6.0.0.75", "checksum":null, "signer":"ABCD"}}}

I cannot change the field names in my JSON file. However, I am willing to rename the fields in Logstash before sending the data into Elasticsearch. How do I achieve this?

I even tried renaming the field in the logstash mutate filter:

mutate{
	rename => { "../bob/filename1" => "filename1" }
}

but this didn't work. I still get the same error:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test", :_type=>"student", :routing=>nil}, #<LogStash::Event:0x71b03f5a>], :response=>{"index"=>{"_index"=>"test", "_type"=>"student", "_id"=>"UGl_OWgB9ZDL-4_eKilu", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"object field starting or ending with a [.] makes object resolution ambiguous: [../bob/filename1]"}}}}}
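The error says Elasticsearch refuses any object field name that begins or ends with a dot. One possible workaround, sketched here and untested against this exact pipeline, is a Logstash ruby filter that strips the leading dots and slashes from every key under [files] before the event reaches the elasticsearch output. The regex is an assumption based on the sample keys ../bob/filename1 and ../bob/filename2:

```
filter{
	ruby{
		# Sketch only: rewrite keys under [files] so none begins with a dot.
		# Stripping leading "." and "/" characters is an assumption based on
		# the sample data; "../bob/filename1" would become "bob/filename1".
		code => '
			files = event.get("files")
			if files.is_a?(Hash)
				cleaned = {}
				files.each { |k, v| cleaned[k.sub(%r{\A[./]+}, "")] = v }
				event.set("files", cleaned)
			end
		'
	}
}
```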

Data parsing error.

Hi @zqc0512, I am aware that this error is because of the dots (..) in the field name, but is there any way I can keep the names as they are? Or should I rename the field in the Logstash mutate filter before passing it on to Elasticsearch?

Try the data type string, with a mapping.

Yes. You need to modify the field name.
I moved your question to #logstash where you can hopefully get more help on this.

Hi @zqc0512, I cannot change the type to string, as this is a nested field and can only have the nested type.

Hi @dadoonet, thank you. I tried modifying the field name in my Logstash config file:

mutate{
	rename => { "../bob/filename1" => "filename1" }
}

This did not work; I was still getting the same error. Am I doing something wrong while renaming the field?

Thank you guys, I worked it out. I was not renaming the field correctly. I should have done this:

mutate{ rename => { "[../bob/filename1]" => "[filename1]" } }
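For anyone landing here later: since these keys sit inside the files object in the sample documents, the bracket-style field reference would presumably need the full nested path. A sketch under that assumption (not a verified config):

```
filter{
	mutate{
		# [parent][child] is Logstash's field-reference syntax; the
		# full nested path under [files] is an assumption based on the
		# sample JSON above.
		rename => {
			"[files][../bob/filename1]" => "[files][filename1]"
			"[files][../bob/filename2]" => "[files][filename2]"
		}
	}
}
```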

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.