Mapper_parsing_exception: elasticsearch json mapping with text

Hi....

I am logging nginx access logs as JSON, including the request_body.

My pipeline is [server - logstash - logstash - elasticsearch - kibana].

The content of request_body is usually JSON with escape characters, but sometimes it is just plain text.

That is the problem..!!

My Logstash config works, but an error occurs whenever request_body is plain text.

How should I configure the Elasticsearch index template / mapping?

I want the index to accept both shapes, 1) and 2) below.

"request_body": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}

"request_body": {
"properties": {
"enveloped_message": {
"properties": {
"channel": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"app_version": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}, ....

Please help me ㅠㅠ

Thanks.

A single field in an index can only be mapped one way, so the request_body field cannot be mapped as both text and object in the same index. Either change the structure so the two cases are aligned, e.g. by moving the plain-text content into a sub-field under request_body, or store the two shapes in different indices.
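One way to implement the first suggestion is to normalize the event in Logstash before it reaches Elasticsearch, so that request_body is always an object. A sketch using the standard json and mutate filter plugins; the sub-field names [request_body][parsed] and [request_body][raw_text], and the temporary tag _rb_not_json, are my own choices, not anything required by Elasticsearch:

```
filter {
  # Move the raw string aside so request_body can become an object.
  # [@metadata] fields are never sent to the output.
  mutate {
    rename => { "request_body" => "[@metadata][rb_raw]" }
  }

  # Try to parse it as JSON into request_body.parsed.
  json {
    source         => "[@metadata][rb_raw]"
    target         => "[request_body][parsed]"
    tag_on_failure => ["_rb_not_json"]
  }

  # If parsing failed, keep the original string under request_body.raw_text.
  if "_rb_not_json" in [tags] {
    mutate {
      add_field  => { "[request_body][raw_text]" => "%{[@metadata][rb_raw]}" }
      remove_tag => ["_rb_not_json"]
    }
  }
}
```

With this in place, the index template only ever sees one shape: request_body is always an object, with either a parsed sub-object or a raw_text string (which can be mapped as text with a keyword multi-field, as in mapping 1 above).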

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.