Default mapping location

Hello,

Does anyone have experience importing a MySQL database into Elasticsearch through Logstash? I have an issue with the default mapping that Logstash provides. It always converts dates to strings and integers to longs. For example, I have a table containing employee data with employee_id, createDate, and other attributes. When I import it, employee_id is automatically converted to long and createDate to string. Is there any way to use a different mapping for the import, or to change the default mapping to fit my needs? Also, where is the file for the default mapping?

Thank you.

You should really define your own mapping before indexing this sort of data.
The default template is primarily for logging due to the history of the app.
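As a sketch of that approach (the index name employees, the type name, and the field names here are assumptions for illustration; the syntax targets the 1.x/2.x API used elsewhere in this thread): create the index with an explicit mapping before Logstash writes to it, so Elasticsearch never has to guess the types.

curl -XPUT 'http://localhost:9200/employees' -d '{
  "mappings": {
    "employee": {
      "properties": {
        "employee_id": { "type": "integer" },
        "createDate":  { "type": "date" }
      }
    }
  }
}'

Alternatively, the Logstash elasticsearch output can load a template of your own: point the template option at a template file, give it a template_name, and it will be installed instead of the default logging-oriented one.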

Thank you for your response. I tried that as well. I manually created the mapping as follows:
{
  "settings": {
    "index": {
      "analysis": {
        "char_filter": {
          "dot_pattern": {
            "type": "pattern_replace",
            "pattern": "\\.",
            "replacement": " "
          }
        },
        "analyzer": {
          "dot_analyzer": {
            "tokenizer": "standard",
            "char_filter": [
              "dot_pattern"
            ]
          }
        }
      }
    }
  },
  "mappings": {
    "employees": {
      "properties": {
        "index_id": {
          "type": "string",
          "_index": {
            "enabled": false
          },
          "store": true
        },
        "report_id": {
          "type": "integer",
          "index": "not_analyzed",
          "store": true
        },
        "title": {
          "type": "string",
          "index": "analyzed",
          "store": true
        },
        "Comment": {
          "type": "string",
          "index": "analyzed",
          "store": true
        },
        "Author": {
          "type": "string",
          "index": "analyzed",
          "store": true,
          "analyzer": "dot_analyzer"
        },
        "sqlDate": {
          "type": "date",
          "index": "not_analyzed",
          "store": true
        },
        "category": {
          "type": "string",
          "index": "analyzed",
          "store": true
        },
        "report_type": {
          "type": "string",
          "index": "not_analyzed",
          "store": true
        },
        "report_subtype": {
          "type": "string",
          "index": "not_analyzed",
          "store": true
        },
        "Active": {
          "type": "boolean",
          "index": "not_analyzed",
          "store": true
        },
        "InactiveBy": {
          "type": "string",
          "index": "analyzed",
          "store": true,
          "analyzer": "dot_analyzer"
        },
        "InactiveDate": {
          "type": "date",
          "index": "not_analyzed",
          "store": true
        }
      }
    }
  }
}
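(The dot_analyzer is meant to split dotted usernames such as Author values; its pattern_replace char_filter simply replaces every literal dot with a space before tokenizing, equivalent in shell terms to:)

# pattern_replace with pattern "\\." and replacement " " turns every
# literal "." into a space, so "John.Doe" tokenizes as "John" and "Doe".
echo 'John.Doe' | sed 's/\./ /g'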

However, Logstash failed to import with a 400 error. Do you have an example of how to make this work (or the steps you took)?

Thank you

However, Logstash failed to import with 400 error.

More details, please. If there's nothing more in the Logstash log please consult the Elasticsearch log.
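One way to surface more detail (a sketch; the index name employees and the file name mapping.json are assumptions): replay the index creation by hand with curl. Elasticsearch returns the reason for a 400 in the HTTP response body even when little reaches the logs.

# Hypothetical reproduction: delete the index, then re-create it with the
# same settings/mappings JSON Logstash would have used. Any mapping error
# is printed in the PUT response.
curl -XDELETE 'http://localhost:9200/employees'
curl -XPUT 'http://localhost:9200/employees' -d @mapping.json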

I am sorry; I have checked both the Logstash and Elasticsearch logs. They didn't show anything useful. Do you have any suggestions about where to look for more information?

From the failed action with a response of 400, there is this line:

<LogStash::Event:0x3ff4b34d @metadata_accessors=#<LogStash::Util::Accessors:0x23053ab4 @store={"retry_count"=>0}, @lut={}>, @cancelled=false,

Does that mean anything?