Hello.
I would like to ask a question about creating an index in Elasticsearch.
My Question
Why is Elasticsearch rejecting the indexing for a field which has type: long?
What am I trying to do?
I am indexing the Linux syslog into Elasticsearch using grok in Logstash.
Below is my grok definition.
if [type] == "syslog" {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
  }
  syslog_pri { }
  date {
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}
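For reference, a typical syslog line that this pattern is meant to parse looks like the following (hostname and message are made up for illustration):
Jul  5 17:12:03 myhost sshd[1234]: Failed password for invalid user admin from 192.168.1.10
From such a line grok should extract syslog_timestamp ("Jul  5 17:12:03"), syslog_hostname ("myhost"), syslog_program ("sshd"), syslog_pid ("1234") and syslog_message (the rest of the line), and the syslog_pri filter is what adds the syslog_severity_code / syslog_facility_code fields that appear in the template below.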
What is happening?
Elasticsearch is rejecting the index creation as shown below.
[2016-07-05 17:12:03,217][DEBUG][action.admin.indices.create] [node-1] [mksyslog-2016.07.05] failed to create
MapperParsingException[Failed to parse mapping [syslog_severity_code]: Root mapping definition has unsupported parameters: [type : long]]; nested: MapperParsingException[Root mapping definition has unsupported parameters: [type : long]];
I use an index template so that the index is created automatically.
Below is the whole PUT request I used to create the template.
syslog_severity_code is defined there with type long.
PUT /_template/syslog_template
{
  "template": "*syslog*",
  "settings": {
    "index": {
      "number_of_shards": "5",
      "number_of_replicas": "1"
    }
  },
  "mappings": {
    "tags": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_severity_code": {
      "type": "long"
    },
    "syslog_facility_code": {
      "type": "long"
    },
    "host": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_message": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_program": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_facility": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_hostname": {
      "index": "not_analyzed",
      "type": "string"
    },
    "@version": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_pid": {
      "index": "not_analyzed",
      "type": "string"
    },
    "message": {
      "type": "string"
    },
    "path": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_severity": {
      "index": "not_analyzed",
      "type": "string"
    },
    "syslog_timestamp": {
      "index": "not_analyzed",
      "type": "string"
    },
    "@timestamp": {
      "format": "strict_date_optional_time||epoch_millis",
      "type": "date"
    },
    "type": {
      "index": "not_analyzed",
      "type": "string"
    },
    "received_from": {
      "index": "not_analyzed",
      "type": "string"
    },
    "received_at": {
      "format": "strict_date_optional_time||epoch_millis",
      "type": "date"
    }
  }
}
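If it helps with the diagnosis: I believe the template itself is the trigger, because (as far as I understand index templates) the same error should be reproducible by manually creating any index whose name matches the template pattern, for example:
PUT /mksyslog-test
The stored template can also be inspected with:
GET /_template/syslog_template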
My platform environment is below.
OS: RHEL 7.2 64-bit
Elasticsearch: 2.3.3
Logstash: 2.3.3
Any help is appreciated!