MapperParsingException[Field name [@source.host] cannot contain '.']

I'm running the ELK stack on Red Hat 6 and I'm getting the above error.

Here are some details:

./elasticsearch --version
Version: 2.2.1, Build: d045fc2/2016-03-09T09:38:54Z, JVM: 1.7.0_95
./logstash --version
logstash 2.2.2

Here is the snippet of the elasticsearch log:

2016-03-30 19:40:14,050][DEBUG][action.bulk ] [elasticsearch-1] [logstash-2016.03.30][1] failed to execute bulk item (index) index {[logstash-2016.03.30][syslog][AVPJ5htfKKsX3MOvcR2a], source[{"message":"Mar 30 19:40:12 r009tqn0c kernel: BXP-FWDEV-TCP-FIN IN=eth0 OUT= MAC=00:50:56:93:56:a7:40:55:39:03:c5:c2:08:00 SRC=10.78.142.32 DST=10.78.197.215 LEN=52 TOS=0x00 PREC=0x00 TTL=63 ID=50910 DF PROTO=TCP SPT=1389 DPT=37293 WINDOW=54 RES=0x00 ACK FIN URGP=0 ","@version":"1","@timestamp":"2016-03-30T23:40:13.402Z","path":"/var/log/messages","type":"syslog","@source.host":"r009tqn0c","@source.path":"%{file}","@source.offset":"%{offset}"}]}
MapperParsingException[Field name [@source.host] cannot contain '.']
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:276)

Here is a simple Logstash config file definition:
input {
  file {
    path => "/var/log/messages"
    type => "syslog"
  }
}

output {
  elasticsearch { hosts => ["127.0.0.1:9200"] }
  stdout { codec => rubydebug }
}
Let me know if there is a patch or a workaround for this problem.

Thanks

This doesn't make sense. Are you sure you're not running with any extra configuration files containing filters? There is nothing in the configuration you've shown here that adds any @source.XXX fields.
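For instance, a leftover filter file along these lines (purely hypothetical, not taken from your posted configuration) would add exactly those fields, so it may be worth checking every file Logstash is loading (e.g. everything under /etc/logstash/conf.d on a package install):

filter {
  mutate {
    # Hypothetical example: a stray filter like this would create the dotted
    # @source.* fields seen in the failed bulk request above.
    add_field => {
      "@source.host"   => "%{host}"
      "@source.path"   => "%{file}"
      "@source.offset" => "%{offset}"
    }
  }
}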

If you are using a recent Elasticsearch version (2.x or greater), field names cannot contain a '.' (e.g. source.path); use an underscore instead (e.g. source_path).

I hope it will resolve your problem.
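If you can't remove whatever is adding the dotted fields, a mutate rename in the Logstash filter section can map them to underscore-separated names before the events reach Elasticsearch. This is only a sketch; the field names are assumed from the error message above:

filter {
  mutate {
    # Rename dotted field names to underscore-separated equivalents
    # so Elasticsearch 2.x will accept the documents.
    rename => {
      "@source.host"   => "source_host"
      "@source.path"   => "source_path"
      "@source.offset" => "source_offset"
    }
  }
}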

I'm currently testing Filebeat but will retest the file input again later in the week.
My suspicion is that when I upgraded my development VM, the new version of the yml file (without the filter section) never got applied, which is why the error showed up again.

Thanks for the response!

Thanks Raj!