This started as a filebeat issue, but I think it's now a matter of the elasticsearch index mapping.
I'm seeing repeated messages like this in our logging. I know it's related to the nginx filebeat module, but I'm unsure how to go about fixing it (custom mapping, enable dynamic mapping, edit the module, something else?). This part of the error seems to be key:
{"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version] of type [date] in document
full log entry:
Mar 6 07:22:27 rgo032 filebeat[4914]: 2023-03-06T07:22:27.103-0800#011WARN#011[elasticsearch]#011elasticsearch/client.go:414#011Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Date(2023, time.March, 6, 7, 22, 26, 929462784, time.Local), Meta:{"pipeline":"filebeat-7.17.8-nginx-access-pipeline"}, Fields:{"agent":{"ephemeral_id":"2be6b958-12f3-4e22-ad23-d5a9021edd15","hostname":"rgo032","id":"3b97f44c-65e6-4d28-ab3f-a4e80ff361fb","name":"rgo032","type":"filebeat","version":"7.17.8"},"ecs":{"version":"1.12.0"},"event":{"dataset":"nginx.access","module":"nginx","timezone":"-08:00"},"fileset":{"name":"access"},"host":{"name":"rgo032"},"input":{"type":"log"},"log":{"fi
le":{"path":"/var/log/nginx/access.log"},"offset":2643238},"message":"192.168.1.15 - stevans [06/Mar/2023:07:22:26 -0800] \"PROPFIND /remote.php/dav/files/stevans/ HTTP/1.1\" 207 331 \"-\" \"Mozilla/5.0 (Macintosh) mirall/2.10.0 (build 6519) (ownCloud, osx-21.1.0 ClientArchitecture: x86_64 OsArchitecture: x86_64)\"","service":{"type":"nginx"}}, Private:file.State{Id:"native::12845627-64769", PrevId:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000a14a90), Source:"/var/log/nginx/access.log", Offset:2643473, Timestamp:time.Date(2023, time.March, 6, 0, 0, 3, 582773674, time.Local), TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0xc4023b, Device:0xfd01}, IdentifierName:"native"}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version] of type [date] in document with id 'GmmEt4YBeNusebpST5KS'. Preview of field's value: '2.10.0'","caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [2.10.0] with format [strict_date_optional_time||epoch_millis]","caused_by":{"type":"date_time_parse_exception","reason":"Failed to parse with all enclosed parsers"}}}, dropping event!
/var/log/nginx/access.log:
192.168.1.15 - stevans [06/Mar/2023:07:21:24 -0800] "PROPFIND /remote.php/dav/files/stevans/ HTTP/1.1" 207 331 "-" "Mozilla/5.0 (Macintosh) mirall/2.10.0 (build 6519) (ownCloud, osx-21.1.0 ClientArchitecture: x86_64 OsArchitecture: x86_64)"
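I'm assuming the '2.10.0' value comes from the ingest pipeline's user_agent processor picking up the "mirall/2.10.0" part of that line. I think it can be reproduced by simulating the pipeline named in the warning (just a sketch; I'm not sure whether the pipeline needs more fields than message and event.timezone to run):

POST _ingest/pipeline/filebeat-7.17.8-nginx-access-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "event": { "timezone": "-08:00" },
        "message": "192.168.1.15 - stevans [06/Mar/2023:07:21:24 -0800] \"PROPFIND /remote.php/dav/files/stevans/ HTTP/1.1\" 207 331 \"-\" \"Mozilla/5.0 (Macintosh) mirall/2.10.0 (build 6519) (ownCloud, osx-21.1.0 ClientArchitecture: x86_64 OsArchitecture: x86_64)\""
      }
    }
  ]
}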
I was able to do some more investigation here and found a difference between our two indices:
GET filebeat-7.17.8/_mapping/field/user_agent.version
...
"filebeat-7.17.8" : {
  "mappings" : {
    "user_agent.version" : {
      "full_name" : "user_agent.version",
      "mapping" : {
        "version" : {
          "type" : "date"
...
GET filebeat-7.17.9/_mapping/field/user_agent.version
...
"filebeat-7.17.9" : {
  "mappings" : {
    "user_agent.version" : {
      "full_name" : "user_agent.version",
      "mapping" : {
        "version" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
...
If I understand this correctly, the filebeat-7.17.8 index is the incorrect one, since "type" : "date" doesn't seem right compared to the text/keyword mapping in the filebeat-7.17.9 index. Did this happen because of a few funky log entries?
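One thing I haven't checked yet is whether user_agent.version is defined in the filebeat index template at all, or whether it was dynamically mapped from whatever document hit the 7.17.8 index first. I'm guessing one of these would show it (not sure whether filebeat 7.17 loads a legacy or a composable template):

GET _template/filebeat-7.17.8

GET _index_template/filebeat-7.17.8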
Is the only way to fix this to reindex or drop the index?
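If reindexing is the answer, I assume the rough shape would be something like this (a sketch only; filebeat-7.17.8-restored is a name I made up, and presumably the destination should really be created from the full filebeat template rather than just this one field):

PUT filebeat-7.17.8-restored
{
  "mappings": {
    "properties": {
      "user_agent": {
        "properties": {
          "version": {
            "type": "text",
            "fields": {
              "keyword": { "type": "keyword", "ignore_above": 256 }
            }
          }
        }
      }
    }
  }
}

POST _reindex
{
  "source": { "index": "filebeat-7.17.8" },
  "dest": { "index": "filebeat-7.17.8-restored" }
}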