Can't get filebeat 7.x syslogs into elasticsearch

Hi,

I'm wondering if anyone else has had this problem.

I am in the process of upgrading filebeat on our CentOS 7 servers. They are running filebeat version 6.8 and our ELK stack is 7.10.1. We are planning to upgrade to filebeat 7.11.1, and I have read the docs and followed the recommended procedure.

From what I can tell, there is some issue with the new version of filebeat pushing syslog events into our Elastic Cloud deployment specifically: I built a local elasticsearch and kibana server of the same version as our cloud, and filebeat 7.11.1 publishes events there successfully.

When I try to push to our Elastic Cloud deployment as we have been doing with filebeat 6.8, the issue seems to be specifically with the filebeat-7.11.1 pipeline, as shown below.

2021-02-28T20:53:56.746+1000 WARN [elasticsearch] elasticsearch/client.go:408 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xc006fb6ea0062c0c, ext:40129156929, loc:(*time.Location)(0x67397c0)}, Meta:{"pipeline":"filebeat-7.11.1-system-syslog-pipeline"}, Fields:{"agent":{"ephemeral_id":"xxxxxxxx","hostname":"xxxxxxxxxxxx","id":"12f1469f-e042-4173-b469-8eb472f52346","name":"xxxxxxxxx","type":"filebeat","version":"7.11.1"},"ecs":{"version":"1.7.0"},"env":"dev","event":{"dataset":"system.syslog","module":"system","timezone":"+10:00"},"fileset":{"name":"syslog"},"host":{"name":"xxxxxxxxxxxxxx"},"input":{"type":"log"},"log":{"file":{"path":"/var/log/messages"},"offset":124248},"message":"Feb 28 20:53:39 xxxxxxxxxxxx test","service":{"type":"system"}}, Private:file.State{Id:"native::17380476-64768", PrevId:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000375860), Source:"/var/log/messages", Offset:124301, Timestamp:time.Time{wall:0xc006fb6e9fdb2da2, ext:40126339288, loc:(*time.Location)(0x67397c0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x109347c, Device:0xfd00}, IdentifierName:"native"}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Cannot write to a field alias [host.hostname]."}}
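(For anyone hitting this: the "Cannot write to a field alias [host.hostname]" error might mean the write index's mapping defines host.hostname as an alias field, which is read-only; the 6.x filebeat templates' ECS migration aliases can do this. A small sketch for scanning a saved GET filebeat-*/_mapping response for alias fields; the mapping excerpt below is a hypothetical example, not taken from this cluster.)

```python
import json

def find_alias_fields(properties, prefix=""):
    """Recursively collect dotted field names mapped as type 'alias'."""
    aliases = []
    for name, spec in properties.items():
        path = f"{prefix}{name}"
        if spec.get("type") == "alias":
            aliases.append((path, spec.get("path")))
        if "properties" in spec:
            aliases.extend(find_alias_fields(spec["properties"], path + "."))
    return aliases

# Hypothetical excerpt shaped like a 6.x-migration mapping in which
# host.hostname is an alias pointing at host.name.
mapping = {
    "properties": {
        "host": {
            "properties": {
                "hostname": {"type": "alias", "path": "host.name"},
                "name": {"type": "keyword"},
            }
        }
    }
}

print(find_alias_fields(mapping["properties"]))
# → [('host.hostname', 'host.name')]
```

Running this over the real mapping (e.g. `json.load(open("mapping.json"))` on the saved response, drilling into each index's `mappings.properties`) would show whether host.hostname really is an alias there.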

I can see there's a whole bunch of configuration for this, but from what I can tell this is all automatically set up by filebeat. In any case, when it connects to my test env, it sets up all the same indices, templates and pipelines as in our cloud setup.

I am able to add and see other hosts' syslog events in our Elastic Cloud deployment, but only when using version 6.8 of filebeat, not 7.11.1. I have also tried filebeat 7.10.1 but saw the same error.

Any help would be appreciated.

Thanks,

I bet you're using an old pipeline template. Did you try switching to the latest one?

Hi,

Filebeat 7.11 created the pipeline when it started, from what I can tell, and it's filebeat-7.11.1-system-syslog-pipeline.

It also created all the indices and index templates and they're also 7.11.1.

It looks like it created the 7.11.1 ILM policy also.

thanks,

Debug output shows that, as far as I can see, the required fields are defined, especially event.timezone, as that is one of the fields the log says it can't write to:
"@timestamp": "2021-03-03T22:08:01.602Z", "@metadata": { "beat": "filebeat", "type": "_doc", "version": "7.11.1", "pipeline": "filebeat-7.11.1-system-syslog-pipeline" }, "fileset": { "name": "syslog" }, "input": { "type": "log" }, "event": { "timezone": "+10:00", "module": "system", "dataset": "system.syslog" },
And this is the error in the filebeat log saying it can't write to that field:

2021-03-04T14:50:43.601+1000 WARN [elasticsearch] elasticsearch/client.go:408 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xc00837a88c25acaf, ext:7061226175330, loc:(*time.Location)(0x67397c0)}, Meta:{"pipeline":"filebeat-7.11.1-system-syslog-pipeline"}, Fields:{"agent":{"ephemeral_id":"cf529b11-1d85-4c36-b0f1-443e68085afc","hostname":"dev-ash-2centos7-9","id":"12f1469f-e042-4173-b469-8eb472f52346","name":"dev-ash-2centos7-9","type":"filebeat","version":"7.11.1"},"ecs":{"version":"1.7.0"},"env":"dev","event":{"dataset":"system.syslog","module":"system","timezone":"+10:00"},"fileset":{"name":"syslog"},"host":{"name":"dev-ash-2centos7-9"},"input":{"type":"log"},"log":{"file":{"path":"/var/log/messages"},"offset":559778},"message":"Mar 4 14:50:33 dev-ash-2centos7-9 dbus[646]: [system] Successfully activated service 'org.freedesktop.nm_dispatcher'","service":{"type":"system"}}, Private:file.State{Id:"native::17380476-64768", PrevId:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000391790), Source:"/var/log/messages", Offset:559896, Timestamp:time.Time{wall:0xc00837a88c0f901c, ext:7061224726221, loc:(*time.Location)(0x67397c0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x109347c, Device:0xfd00}, IdentifierName:"native"}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Cannot write to a field alias [event.timezone]."}}
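(Side note: when grepping the filebeat log for more of these, a quick sketch like the following pulls the offending alias name out of each "Cannot write to a field alias" error line, which makes it easy to see whether it's always the same fields; the sample line is trimmed from the log above.)

```python
import re

def offending_alias(log_line):
    """Extract the field name from a 'Cannot write to a field alias' error."""
    m = re.search(r"Cannot write to a field alias \[([^\]]+)\]", log_line)
    return m.group(1) if m else None

line = '... "reason":"Cannot write to a field alias [event.timezone]."}}'
print(offending_alias(line))
# → event.timezone
```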

I found some info on configuring filebeat with timezone/location info, but it seems quite limited. I have tried all the possible configs in case that was the issue, even though from what I can tell the timezone field looks OK. Refer to Add the local time zone | Filebeat Reference [7.11] | Elastic

processors:
  - add_locale:
      format: abbreviation
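(For reference, per that doc page add_locale defaults to the offset format, which matches the "+10:00" values in the events above; abbreviation is the alternative. A sketch of both variants:)

```yaml
processors:
  # default format is "offset", producing values like "+10:00"
  - add_locale: ~

  # or, for abbreviations like "AEST":
  # - add_locale:
  #     format: abbreviation
```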

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.