Auto index creation doesn't seem to work

Environment: CentOS 8 x64, with firewalld and SELinux disabled
Filebeat 7.7.0
Elasticsearch 7.7.0

I just enabled the X-Pack security feature in Elasticsearch, and now I cannot see any logs in ES from Filebeat. Before X-Pack was enabled it worked perfectly.
Here is the error shown by service filebeat status:

ā— filebeat.service - Filebeat sends log files to Logstash or directly to Elasticsearch.
   Loaded: loaded (/usr/lib/systemd/system/filebeat.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 15:44:13 CST; 2min 47s ago
     Docs: https://www.elastic.co/products/beats/filebeat
 Main PID: 10158 (filebeat)
    Tasks: 8 (limit: 49646)
   Memory: 14.4M
   CGroup: /system.slice/filebeat.service
           └─10158 /usr/share/filebeat/bin/filebeat -environment systemd -c /etc/filebeat/filebeat.yml -path.home /usr/share/filebeat -path.config /etc/filebeat -path.data /var/lib/filebeat -path.logs /var/log/filebeat

May 26 15:46:19 ssl filebeat[10158]: 2020-05-26T15:46:19.728+0800        WARN        [elasticsearch]        elasticsearch/client.go:384        Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbfab5032aa9ca160, ext:125053593101, loc:(*time.Location)(0x594e5e0)}, Meta:{"pipeline":"filebeat-7.7.0-wazuh-alerts-pipeline"}, Fields:{"agent":{"ephemeral_id":"a6150cd6-ea64-40d7-83d5-0acc5f514bf6","hostname":"ssl","id":"6544352f-ef82-4777-87b9-66a84ec2d384","type":"filebeat","version":"7.7.0"},"ecs":{"version":"1.5.0"},"event":{"dataset":"wazuh.alerts","module":"wazuh"},"fields":{"index_prefix":"wazuh-alerts-3.x-"},"fileset":{"name":"alerts"},"host":{"name":"ssl"},"input":{"type":"log"},"log":{"file":{"path":"/var/ossec/logs/alerts/alerts.json"},"offset":347985746},"message":"{\"timestamp\":\"2020-05-26T15:46:15.216+0800\",\"rule\":{\"level\":3,\"description\":\"Audit: Command: /bin/sleep\",\"id\":\"80792\",\"firedtimes\":215,\"mail\":false,\"groups\":[\"audit\",\"audit_command\"],\"gdpr\":[\"IV_30.1.g\"]},\"agent\":{\"id\":\"003\",\"name\":\"device\",\"ip\":\"192.168.2.159\"},\"manager\":{\"name\":\"ssl\"},\"id\":\"1590479175.217294836\",\"full_log\":\"type=SYSCALL msg=audit(1590479177.489:87986): arch=c000003e syscall=59 success=yes exit=0 a0=55826c7e1280 a1=55826c7e1bd0 a2=55826c7df750 a3=8 items=2 ppid=9516 pid=2174 auid=1006 uid=1006 gid=1002 euid=1006..........{\"type\":\"SYSCALL\",\"id\":\"87994\",\"arch\":\"c000003e\",\"syscall\":\"59\",\"success\":\"yes\",\"exit\":\"0\",\"ppid\":\"18289\",\"pid\":\"2243\",\"auid\":\"1006\",\"uid\":\"1006\",\"gid\":\"1002\",\"euid\":\"1006\",\"suid\":\"1006\",\"fsuid\":\"1006\",\"egid\":\"1002\",\"sgid\":\"1002\",\"fsgid\":\"1002\",\"tty\":\"(none)\",\"session\":\"116\",\"command\":\"sleep\",\"exe\":\"/bin/sleep\",\"key\":\"audit-wazuh-c\",\"execve\":{\"a0\":\"sleep\",\"a1\":\"120\"},\"cwd\":\"/tank1/devnet\",\"file\":{\"name\":\"/bin/sleep\",\"inode\":\"5111893\",\"mode\":\"0100755\"}}},\"location\":\"/var/log/audit/audit.log\"}","service":{"type":"wazuh"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc0002061a0), Source:"/var/ossec/logs/alerts/alerts.json", Offset:348002992, Timestamp:time.Time{wall:0xbfab5015e934d39c, ext:10030013101, loc:(*time.Location)(0x594e5e0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x40a9162, Device:0xfd00}}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=404): {"type":"index_not_found_exception","reason":"no such index [<wazuh-alerts-3.x-{2020.05.26||/d{yyyy.MM.dd|UTC}}>] and [action.auto_create_index] ([.monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*,wazuh-alerts-3.x-*,wazuh-monitoring-3.x-*]) doesn't match","index_uuid":"_na_","index":"<wazuh-alerts-3.x-{2020.05.26||/d{yyyy.MM.dd|UTC}}>"}

It seems like the index wasn't auto-created, but I have already added the auto index creation setting to the ES config file. The Elasticsearch configuration file looks like this:

[root@ssl alerts]# cat /etc/elasticsearch/elasticsearch.yml | egrep -v "^#|^$"
cluster.name: wazuh-clusteres
node.name: wazuh-ssl
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
network.host: 192.168.40.243
cluster.initial_master_nodes: ["wazuh-ssl"]
xpack.security.enabled: true
action.auto_create_index: .monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*,wazuh-alerts-3.x-*,wazuh-monitoring-3.x-*
xpack.security.audit.enabled: true
xpack.monitoring.enabled: true
xpack.monitoring.collection.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.key: /etc/elasticsearch/certs/elasticsearch.key
xpack.security.transport.ssl.certificate: /etc/elasticsearch/certs/elasticsearch.crt
xpack.security.transport.ssl.certificate_authorities: [ "/etc/elasticsearch/certs/ca/ca.crt" ]
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.verification_mode: certificate
xpack.security.http.ssl.key: /etc/elasticsearch/certs/elasticsearch.key
xpack.security.http.ssl.certificate: /etc/elasticsearch/certs/elasticsearch.crt
xpack.security.http.ssl.certificate_authorities: [ "/etc/elasticsearch/certs/ca/ca.crt" ]
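
To double-check that this setting is really the one Elasticsearch is using, the cluster settings API can be queried; a quick sketch, assuming the elastic password is exported in a placeholder variable $ES_PASS and using the CA path from the config above:

# $ES_PASS is a placeholder for the hidden elastic password
[root@ssl alerts]# curl -s --cacert /etc/elasticsearch/certs/ca/ca.crt -u elastic:$ES_PASS \
  "https://192.168.40.243:9200/_cluster/settings?include_defaults=true&filter_path=*.action.auto_create_index"

A value coming from elasticsearch.yml should show up under "defaults", while anything set through the settings API shows up under "persistent" or "transient".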

The Filebeat configuration file:

[root@ssl alerts]# cat /etc/filebeat/filebeat.yml | egrep -v "^#|^$"
filebeat.modules:
  - module: wazuh
    alerts:
      enabled: true
    archives:
      enabled: false
setup.template.json.enabled: true
setup.template.json.path: '/etc/filebeat/wazuh-template.json'
setup.template.json.name: 'wazuh'
setup.template.overwrite: true
setup.ilm.enabled: false
output.elasticsearch:
  hosts: ["https://192.168.40.243:9200"]
  username: "elastic"
  password: ####hidden here#####
  ssl.certificate: "/etc/filebeat/certs/wazuh-manager.crt"
  ssl.key: "/etc/filebeat/certs/wazuh-manager.key"
  ssl.certificate_authorities: ["/etc/filebeat/certs/ca/ca.crt"]
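
As a side check that the wazuh template was actually loaded with the expected index patterns, the _cat templates API can be queried; a sketch, again with the elastic password in the placeholder $ES_PASS and the CA from the Filebeat config:

# $ES_PASS stands in for the hidden elastic password
[root@ssl alerts]# curl -s --cacert /etc/filebeat/certs/ca/ca.crt -u elastic:$ES_PASS \
  "https://192.168.40.243:9200/_cat/templates/wazuh?v&h=name,index_patterns,order"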

And here is filebeat test config -e; the test output suggests everything is fine:

[root@ssl alerts]# filebeat test config -e
..
2020-05-26T16:40:03.429+0800    INFO    beater/filebeat.go:92   Enabled modules/filesets: wazuh (alerts),  ()
Config OK
[root@ssl alerts]# filebeat test output
elasticsearch: https://192.168.40.243:9200...
  parse url... OK
  connection...
    parse host... OK
    dns lookup... OK
    addresses: 192.168.40.243
    dial up... OK
  TLS...
    security: server's certificate chain verification is enabled
    handshake... OK
    TLS version: TLSv1.3
    dial up... OK
  talk to server... OK
  version: 7.7.0
[root@ssl alerts]#
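
Since the connection itself is fine, another way to isolate Filebeat from Elasticsearch is to try indexing a test document straight into the same kind of date-math index name Filebeat uses, and see whether ES auto-creates it. A rough sketch (the < > { } | / characters in the name must be URL-encoded; $ES_PASS is a placeholder for the hidden elastic password):

# the URL-encoded equivalent of <wazuh-alerts-3.x-{now/d{yyyy.MM.dd|UTC}}>
[root@ssl alerts]# curl -s --cacert /etc/filebeat/certs/ca/ca.crt -u elastic:$ES_PASS \
  -H 'Content-Type: application/json' \
  -X POST "https://192.168.40.243:9200/%3Cwazuh-alerts-3.x-%7Bnow%2Fd%7Byyyy.MM.dd%7CUTC%7D%7D%3E/_doc" \
  -d '{"message":"auto-create test"}'

If this returns the same index_not_found_exception, the problem is on the Elasticsearch side (the auto_create_index check) rather than in Filebeat.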

What do the Filebeat logs show?

Thanks for the reply. The Filebeat log in /var/log/filebeat seems normal:

[root@ssl filebeat]# cat filebeat
2020-05-26T16:40:08.471+0800    INFO    instance/beat.go:621    Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2020-05-26T16:40:08.471+0800    INFO    instance/beat.go:629    Beat ID: 6544352f-ef82-4777-87b9-66a84ec2d384
2020-05-26T16:40:08.472+0800    INFO    [index-management]      idxmgmt/std.go:182      Set output.elasticsearch.index to 'filebeat-7.7.0' as ILM is enabled.
2020-05-26T16:40:08.473+0800    INFO    eslegclient/connection.go:84    elasticsearch url: https://192.168.40.243:9200
2020-05-26T16:40:08.527+0800    INFO    [esclientleg]   eslegclient/connection.go:263   Attempting to connect to Elasticsearch version 7.7.0
2020-05-26T16:40:08.529+0800    INFO    [license]       licenser/es_callback.go:51      Elasticsearch license: Platinum
[root@ssl filebeat]# cat filebeat.7

What I think may be useful here is the service filebeat status output; the key log line is:

May 27 14:01:46 ssl filebeat[7490]: 2020-05-27T14:01:46.971+0800        WARN        [elasticsearch]        elasticsearch/client.go:384        Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbfab9e7279b4db2c, ext:95069650001, loc:(*time.Location)(0x594e5e0)}, Meta:{"pipeline":"filebeat-7.7.0-wazuh-alerts-pipeline"}, Fields:{"agent":{"ephemeral_id":"633f72cd-92f6-47ab-8209-eb4ebf6fceda","hostname":"ssl","id":"36a1c9a1-e768-428a-957d-e9d12340ae32","type":"filebeat","version":"7.7.0"},"ecs":{"version":"1.5.0"},"event":{"dataset":"wazuh.alerts","module":"wazuh"},"fields":{"index_prefix":"wazuh-alerts-3.x-"},"fileset":{"name":"alerts"},"host":{"name":"ssl"},"input":{"type":"log"},"log":{"file":{"path":"/var/ossec/logs/alerts/alerts.json"},"offset":1454136},"message":"{\"timestamp\":\"2020-05-27T14:01:45.381+0800\",\"rule\":{\"level\":3,\"description\":\"Audit: Command: /bin/sleep\",\"id\":\"80792\",\"firedtimes\":24,\"mail\":false,\"groups\":[\"audit\",\"audit_command\"],\"gdpr\":[\"IV_30.1.g\"]},\"agent\":{\"id\":\"003\",\"name\":\"device\",\"ip\":\"192.168.2.159\"},\"manager\":{\"name\":\"ssl\"},\"id\":\"1590559305.1309931\",\"full_log\":\"type=SYSCALL msg=audit(1590559308.062:121371): arch=c000003e syscall=59 success=yes exit=0 a0=55f3b1df52a0 a1=55f3b1df5c50 a2=55f3b1df3880 a3=8 items=2 ppid=5452 pid=12011 auid=1007 uid=1007 gid=1002 euid=1007 suid=1007 fsuid=1007 egid=1002 sgid=1002 fsgid=1002 tty=(none) ses=4337 comm=\\\"sleep\\\" exe=\\\"/bin/sleep\\\" key=\\\"audit-wazuh-c\\\" type=EXECVE msg=audit(1590559308.062:121371): argc=2 a0=\\\"sleep\\\" a1=\\\"120\\\" type=CWD msg=audit(1590559308.062:121371): cwd=\\\"/tank2/testnet\\\" type=PATH msg=audit(1590559308.062:121371): item=0 name=\\\"/bin/sleep\\\" inode=5111893 dev=103:02 mode=0100755 ouid=0 ogid=0 rdev=00:00 nametype=NORMAL cap_fp=0000000000000000 cap_fi=0000000000000000 cap_fe=0 cap_fver=0 type=PATH msg=audit(1590559308.062:121371): item=1 name=\\\"/lib64/ld-linux-x86-64.so.2\\\" inode=6291858 dev=103:02 mode=0100755 ouid=0 ogid=0 rdev=00:00 nametype=NORMAL cap_fp=0000000000000000 cap_fi=0000000000000000 cap_fe=0 cap_fver=0 type=PROCTITLE msg=audit(1590559308.062:121371): proctitle=736C65657000313230\",\"decoder\":{\"parent\":\"auditd\",\"name\":\"auditd\"},\"data\":{\"audit\":{\"type\":\"SYSCALL\",\"id\":\"121371\",\"arch\":\"c000003e\",\"syscall\":\"59\",\"success\":\"yes\",\"exit\":\"0\",\"ppid\":\"5452\",\"pid\":\"12011\",\"auid\":\"1007\",\"uid\":\"1007\",\"gid\":\"1002\",\"euid\":\"1007\",\"suid\":\"1007\",\"fsuid\":\"1007\",\"egid\":\"1002\",\"sgid\":\"1002\",\"fsgid\":\"1002\",\"tty\":\"(none)\",\"session\":\"4337\",\"command\":\"sleep\",\"exe\":\"/bin/sleep\",\"key\":\"audit-wazuh-c\",\"execve\":{\"a0\":\"sleep\",\"a1\":\"120\"},\"cwd\":\"/tank2/testnet\",\"file\":{\"name\":\"/bin/sleep\",\"inode\":\"5111893\",\"mode\":\"0100755\"}}},\"location\":\"/var/log/audit/audit.log\"}","service":{"type":"wazuh"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc00029a0d0), Source:"/var/ossec/logs/alerts/alerts.json", Offset:1456036, Timestamp:time.Time{wall:0xbfab9e5ab86bfc8c, ext:48097301, loc:(*time.Location)(0x594e5e0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x4071abe, Device:0xfd00}}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=404): {"type":"index_not_found_exception","reason":"no such index [<wazuh-alerts-3.x-{2020.05.27||/d{yyyy.MM.dd|UTC}}>] and [action.auto_create_index] 
([.monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*,wazuh-alerts-3.x-*,wazuh-monitoring-3.x-*]) doesn't match","index_uuid":"_na_","index":"<wazuh-alerts-3.x-{2020.05.27||/d{yyyy.MM.dd|UTC}}>"}

Full log: https://pastebin.com/fGrpQzQG

It seems like Filebeat cannot create the index automatically, or cannot get the specific index's UUID. I tried creating the index manually (wazuh-alerts-3.x-2020.05.26), and then it went back to working fine yesterday, but today it still didn't create the new index automatically (as you can see, it should be <wazuh-alerts-3.x-{2020.05.27||/d{yyyy.MM.dd|UTC}}>). I then manually cleared all the files in /var/lib/filebeat and restarted Filebeat; it created the wazuh-alerts-3.x-2020.05.27 index automatically, but I still get the same warning above and there are still no docs in wazuh-alerts-3.x-2020.05.27.
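
For reference, creating such a daily index by hand is just a plain PUT (a sketch; $ES_PASS is a placeholder for the hidden elastic password):

[root@ssl alerts]# curl -s --cacert /etc/filebeat/certs/ca/ca.crt -u elastic:$ES_PASS \
  -X PUT "https://192.168.40.243:9200/wazuh-alerts-3.x-2020.05.27"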

After setting logging.level: debug in filebeat.yml, the Filebeat log is here:

https://pastebin.com/TByL93E3

Help please. This problem is still here, and after days of debugging I still have no clue about it.

By enabling the X-Pack audit feature, I can see this log in wazuh-clusteres_audit.json. But I gave Filebeat the elastic user to authenticate with (which should be a superuser), so why does it show an access_denied action here?

[root@ssl elasticsearch]# tail -n 500 wazuh-clusteres_audit.json  | grep wazuh
{"type":"audit", "timestamp":"2020-05-29T19:50:00,619+0800", "node.id":"s9cmGA6ZSF6LgU9zngYAQA", "event.type":"transport", "event.action":"access_denied", "user.name":"elastic", "user.realm":"reserved", "user.roles":["superuser"], "origin.type":"rest", "origin.address":"192.168.40.243:46532", "request.id":"aU8oObU3Sv2ez_WNnUSw0g", "action":"indices:data/read/field_caps", "request.name":"FieldCapabilitiesRequest", "indices":["wazuh-alerts-3.x-*"]}

Hi warkolm, the problem has been solved by removing the line

action.auto_create_index: .monitoring*,.watches,.triggered_watches,.watcher-history*,.ml*,wazuh-alerts-3.x-*,wazuh-monitoring-3.x-*

from elasticsearch.yml. After that, Filebeat could create indices and push docs to ES normally. I still haven't found a clear clue why, since I had already added wazuh-alerts-3.x-* and wazuh-monitoring-3.x-* to that setting.
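
A quick way to confirm that the daily indices are now being created and that documents are flowing into them (a sketch; $ES_PASS is again a placeholder for the hidden elastic password):

[root@ssl elasticsearch]# curl -s --cacert /etc/elasticsearch/certs/ca/ca.crt -u elastic:$ES_PASS \
  "https://192.168.40.243:9200/_cat/indices/wazuh-alerts-3.x-*?v&h=index,health,docs.count"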

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.