Filebeat not sending logs to Elasticsearch

filebeat.yml

filebeat.inputs:
- type: container
  enabled: true
  paths:
    - /var/lib/docker/containers/*/*.log
  processors:
    - add_docker_metadata: ~

output.elasticsearch:
  hosts: ["172.20.111.199:9200"]
  pipeline: "postfix-pipeline"
  index: "mailcow-postfix-%{+yyyy.MM.dd}"

setup.template:
  name: "postfix"
  pattern: "postfix-*"
  overwrite: true

# Logging configuration
logging.level: info
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 2
  permissions: 0644

# Required for filebeat modules
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

# Processors
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
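
Before adding the template and pipeline pieces, the configuration above can be sanity-checked with Filebeat's built-in test commands; a quick sketch, assuming a package install with the config at /etc/filebeat/filebeat.yml:

filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml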

Index Template:

{
  "index_patterns": ["postfix-*"],
  "data_stream": {},
  "priority": 500,
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": { "type": "date" },
        "message": { "type": "text" },
        "syslog_host": { "type": "keyword" },
        "program": { "type": "keyword" },
        "pid": { "type": "integer" },
        "log": { "type": "text" }
      }
    }
  }
}
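
For reference, a template like the one above needs to be registered through the _index_template API before Filebeat starts writing to the data stream; a minimal sketch, assuming the JSON above is saved as postfix-template.json and that "postfix" is an acceptable template name (both names are assumptions):

curl -X PUT "http://172.20.111.199:9200/_index_template/postfix" -H 'Content-Type: application/json' -d @postfix-template.json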

Ingest Pipeline:

curl -X PUT "http://172.20.111.199:9200/_ingest/pipeline/postfix-pipeline" -H 'Content-Type: application/json' -d'
{
  "description": "Pipeline for parsing Postfix logs",
  "processors": [
    {
      "grok": {
        "field": "log",
        "patterns": [
          "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:syslog_host} %{DATA:program}(?:\\[%{POSINT:pid}\\])?: %{GREEDYDATA:message}"
        ]
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["MMM  d HH:mm:ss", "MMM dd HH:mm:ss"],
        "timezone": "UTC"
      }
    }
  ]
}'
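
The grok pattern can be verified against a sample Postfix syslog line with the _simulate API; a sketch (the sample line is illustrative):

curl -X POST "http://172.20.111.199:9200/_ingest/pipeline/postfix-pipeline/_simulate?pretty" -H 'Content-Type: application/json' -d'
{
  "docs": [
    {
      "_source": {
        "log": "Jul 21 12:59:52 1becfcfb79e0 postfix/smtps/smtpd[567]: disconnect from unknown[194.169.175.65] ehlo=1 auth=0/1 rset=1 commands=2/3"
      }
    }
  ]
}'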

{"log.level":"error","@timestamp":"2024-07-12T15:35:54.530+0530","log.logger":"publisher_pipeline_output","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/publisher/pipeline.(*netClientWorker).run","file.name":"pipeline/client_worker.go","file.line":148},"message":"Failed to connect to backoff(elasticsearch(http://172.20.111.199:9200)): Connection marked as failed because the onConnect callback failed: error loading template: failed to put data stream: could not put data stream: 400 Bad Request: {\"error\":{\"root_cause\":[{\"type\":\"illegal_argument_exception\",\"reason\":\"no matching index template found for data stream [postfix]\"}],\"type\":\"illegal_argument_exception\",\"reason\":\"no matching index template found for data stream [postfix]\"},\"status\":400}. Response body: {\"error\":{\"root_cause\":[{\"type\":\"illegal_argument_exception\",\"reason\":\"no matching index template found for data stream [postfix]\"}],\"type\":\"illegal_argument_exception\",\"reason\":\"no matching index template found for data stream [postfix]\"},\"status\":400}","service.name":"filebeat","ecs.version":"1.6.0"}

Please help me sort out this issue.

Thanks so much for reaching out, @Prabhath_samarasingh, and sending us your configuration files. Please provide some further context as to what you were expecting. Did this issue appear recently, or is this your first time setting this up? Are you seeing any error messages?

This is a new setup, and I need to send mailcow-dockerized container logs (especially Postfix mail logs) to an external ELK stack. I have done the steps below.

nano /etc/logstash/conf.d/postfix.conf
input {
    gelf {
        port => 12201
    }
}

filter {
    # Add any filters if needed
}
output {
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "postfix-logs-%{+YYYY.MM.dd}"
        document_type => "_doc"  # Ensure this is set correctly
    }
}
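
Before restarting, the pipeline file can be syntax-checked; a sketch, assuming a package install of Logstash:

/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/postfix.conf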

systemctl restart logstash

Logstash is working fine.

root@elk:/etc/logstash/conf.d# curl -X GET "localhost:9200/_cat/indices?v"
health status index                                         uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   .monitoring-es-7-2024.07.20                   bQ7lVbTVSkmaVgCfwHqlyA   1   0     268648        59826    132.7mb        132.7mb
yellow open   mailcow-postfix-2024.07.12                    ipsYFLgVT06QyDa8Vq2hww   1   1          2            0     15.5kb         15.5kb
green  open   .monitoring-es-7-2024.07.21                   xAQKrO-GQTK2M46CPAr03A   1   0      37388        46582     37.3mb         37.3mb
green  open   .geoip_databases                              F1q0oCjhRV6xpwqTO5Q0Hg   1   0         33           33     31.2mb         31.2mb
yellow open   test-index                                    -2Ez11rRSx6hAsVoHM4U9g   1   1          2            0        4kb            4kb
green  open   .apm-custom-link                              l-_uzJz9SNGJHFJZOY4aOw   1   0          0            0       226b           226b
green  open   .monitoring-kibana-7-2024.07.21               UpNel2g0SiGFbtrLDbTSGg   1   0       2358            0    814.5kb        814.5kb
green  open   .monitoring-kibana-7-2024.07.20               2BVtbnL0QFecvb7cDuo8tQ   1   0      17280            0      3.3mb          3.3mb
green  open   .watches                                      sCFRuGXHQnqwNN3F_q4Nvw   1   0          0            0     16.5kb         16.5kb
yellow open   docker-logs-2024.07.15                        2WoFu-c5TTqqbrIA8eC33g   1   1          1            0      7.4kb          7.4kb
green  open   .monitoring-es-7-2024.07.15                   C4Kds5TlSMWgAh0I6tW1zg   1   0     265368       198276    137.4mb        137.4mb
green  open   .monitoring-es-7-2024.07.17                   LVmUmZGWS9Co0watOhkV5w   1   0     268679        59397    131.1mb        131.1mb
green  open   .monitoring-es-7-2024.07.16                   lBpuVPh5R6ajxXFWc_U99Q   1   0     268679        59163    130.7mb        130.7mb
green  open   .kibana_7.17.9_001                            s7XSiMB1RCSBXQha69yubg   1   0       1156          387      2.7mb          2.7mb
green  open   .monitoring-es-7-2024.07.19                   cHX_rRKMR-697rUGtJgXhQ   1   0     268873        59865    133.5mb        133.5mb
green  open   .apm-agent-configuration                      Vnro9Y3rRxiA2qRAbM2UcQ   1   0          0            0       226b           226b
green  open   .monitoring-es-7-2024.07.18                   zSABtPcpRNmFetvd4HnSwA   1   0     268757        59631    131.7mb        131.7mb
green  open   .kibana_task_manager_7.17.9_001               r7fcqXfbTS69dHFxJdsIjg   1   0         17       442626     68.6mb         68.6mb
green  open   .tasks                                        V7m74g7WSVa9cB20WO0I5Q   1   0         34            0     40.7kb         40.7kb
green  open   .monitoring-kibana-7-2024.07.18               wvj7xTCSTtWFfecT-XTKwg   1   0      17280            0      3.4mb          3.4mb
green  open   .monitoring-kibana-7-2024.07.19               E3tJAytoRayXcLhTrftukg   1   0      17280            0      3.3mb          3.3mb
green  open   .monitoring-kibana-7-2024.07.16               jLQiLgfZRaqmPKLcusXN6g   1   0      17278            0      3.4mb          3.4mb
green  open   .monitoring-kibana-7-2024.07.17               LOnpXC9kSfy-Op0eLzzHjA   1   0      17280            0      3.4mb          3.4mb
green  open   .monitoring-kibana-7-2024.07.15               ObFrWrBRT6OnPx4RH8FT4w   1   0      17268            0      3.4mb          3.4mb
green  open   kibana_sample_data_logs                       Q_yhkZEzS0-CIX_bLfeppQ   1   0      14074            0      8.9mb          8.9mb
green  open   .async-search                                 -WI29_VGQ_6boCRWNbrf8Q   1   0          0         1920    166.7kb        166.7kb
yellow open   .ds-postfix-logs-2024.07.21-2024.07.21-000001 dGt_-ZUeQ7iaAL4Ukb21zQ   1   1          0            0       226b           226b
root@elk:/etc/logstash/conf.d# tail -f /var/log/logstash/logstash-plain.log
[2024-07-21T12:59:52,634][WARN ][logstash.outputs.elasticsearch][main][c474b4498066924f724bc082a682d60c6409cd1c08aed5378f7fb74217ee038f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"postfix-logs-2024.07.21", :routing=>nil, :_type=>"_doc"}, {"image_name"=>"mailcow/postfix:1.74", "@timestamp"=>2024-07-21T07:29:52.518Z, "created"=>"2024-07-21T06:54:56.352131938Z", "source_host"=>"172.20.242.100", "@version"=>"1", "image_id"=>"sha256:6868e495845eaef5902bc2f4009bb1644a048ac856f62e34f83c1b2a4b3db15e", "container_id"=>"1becfcfb79e02d7a4dd2e9cd88b2738c9d638fc1369ce4c89136b4fae91e716f", "container_name"=>"mailcowdockerized-postfix-mailcow-1", "tag"=>"postfix", "command"=>"/docker-entrypoint.sh /bin/sh -c exec /usr/bin/supervisord -c /etc/supervisor/supervisord.conf", "level"=>6, "message"=>"Jul 21 12:59:52 1becfcfb79e0 postfix/smtps/smtpd[567]: disconnect from unknown[194.169.175.65] ehlo=1 auth=0/1 rset=1 commands=2/3", "host"=>"mail.mailcow-devops.sltidc.lk", "version"=>"1.1"}], :response=>{"index"=>{"_index"=>"postfix-logs-2024.07.21", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams"}}}}
root@elk:/etc/logstash/conf.d# curl -X GET "localhost:9200/postfix-logs-*/_search?pretty"
{
  "took" : 0,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 0,
      "relation" : "eq"
    },
    "max_score" : null,
    "hits" : [ ]
  }
}
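
The .ds-postfix-logs-... entry in the index listing above suggests that postfix-logs-* is backed by a data stream; that can be confirmed with the _data_stream API (a sketch, assuming Elasticsearch 7.9 or later):

curl -X GET "localhost:9200/_data_stream/postfix-logs*?pretty"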

So I need to rectify the issue below.

 "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams"}}}}

Thanks for following up. This older forum post might be helpful to look at here.
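
In short, data streams only accept write operations with an op_type of create, so the Logstash elasticsearch output has to send creates instead of plain index actions. A minimal sketch of the output block (hosts and index copied from your config; exact option support depends on the logstash-output-elasticsearch plugin version):

output {
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "postfix-logs-%{+YYYY.MM.dd}"
        action => "create"   # data streams reject the default "index" action
    }
}

Newer plugin versions also offer native data stream support via data_stream => "true" together with data_stream_type, data_stream_dataset, and data_stream_namespace, as an alternative to setting index and action manually.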

I have achieved this using the configurations below.

Thanks for sending this over, @Prabhath_samarasingh. Can you provide some further context as to what you were expecting to see versus what you are seeing?

Received mail delivery logs without issue.


Thanks for following up, @Prabhath_samarasingh. Does that mean the issue has been solved?

Still checking.
This was done on a test setup; I will configure the production environment in the near future.
I need time to confirm.

Thanks, @Prabhath_samarasingh; let me know if you have specific questions or if there is anything I can help with further here.