No errors in Logstash, but no data in Kibana.

I don't see any indices created in Amazon ES.

Architecture: Filebeat -> Logstash -> Amazon ES.

1. Filebeat config:
filebeat.inputs:
- type: log
  paths:
    - /var/test.log
  fields:
    tags: test
    environment: uflek
    product: pxl
    datacenter: aws
    partition: uflek0
    server_role: uflek-uflek-ppo-uflek0-dxl_hub
  fields_under_root: true


filebeat.registry.path: '/var/lib/filebeat/.registry'

output.logstash:
  hosts: ["logsrv.vision-uflek.local:5044"]
  key: "uflek-ppo"
  db: 0
  db_topology: 1
  timeout: 5
  reconnect_interval: 1
shipper:
logging:
  to_syslog: false
  to_files: true
  files:
    path: /var/log/filebeat
    name: filebeat.log
    rotateeverybytes: 10485760 # = 10MB
    keepfiles: 7
  level: debug 

There is no error in the Filebeat log:

2021-01-25T16:35:38.636Z        INFO    [publisher]     pipeline/retry.go:217     done
2021-01-25T16:35:38.637Z        DEBUG   [logstash]      logstash/async.go:172   5 events out of 5 events sent to logstash host logsrv.mvision-uflek.local:5044. Continue sending
2021-01-25T16:35:38.637Z        DEBUG   [logstash]      logstash/async.go:128   close connection
2021-01-25T16:35:38.637Z        DEBUG   [logstash]      logstash/async.go:128   close connection
2021-01-25T16:35:38.637Z        ERROR   [logstash]      logstash/async.go:280   Failed to publish events caused by: client is not connected 

2. Logstash config:

input {
  beats {
    port => 5044
  }
}
output
{
   if [fields][product] == "pxl"
   {
     amazon_es
      {
          hosts => ["https://test.es.amazonaws.com/"]
          region => "us-east-1"
          index => "%{[fields][server_role]}-%{+YYYY.MM.dd}"
          document_id => "%{fingerprint}" # avoid duplications
      }
   }
 stdout { codec => rubydebug }
}
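As a side note, `document_id => "%{fingerprint}"` assumes a `fingerprint` field already exists on each event; nothing in the config above creates one, so without it the literal string `%{fingerprint}` would be used as the document ID. A minimal sketch of a fingerprint filter that would populate it (the choice of `message` as the source field is an assumption, not from the original config):

```
filter {
  fingerprint {
    source => "message"       # hash the raw log line (assumed choice)
    target => "fingerprint"   # default target; populates %{fingerprint} in the output
    method => "SHA256"
  }
}
```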

[2021-01-25T16:04:02,609][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.47}
[2021-01-25T16:04:02,636][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2021-01-25T16:04:02,663][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-01-25T16:04:02,834][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-01-25T16:04:02,893][INFO ][org.logstash.beats.Server][main][d2ccf63fb390d27e8a591220535b5a6601cb82de678e4bebe6d2acde54350022] Starting server on port: 5044
[2021-01-25T16:04:03,259][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

The output prints to the console:

{
    "@timestamp" => 2021-01-25T16:42:32.866Z,
      "@version" => "1",
         "input" => {
        "type" => "log"
    },
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
          "host" => {
        "name" => "ip-################"
    },
         "agent" => {
        "ephemeral_id" => "601b382e-b08e-4820-8cf4-137fbead3918",
                  "id" => "93751d22-bcc0-4a18-9b7c-4bff3bd81413",
                "name" => "ip-###############",
                "type" => "filebeat",
            "hostname" => "ip-###############",
             "version" => "7.9.3"
    },
           "log" => {
         "flags" => [
            [0] "multiline"
        ],
        "offset" => 18416319,
          "file" => {
            "path" => "/var/test.log"
        }
    },
       "message" => "2021-01-25T16:42:24,246 INFO  [http-nio-8080-exec-5] [] [] [/dxdxdxs/v1/status] [] AuditLogFilter \n                    [] - ResourceType is empty, not doing audit logging",
        "fields" => {
        "environment" => "uflek",
            "product" => "pxl",
         "datacenter" => "aws",
               "tags" => "test"
    },
           "ecs" => {
        "version" => "1.5.0"
    }
}

Please help.

This is odd, because you have fields_under_root set to true in your filebeat.yml, so you shouldn't have a field named fields; yet your output example shows a fields object containing the added fields.

With fields_under_root set to true you should have a root-level field named product, not a field named [fields][product], which is the one used in your Logstash conditional.

Did you change anything to generate this output example? Can you try removing fields_under_root from your filebeat.yml and running your pipeline again?
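For reference, the two placements look like this (a sketch, using the product field from the config above):

```
# fields_under_root: true  -> added fields sit at the root of the event:
#   "product" => "pxl"                    # conditional: if [product] == "pxl"
#
# fields_under_root: false -> added fields are nested under "fields":
#   "fields" => { "product" => "pxl" }    # conditional: if [fields][product] == "pxl"
```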

There was another server configured to send logs to Logstash.

I have now stopped that Filebeat service.

Please find the Logstash output:

}
{
        "message" => "[140196633831296] 01/25/2021 17:08:30 [E]  OpenSSL Error: error:140780E5:SSL routines:ssl23_read:ssl handshake failure. Peer: 10.104.232.59:48368",
        "product" => "pxl",
      "partition" => "uflek0",
           "tags" => [
        [0] "dxlhub",
        [1] "beats_input_codec_plain_applied"
    ],
          "input" => {
        "type" => "log"
    },
     "@timestamp" => 2021-01-25T17:08:30.770Z,
       "@version" => "1",
           "host" => {
        "name" => "ip-XXXXXXXXXXXXXXXX.ec2.internal"
    },
            "log" => {
          "file" => {
            "path" => "/var/test.log"
        },
        "offset" => 257861
    },
    "server_role" => "uflek-uflek-epo-uflek0-pxl",
    "environment" => "uflek",

I need help checking why the logs are not going to ES.

Your Logstash output has a conditional against the field [fields][product], but you do not have this field in your document, so this conditional will always be false.

You have fields_under_root set to true in your filebeat.yml, so every field you add will be at the root of your document; you won't have [fields][product], you will have [product].

You need to change your conditional to use [product] instead of [fields][product], or you can change your filebeat.yml and set fields_under_root to false.
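With fields_under_root: true, the corrected output block would look something like this (hosts and region kept from the original config; note that the index pattern's [fields][server_role] needs the same change, since server_role is also at the root):

```
output {
  if [product] == "pxl" {
    amazon_es {
      hosts       => ["https://test.es.amazonaws.com/"]
      region      => "us-east-1"
      index       => "%{[server_role]}-%{+YYYY.MM.dd}"
      document_id => "%{fingerprint}" # avoid duplications
    }
  }
  stdout { codec => rubydebug }
}
```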

Thank you. Yeah, it worked.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.