Logstash does not load data into Elasticsearch

Good afternoon friends, I have a question. I have Kibana version 7.5.1, the same version as Elasticsearch. I created an index to monitor a WebLogic log.
Before creating the index I tested the Logstash configuration file (Logstash 7.1.1), which contains the log file input, the grok filter, and an output to the terminal, and everything worked perfectly: I get the log fields in JSON format in the terminal.

Once this was verified, I created the index in Kibana with the mapping that corresponds to the fields I want to show. After that, I pointed the Logstash output at Elasticsearch, but the index shows no sign of data traffic; it does not receive any data. Could you help me solve this mystery, please?
Attached are some details:

Logstash output in terminal, initial test:

$ ./logstash -r
Sending Logstash logs to /u01/home/app/prd12c/ELK/logstash-7.1.1/logs which is now configured via log4j2.properties
[2024-08-03T18:00:17,675][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2024-08-03T18:00:35,382][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@phineas.falabella.cl:9200/]}}
[2024-08-03T18:00:36,096][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@phineas.falabella.cl:9200/"}
[2024-08-03T18:00:36,195][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2024-08-03T18:00:36,201][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2024-08-03T18:00:36,287][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//phineas.falabella.cl:9200"]}
[2024-08-03T18:00:36,356][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2024-08-03T18:00:36,686][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2024-08-03T18:00:37,116][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"weblogic", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x5376db86 run>"}
[2024-08-03T18:00:38,084][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"weblogic"}
[2024-08-03T18:00:38,358][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:weblogic], :non_running_pipelines=>[]}
[2024-08-03T18:00:38,375][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2024-08-03T18:00:39,511][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
/u01/home/app/prd12c/ELK/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
       "servername" => "producerTotCorpQa01",
        "timestamp" => "1722692649707",
        "log_level" => "Info",
           "thread" => "WorkManager",
            "timer" => "Timer-2",
             "path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
             "uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
      "log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
          "message" => "####<Aug 3, 2024 9:44:09,707 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722692649707> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
           "kernel" => "WLS Kernel",
             "misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] ",
    "log_timestamp" => "Aug 3, 2024 9:44:09,707 AM CLT",
       "log_number" => "BEA-002959",
         "hostname" => "f8cloud5032"
}
{
       "servername" => "producerTotCorpQa01",
        "timestamp" => "1722692769714",
        "log_level" => "Info",
           "thread" => "WorkManager",
            "timer" => "Timer-2",
             "path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
             "uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
      "log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
          "message" => "####<Aug 3, 2024 9:46:09,714 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722692769714> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
           "kernel" => "WLS Kernel",
             "misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] ",
    "log_timestamp" => "Aug 3, 2024 9:46:09,714 AM CLT",
       "log_number" => "BEA-002959",
         "hostname" => "f8cloud5032"
}

Index creation:

curl -X PUT http://localhost:9200/weblogic -u user:passwd

curl -H 'Content-Type: application/json' -X PUT "http://localhost:9200/weblogic/er911/_mapping?include_type_name" -u user:passwd -d '{

     "er911": {
         "properties": {
                          "@timestamp": {
                                "type": "date"
                          },
                          "@version": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "log_timestamp": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "log_level": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "thread": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "hostname": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "servername": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "timer": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "kernel": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "Data": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "uuid": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "timestamp": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "misc": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "log_number": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          },
                          "log_message": {
                                "type": "text",
                                "fields": {
                                  "keyword": {
                                        "type": "keyword",
                                        "ignore_above": 256
                                  }
                                }
                          }
                }
 }
}'
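
To double-check that the mapping was applied, it can be read back with the same credentials (include_type_name is passed because of the custom er911 type):

curl -X GET "http://localhost:9200/weblogic/_mapping?include_type_name&pretty" -u user:passwd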

Index data:

The "Docs count" field remains at zero, even though logstash shows the data it is sending

From Kibana: (screenshot not included)

From CURL:

Another index that works, with data:

$ curl -X GET localhost:9200/_cat/indices/nginxoms-f1cloud5051 -u user:passwd
yellow open nginxoms-f1cloud5051 wkQeonEhTEuIfui7m8mqpw 1 1 21006666 0 1.9gb 1.9gb
$

My index, which doesn't work, without data:
$ curl -X GET localhost:9200/_cat/indices/weblogic -u user:passwd
yellow open weblogic RLJByxakRJqmRVJeAo3hTA 1 1 0 0 283b 283b
$
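
The document count can also be queried directly (same credentials assumed):

$ curl -X GET "localhost:9200/weblogic/_count?pretty" -u user:passwd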

Logstash configuration file:

input {
  file {
    path => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {

  grok {
    match => { "message" => ["<%{DATA:log_timestamp}> <%{WORD:log_level}> <%{WORD:thread}> <%{HOSTNAME:hostname}> <%{HOSTNAME:servername}> <%{DATA:timer}> <<%{DATA:kernel}>> <> <%{DATA:uuid}> <%{NUMBER:timestamp}> <%{DATA:misc}> <%{DATA:log_number}> <%{DATA:log_message}>"] }
    remove_field => ["message"]
  }

  mutate {
    remove_field => ["offset", "prospector", "@version", "source", "host", "[beat][hostname]", "[beat][name]", "[beat][version]", "@timestamp", "input", "beat", "log"]
  }

}

output {
  if [verb] == "POST" or [verb] == "PUT" {
      elasticsearch {
        hosts => ["phineas.falabella.cl:9200"]
        index => "weblogic"
        user => "user"
        password => "passwd"
      }
  }
  stdout { codec => rubydebug }
}

Logstash pipeline:

$ cat config/pipelines.yml

- pipeline.id: weblogic
  path.config: "/u01/home/app/prd12c/ELK/logstash-7.1.1/scripts/producerTotCorpQa.conf"

$

Where is the verb field in your document? The two examples that you shared do not have a field named verb, so this conditional will never match and nothing will be sent to Elasticsearch.

Have you tried to remove this conditional?
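
For reference, without the conditional the output section would look something like this (a sketch based on your config, with the same hosts and credentials):

output {
  elasticsearch {
    hosts => ["phineas.falabella.cl:9200"]
    index => "weblogic"
    user => "user"
    password => "passwd"
  }
  stdout { codec => rubydebug }
}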

Thank you very much, Leandro, for your answer, I really appreciate it. I took this configuration code from somewhere else and modified it, and that line with the "if" was my mistake. I already removed it and ran Logstash again, but now it gives me the error below. Sorry for my ignorance, but I'm new to this Elastic thing and it's not clear to me why it's throwing the following error. Again, I reiterate that I really appreciate your help.

Logstash config file:

input {
  file {
    path => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {

  grok {
    match => { "message" => ["<%{DATA:log_timestamp}> <%{WORD:log_level}> <%{WORD:thread}> <%{HOSTNAME:hostname}> <%{HOSTNAME:servername}> <%{DATA:timer}> <<%{DATA:kernel}>> <> <%{DATA:uuid}> <%{NUMBER:timestamp}> <%{DATA:misc}> <%{DATA:log_number}> <%{DATA:log_message}>"] }
    remove_field => ["message"]
  }

  mutate {
    remove_field => ["offset", "prospector", "@version", "source", "host", "[beat][hostname]", "[beat][name]", "[beat][version]", "@timestamp", "input", "beat", "log"]
  }

}

output {
      elasticsearch
        hosts => ["phineas.falabella.cl:9200"]
        index => "weblogic"
        user => "user"
        password => "passwd"
  }
  stdout { codec => rubydebug }
}

Output error

$ ./logstash -r
Sending Logstash logs to /u01/home/app/prd12c/ELK/logstash-7.1.1/logs which is now configured via log4j2.properties
[2024-08-05T10:17:50,363][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2024-08-05T10:17:54,173][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:weblogic, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 26, column 9 (byte 786) after output {\n      elasticsearch \n        ", :backtrace=>["/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[2024-08-05T10:17:54,834][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2024-08-05T10:17:57,669][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:weblogic, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 26, column 9 (byte 786) after output {\n      elasticsearch \n        ", :backtrace=>["/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/u01/home/app/prd12c/ELK/logstash-7.1.1/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}

And this is a sample of the log file:

####<Jul 22, 2024 11:00:40,942 PM CLT> <Info> <Diagnostics> <f8cloud5028> <producerOMSCLQa01> <[ACTIVE] ExecuteThread: '42' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <998d6618-ec1a-454f-ab77-a5d8ae11ebf5-0000c5a7> <1721703640942> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-320143> <Scheduled 1 data retirement tasks as per configuration.>

You are missing the opening curly bracket after elasticsearch. Change it to:

output {
      elasticsearch {
...

Oof, sorry, I'll correct it.

I corrected the configuration file and ran Logstash again. It seems to be running fine, but there is still no data in Kibana. I checked the Elasticsearch log to see if it showed any errors, but I don't see anything like that. I don't know where else I could check, because the data is not reaching Kibana :pensive:

Logstash:

$ ./logstash -r
Sending Logstash logs to /u01/home/app/prd12c/ELK/logstash-7.1.1/logs which is now configured via log4j2.properties
[2024-08-05T12:07:44,418][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2024-08-05T12:07:59,392][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@phineas.falabella.cl:9200/]}}
[2024-08-05T12:08:00,177][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@phineas.falabella.cl:9200/"}
[2024-08-05T12:08:00,317][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2024-08-05T12:08:00,344][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2024-08-05T12:08:00,410][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//phineas.falabella.cl:9200"]}
[2024-08-05T12:08:00,452][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2024-08-05T12:08:00,778][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2024-08-05T12:08:01,041][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"weblogic", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x4d9f19d8 run>"}
[2024-08-05T12:08:01,911][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"weblogic"}
[2024-08-05T12:08:02,087][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:weblogic], :non_running_pipelines=>[]}
[2024-08-05T12:08:02,065][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2024-08-05T12:08:03,127][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
/u01/home/app/prd12c/ELK/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
        "timestamp" => "1722865339891",
            "timer" => "Timer-2",
           "thread" => "WorkManager",
      "log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
          "message" => "####<Aug 5, 2024 9:42:19,891 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722865339891> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
       "servername" => "producerTotCorpQa01",
    "log_timestamp" => "Aug 5, 2024 9:42:19,891 AM CLT",
         "hostname" => "f8cloud5032",
           "kernel" => "WLS Kernel",
       "log_number" => "BEA-002959",
        "log_level" => "Info",
             "path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
             "uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
             "misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] "
}
{
        "timestamp" => "1722865459897",
            "timer" => "Timer-2",
           "thread" => "WorkManager",
      "log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
          "message" => "####<Aug 5, 2024 9:44:19,897 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722865459897> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
       "servername" => "producerTotCorpQa01",
    "log_timestamp" => "Aug 5, 2024 9:44:19,897 AM CLT",
         "hostname" => "f8cloud5032",
           "kernel" => "WLS Kernel",
       "log_number" => "BEA-002959",
        "log_level" => "Info",
             "path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
             "uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
             "misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] "
}

Elastic logs:

[phineas]% grep weblogic *
elasticsearch.log:[2024-08-05T11:46:38,872][INFO ][o.e.c.m.MetaDataMappingService] [phineas] [weblogic/RLJByxakRJqmRVJeAo3hTA] update_mapping [er911]
elasticsearch_server.json:{"type": "server", "timestamp": "2024-08-05T11:46:38,872-04:00", "level": "INFO", "component": "o.e.c.m.MetaDataMappingService", "cluster.name": "elasticsearch", "node.name": "phineas", "message": "[weblogic/RLJByxakRJqmRVJeAo3hTA] update_mapping [er911]", "cluster.uuid": "_NBQnUVfRrWxhP6Wxv5Gkg", "node.id": "3trIwRlhRX-9-J-UATTLug"  }
prd12c [phineas]%

elasticsearch.log:

[2024-08-05T11:46:38,391][DEBUG][o.e.a.s.m.TransportMasterNodeAction] [phineas] Get stats for datafeed '_all'
[2024-08-05T11:46:38,872][INFO ][o.e.c.m.MetaDataMappingService] [phineas] [weblogic/RLJByxakRJqmRVJeAo3hTA] update_mapping [er911]
[2024-08-05T11:46:48,386][DEBUG][o.e.a.s.m.TransportMasterNodeAction] [phineas] Get stats for datafeed '_all'


elasticsearch_server.json:

{"type": "server", "timestamp": "2024-08-05T11:46:38,872-04:00", "level": "INFO", "component": "o.e.c.m.MetaDataMappingService", "cluster.name": "elasticsearch", "node.name": "phineas", "message": "[weblogic/RLJByxakRJqmRVJeAo3hTA] update_mapping [er911]", "cluster.uuid": "_NBQnUVfRrWxhP6Wxv5Gkg", "node.id": "3trIwRlhRX-9-J-UATTLug"  }

Kibana: (screenshot not included)

According to the LS log, data has been parsed.

Please check the Elasticsearch log once again and grep for ERROR.

It might be that your data ended up in the default logstash-* index, or, more likely, the data mapping has errors.
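
One way to check for mapping errors: when Elasticsearch rejects a document, the Logstash elasticsearch output itself logs a warning, so something like this may show it (the exact log file name on your install may differ):

grep -i "Could not index event" /u01/home/app/prd12c/ELK/logstash-7.1.1/logs/logstash-plain.log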

Thank you, Rios. The truth is that if I search for the word ERROR in the Elasticsearch logs, I don't get any information related to the "weblogic" index, which is the one I'm interested in. Does anyone know of a page that translates a grok filter into a JSON mapping, or something similar? I don't know how to identify which field(s) are causing the problem.

[phineas]% grep ERROR * |tail -5
elasticsearch-2023-09-06-1.log:[2023-09-06T23:59:19,365][ERROR][o.e.x.m.c.n.NodeStatsCollector] [phineas] collector [node_stats] failed to collect data
elasticsearch-2023-09-06-1.log:[2023-09-06T23:59:29,368][ERROR][o.e.x.m.c.n.NodeStatsCollector] [phineas] collector [node_stats] failed to collect data
elasticsearch-2023-09-06-1.log:[2023-09-06T23:59:39,366][ERROR][o.e.x.m.c.n.NodeStatsCollector] [phineas] collector [node_stats] failed to collect data
elasticsearch-2023-09-06-1.log:[2023-09-06T23:59:49,368][ERROR][o.e.x.m.c.n.NodeStatsCollector] [phineas] collector [node_stats] failed to collect data
elasticsearch-2023-09-06-1.log:[2023-09-06T23:59:59,368][ERROR][o.e.x.m.c.n.NodeStatsCollector] [phineas] collector [node_stats] failed to collect data
[phineas]%

I have no ERROR in the Elasticsearch logs with respect to the "weblogic" index. I think the problem is with the fields. I'll delete the index, create another one with the exact fields that Logstash emits, and see:

{
       "servername" => "producerTotCorpQa01",
        "timestamp" => "1722692649707",
        "log_level" => "Info",
           "thread" => "WorkManager",
            "timer" => "Timer-2",
             "path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
             "uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
      "log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
          "message" => "####<Aug 3, 2024 9:44:09,707 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722692649707> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
           "kernel" => "WLS Kernel",
             "misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] ",
    "log_timestamp" => "Aug 3, 2024 9:44:09,707 AM CLT",
       "log_number" => "BEA-002959",
         "hostname" => "f8cloud5032"
}
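
For reference, I plan to drop the old index with (same credentials as before):

curl -X DELETE "http://localhost:9200/weblogic" -u user:passwd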

Dear guys, I can finally load the data into Elasticsearch, but now Kibana doesn't show me the data: Index Management tells me the index has documents, and it lets me create the index pattern for the new index, but Discover doesn't show any data. Anyway, I'll leave this last part for another topic. Thank you so much for all your support!

Probably because you have deleted "@timestamp". I'm not sure what you are using as a datetime field. If log_timestamp is your datetime field, it should be converted with the date filter.
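
A sketch of what that could look like, using the epoch-millis timestamp field that your grok pattern already extracts (parsing the log_timestamp text is also possible, but the CLT zone name may need extra handling):

filter {
  date {
    # "timestamp" holds epoch milliseconds, e.g. 1722865339891
    match => ["timestamp", "UNIX_MS"]
    target => "@timestamp"
  }
}

With this in place, "@timestamp" should also be taken out of the mutate remove_field list, so that Discover has a time field to filter on.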