Good afternoon, friends. I have a question. I am running Kibana 7.5.1, with Elasticsearch at the same version. I created an index to monitor a WebLogic log.
Before creating the index, I tested the Logstash configuration file (v7.1.1), which contains the log input, the grok filter, and an output to the terminal, and everything worked perfectly: I get the log fields in JSON format in the terminal.
Once that was verified, I created the index with a mapping that matches the fields I want to show. After that, I pointed the Logstash output at Elasticsearch, but the index shows no sign of data traffic; it does not receive any data. Could you help me decipher this mystery, please?
Attached are some details:
Logstash output in terminal, initial test:
$ ./logstash -r
Sending Logstash logs to /u01/home/app/prd12c/ELK/logstash-7.1.1/logs which is now configured via log4j2.properties
[2024-08-03T18:00:17,675][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2024-08-03T18:00:35,382][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@phineas.falabella.cl:9200/]}}
[2024-08-03T18:00:36,096][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@phineas.falabella.cl:9200/"}
[2024-08-03T18:00:36,195][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2024-08-03T18:00:36,201][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2024-08-03T18:00:36,287][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//phineas.falabella.cl:9200"]}
[2024-08-03T18:00:36,356][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2024-08-03T18:00:36,686][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2024-08-03T18:00:37,116][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"weblogic", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x5376db86 run>"}
[2024-08-03T18:00:38,084][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"weblogic"}
[2024-08-03T18:00:38,358][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:weblogic], :non_running_pipelines=>[]}
[2024-08-03T18:00:38,375][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2024-08-03T18:00:39,511][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
/u01/home/app/prd12c/ELK/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
"servername" => "producerTotCorpQa01",
"timestamp" => "1722692649707",
"log_level" => "Info",
"thread" => "WorkManager",
"timer" => "Timer-2",
"path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
"uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
"log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
"message" => "####<Aug 3, 2024 9:44:09,707 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722692649707> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
"kernel" => "WLS Kernel",
"misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] ",
"log_timestamp" => "Aug 3, 2024 9:44:09,707 AM CLT",
"log_number" => "BEA-002959",
"hostname" => "f8cloud5032"
}
{
"servername" => "producerTotCorpQa01",
"timestamp" => "1722692769714",
"log_level" => "Info",
"thread" => "WorkManager",
"timer" => "Timer-2",
"path" => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log",
"uuid" => "c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b",
"log_message" => "Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads",
"message" => "####<Aug 3, 2024 9:46:09,714 AM CLT> <Info> <WorkManager> <f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> <c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722692769714> <[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] > <BEA-002959> <Self-tuning thread pool contains 1 running threads, 1 idle threads, and 32 standby threads> ",
"kernel" => "WLS Kernel",
"misc" => "[severity-value: 64] [rid: 0] [partition-id: 0] [partition-name: DOMAIN] ",
"log_timestamp" => "Aug 3, 2024 9:46:09,714 AM CLT",
"log_number" => "BEA-002959",
"hostname" => "f8cloud5032"
}
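(As a side note for anyone reproducing this: the grok captures above can be sanity-checked outside Logstash with a rough regex equivalent of the pattern, approximating DATA as `.*?`, WORD as `\w+`, HOSTNAME as `[\w.-]+`, and NUMBER as `\d+`. This is only an illustration against one of the raw lines, not the actual grok engine.)

```python
import re

# One raw WebLogic log line, as seen in the "message" field above.
LINE = (
    "####<Aug 3, 2024 9:44:09,707 AM CLT> <Info> <WorkManager> "
    "<f8cloud5032> <producerTotCorpQa01> <Timer-2> <<WLS Kernel>> <> "
    "<c955ee6e-2f27-499b-9857-283b9dcb4908-0000000b> <1722692649707> "
    "<[severity-value: 64] [rid: 0] [partition-id: 0] "
    "[partition-name: DOMAIN] > <BEA-002959> "
    "<Self-tuning thread pool contains 1 running threads, 1 idle "
    "threads, and 32 standby threads> "
)

# Rough regex equivalent of the grok pattern; illustrative only.
PATTERN = re.compile(
    r"<(?P<log_timestamp>.*?)> <(?P<log_level>\w+)> <(?P<thread>\w+)> "
    r"<(?P<hostname>[\w.-]+)> <(?P<servername>[\w.-]+)> <(?P<timer>.*?)> "
    r"<<(?P<kernel>.*?)>> <> <(?P<uuid>.*?)> <(?P<timestamp>\d+)> "
    r"<(?P<misc>.*?)> <(?P<log_number>.*?)> <(?P<log_message>.*?)>"
)

fields = PATTERN.search(LINE).groupdict()
print(fields["log_level"], fields["log_number"], fields["hostname"])
```

The extracted groups line up with the rubydebug output above, which confirms the pattern itself parses these lines.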
Index creation:
curl -X PUT http://localhost:9200/weblogic -u user:passwd
curl -H 'Content-Type: application/json' -X PUT "http://localhost:9200/weblogic/er911/_mapping?include_type_name" -u user:passwd -d '{
  "er911": {
    "properties": {
      "@timestamp":    { "type": "date" },
      "@version":      { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "log_timestamp": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "log_level":     { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "thread":        { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "hostname":      { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "servername":    { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "timer":         { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "kernel":        { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "Data":          { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "uuid":          { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "timestamp":     { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "misc":          { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "log_number":    { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
      "log_message":   { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }
    }
  }
}'
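For reference, the mapping that actually got installed can be read back and compared against the intended one (assuming the same host and credentials as above):

```shell
# Read back the installed mapping; include_type_name is needed on 7.x
# because the mapping uses the custom type "er911".
curl -s -u user:passwd \
  "http://localhost:9200/weblogic/_mapping?include_type_name&pretty"
```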
Index data:
The index's "Docs count" remains at zero, even though Logstash's terminal output shows the parsed events.
From Kibana:
From CURL:
Another index, which works and has data:
$ curl -X GET localhost:9200/_cat/indices/nginxoms-f1cloud5051 -u user:passwd
yellow open nginxoms-f1cloud5051 wkQeonEhTEuIfui7m8mqpw 1 1 21006666 0 1.9gb 1.9gb
$
My index, which does not work and has no data:
$ curl -X GET localhost:9200/_cat/indices/weblogic -u user:passwd
yellow open weblogic RLJByxakRJqmRVJeAo3hTA 1 1 0 0 283b 283b
$
Logstash configuration file:
input {
  file {
    path => "/u01/home/app/mdw/producer/domains/producerTotCorpQa/servers/producerTotCorpQa01/logs/producerTotCorpQa01.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => ["<%{DATA:log_timestamp}> <%{WORD:log_level}> <%{WORD:thread}> <%{HOSTNAME:hostname}> <%{HOSTNAME:servername}> <%{DATA:timer}> <<%{DATA:kernel}>> <> <%{DATA:uuid}> <%{NUMBER:timestamp}> <%{DATA:misc}> <%{DATA:log_number}> <%{DATA:log_message}>"] }
    remove_field => ["message"]
  }
  mutate {
    remove_field => ["offset", "prospector", "@version", "source", "host", "[beat][hostname]", "[beat][name]", "[beat][version]", "@timestamp", "input", "beat", "log"]
  }
}
output {
  if [verb] == "POST" or [verb] == "PUT" {
    elasticsearch {
      hosts => ["phineas.falabella.cl:9200"]
      index => "weblogic"
      user => "user"
      password => "passwd"
    }
  }
  stdout { codec => rubydebug }
}
Logstash pipeline:
$ cat config/pipelines.yml
- pipeline.id: weblogic
  path.config: "/u01/home/app/prd12c/ELK/logstash-7.1.1/scripts/producerTotCorpQa.conf"
$
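One additional check I can run (suggested here for completeness, output not shown): Logstash can validate the configuration file's syntax without starting the pipeline:

```shell
# --config.test_and_exit (-t) parses the config, reports syntax errors
# such as unbalanced braces, and then exits without processing events.
./logstash -f /u01/home/app/prd12c/ELK/logstash-7.1.1/scripts/producerTotCorpQa.conf --config.test_and_exit
```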