N2disk uburst log parsing

Could use some advice. I'm having difficulty with a fresh install (6.3). I built a conf file with a grok filter that checked out fine in the debugger, but it does not appear to be applied by Logstash/Kibana. We see our log information in the message field only.

I also tried the sample Apache conf setup, with the same result: Kibana shows the test access_log strings, but only in the message field.

Something fundamental in the configuration must be overriding my conf file filter. ELK is new to me.

My conf file setup:
###################
/etc/logstash/conf.d/uburst.conf
input {
  file {
    path => "/var/tmp/n2disk/uburst.log"
    start_position => "beginning"
  }
}

filter {
  if [path] == "/var/tmp/n2disk/" {
    grok {
      match => { "message" => "%{NUMBER:start} %{NUMBER:end} %{NUMBER:duration} %{NUMBER:kbps} %{NUMBER:PeakMbps}" }
      remove_field => ["message"]
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
#####################

The goal is to pull in the five field values below and graph the peak Mbit/s rate over time, etc.

We are reading a file on localhost, /var/tmp/n2disk/uburst.log. The tab-separated file looks like this:

#############
Start End Duration Kbit Peak-Mbit/s
1477408107.351533705 1477408107.366453800 0.014920095 1378 95.971
1477408110.330377529 1477408110.341709397 0.011331868 1058 94.741
#############
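For reference, the grok pattern above maps onto one of those lines as follows. This is a minimal standalone sketch in Python, just to sanity-check the field logic; the regex uses a simplified stand-in for grok's %{NUMBER} (the real pattern also allows signs and exponents):

```python
import re

# Simplified stand-in for grok's %{NUMBER} pattern
NUMBER = r"\d+(?:\.\d+)?"

# Mirrors: %{NUMBER:start} %{NUMBER:end} %{NUMBER:duration} %{NUMBER:kbps} %{NUMBER:PeakMbps}
pattern = re.compile(
    rf"(?P<start>{NUMBER})\s+(?P<end>{NUMBER})\s+(?P<duration>{NUMBER})"
    rf"\s+(?P<kbps>{NUMBER})\s+(?P<PeakMbps>{NUMBER})"
)

line = "1477408107.351533705 1477408107.366453800 0.014920095 1378 95.971"
fields = pattern.match(line).groupdict()
print(fields["PeakMbps"])  # → 95.971, the value to graph over time
```

So the pattern itself fits the data; the question is why Logstash never applies it.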

The Kibana Discover view shows the following:

@timestamp July 23rd 2018, 16:16:36.766
t _id AWTIyV_kAC53BgREZmRx
t _index filebeat-2018.07.23

_score -

t _type doc
t beat.hostname ubuntu-18
t beat.name ubuntu-18
t beat.version 5.6.10
t input_type log
t message 1532376992.306766204 1532376992.336646903 0.029880699 2788166 100000.192

offset 3,476

t source /var/tmp/n2disk/uburst.log
t type log

What does the stdout output show?

Hi Mark, nothing untoward that I can pick up.

############################
cat /var/log/logstash/logstash-plain.log

[2018-07-25T15:20:00,641][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-07-25T15:20:01,726][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-07-25T15:20:02,338][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-07-25T15:20:02,371][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-07-25T15:20:02,437][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-07-25T15:20:02,998][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2018-07-25T15:20:05,156][INFO ][logstash.pipeline ] Pipeline main started
[2018-07-25T15:20:05,439][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
############################

It should show the documents being processed, though, so perhaps it's a sincedb issue. Try piping the file into Logstash via cat and a stdin input to process it.
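An untested sketch of what that could look like, reusing the grok pattern from the conf above. Note the stdin input replaces the file input, so sincedb is out of the picture, and the [path] conditional is dropped because stdin events carry no path field:

```
# Run with: cat /var/tmp/n2disk/uburst.log | bin/logstash -f uburst-stdin.conf
input {
  stdin { }
}

filter {
  grok {
    match => { "message" => "%{NUMBER:start} %{NUMBER:end} %{NUMBER:duration} %{NUMBER:kbps} %{NUMBER:PeakMbps}" }
  }
}

output {
  stdout { codec => rubydebug }
}
```

If the fields show up in the rubydebug output here but not with the file input, that points at sincedb remembering the file as already read.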

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.