Logstash monitoring is not shown in the Kibana Monitoring UI

Hi @chrisronline
I started Logstash with the command below:
/usr/share/logstash$ sudo bin/logstash --path.settings /etc/logstash/ -f /home/siddaram094/sample-log.conf
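For reference, this is the shape I understand the monitoring section of `/etc/logstash/logstash.yml` should have for internal collection on 7.x (the host matches my cluster from the logs; the password here is a placeholder, not my real value):

```yaml
# Enable Logstash internal monitoring collection
xpack.monitoring.enabled: true
# Elasticsearch cluster that should receive the .monitoring-logstash-* data
xpack.monitoring.elasticsearch.hosts: ["http://10.160.0.5:9200"]
xpack.monitoring.elasticsearch.username: "elastic"
xpack.monitoring.elasticsearch.password: "changeme"   # placeholder
```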

but the monitoring indices are still not getting created for Logstash.

These are the logs printed to the console:

Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2019-05-21T16:29:03,622][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-21T16:29:03,687][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2019-05-21T16:29:16,982][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@10.160.0.5:9200/]}}
[2019-05-21T16:29:17,596][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@10.160.0.5:9200/"}
[2019-05-21T16:29:17,866][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-05-21T16:29:17,879][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-05-21T16:29:17,939][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.160.0.5:9200"]}
[2019-05-21T16:29:17,987][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-05-21T16:29:18,261][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-05-21T16:29:18,557][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.0-java/vendor/GeoLite2-City.mmdb"}
[2019-05-21T16:29:19,270][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x681b32e0 run>"}
[2019-05-21T16:29:20,276][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_8636a19711465cc96926000984eb4005", :path=>["/var/log/apache2/access.log"]}
[2019-05-21T16:29:20,404][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-05-21T16:29:20,628][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-21T16:29:20,689][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-21T16:29:21,775][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
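To check whether any monitoring data is actually reaching the cluster, I can list the monitoring indices directly against the Elasticsearch host from the logs (the password is a placeholder for my real one):

```shell
# List monitoring indices written by Logstash internal collection;
# an empty result means no monitoring documents are arriving.
curl -u elastic:changeme "http://10.160.0.5:9200/_cat/indices/.monitoring-logstash-*?v"

# The Logstash node API on port 9600 (from the last log line) confirms
# the instance itself is up and serving stats.
curl "http://localhost:9600/_node/stats?pretty"
```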