Monitoring Logstash with X-Pack


(Denis) #1

Hi!

I followed the instructions for installing and setting up Logstash monitoring. No errors.

I am using Elastic Cloud Elasticsearch and Kibana.

When I navigate to Monitoring, nothing happens: "No Monitoring Data Found". But when I go to Dev Tools, I can find the Logstash monitoring index, and its size is increasing.

Why don't the Monitoring tabs work? Maybe I deleted some visualizations? What should I test?
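One hedged way to confirm that Logstash monitoring documents are actually arriving (the index pattern below is the standard one for 6.x monitoring data; adjust it if yours differs) is to run a count from Dev Tools:

```
GET .monitoring-logstash-*/_count
```

If the count grows between runs, Logstash is shipping metrics and the problem is likely on the Kibana/display side rather than in the pipeline.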


(Christian Dahlqvist) #2

What version of the different components are you using?


(Denis) #3

Logstash 6.1.2


(Mark Walkom) #4

Can you show us your config? Just remove things like URLs and passwords.

Please also show logs.


(Vladimir Smorodinov) #5

Just check your timestamp field. The problem should be there.
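A quick way to test that theory (assuming the standard 6.x monitoring index pattern and its `timestamp` field) is to pull the newest monitoring document from Dev Tools and compare its timestamp with the range selected in the Kibana time picker:

```
GET .monitoring-logstash-6-*/_search
{
  "size": 1,
  "sort": [ { "timestamp": { "order": "desc" } } ]
}
```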


(Denis) #6

Here is my Logstash config:

node.name: test-node

xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: "elastic"
xpack.monitoring.elasticsearch.password: "%mypass%"
xpack.monitoring.elasticsearch.url: "%url%"

cloud.id: %myid%
cloud.auth: %login%:%myauth%

I think the monitoring data is in Elasticsearch, because I can see the monitoring indices and their size keeps increasing, but I don't know how to view the monitoring data.


(Denis) #7

And here is my pipeline config:

input {
  beats {
    port => "5044"
  }
}

filter {
  grok {
    match => ["message", "%{TIME:eventtime}\t%{WORD:process_id}\tGPU0 t=%{NUMBER:GPU0Temp:float}C fan=%{NUMBER:GPU0Fan:int}%"]
    match => ["message", "%{TIME:eventtime}\t%{WORD:process_id}\t%{GREEDYDATA:currency} - Total Speed: %{NUMBER:TotalSpeed:float} Mh/s, Total Shares: %{NUMBER:TotalShares}, Rejected: %{NUMBER:RejectedShares}"]
  }

  date {
    match => [ "eventtime", "HH:mm:ss:SSS" ]
  }

  if "_grokparsefailure" in [tags] {
    drop {}
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "gputest"
    hosts => "%myhost%"
    user => "elastic"
    password => "%mypass%"
  }
}
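As a side note, the two grok patterns above can be merged into a single `match` with an array of patterns, so grok tries them in order and stops at the first one that matches (same patterns as in the original config, just restructured):

```
grok {
  match => {
    "message" => [
      "%{TIME:eventtime}\t%{WORD:process_id}\tGPU0 t=%{NUMBER:GPU0Temp:float}C fan=%{NUMBER:GPU0Fan:int}%",
      "%{TIME:eventtime}\t%{WORD:process_id}\t%{GREEDYDATA:currency} - Total Speed: %{NUMBER:TotalSpeed:float} Mh/s, Total Shares: %{NUMBER:TotalShares}, Rejected: %{NUMBER:RejectedShares}"
    ]
  }
}
```

Also note that `HH:mm:ss:SSS` contains no date component, so the date filter has to fill in the missing date fields itself; if the resulting @timestamp falls outside the window selected in Kibana, events in the gputest index may look missing even though they were indexed.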


(Denis) #8

I registered a new account on Elastic Cloud and repeated the setup. Same problem.


(Mark Walkom) #9

Please show your logs.


(Denis) #10

You mean the Logstash logs from startup?


(Mark Walkom) #11

Correct.


(Denis) #12

[2018-01-29T11:37:03,648][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"arcsight", :directory=>"%mypath%/logstash-6.1.2/vendor/bundle/jruby/2.3.0/gems/x-pack-6.1.2-java/modules/arcsight/configuration"}
[2018-01-29T11:37:06,552][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-01-29T11:37:10,887][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.2"}
[2018-01-29T11:37:12,951][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-01-29T11:37:22,505][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[%myhost%], bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", manage_template=>false, document_type=>"%{[@metadata][document_type]}", sniffing=>false, user=>"elastic", password=>, id=>"%myid%", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"%myid%", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-01-29T11:37:24,960][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@%myURL%/]}}
[2018-01-29T11:37:25,044][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://elastic:xxxxxx@%myURL%:9243/, :path=>"/"}
[2018-01-29T11:37:27,172][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@%myURL%:9243/"}
[2018-01-29T11:37:27,797][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-01-29T11:37:27,810][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-01-29T11:37:27,947][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://@%myURL%:9243"]}
[2018-01-29T11:37:28,037][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>2, :thread=>"#<Thread:0x12397dea run>"}
[2018-01-29T11:37:28,528][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@%myURL%:9243/]}}
[2018-01-29T11:37:28,539][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://elastic:xxxxxx@%myURL%:9243/, :path=>"/"}
[2018-01-29T11:37:29,022][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@%myURL%:9243/"}


(Denis) #13

[2018-01-29T11:37:29,760][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>nil}
[2018-01-29T11:37:29,768][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-01-29T11:37:30,559][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2018-01-29T11:37:53,004][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@%myURL%:9243/]}}
[2018-01-29T11:37:53,022][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://elastic:xxxxxx@%myURL%:9243/, :path=>"/"}
[2018-01-29T11:37:53,594][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@%myURL%:9243/"}
[2018-01-29T11:37:53,909][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-01-29T11:37:53,911][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-01-29T11:37:53,917][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-01-29T11:37:53,970][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-01-29T11:37:54,186][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2018-01-29T11:37:54,712][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://%myURL%:9243"]}
[2018-01-29T11:37:57,666][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x3e273668 run>"}
[2018-01-29T11:37:59,389][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2018-01-29T11:37:59,662][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>"main"}
[2018-01-29T11:37:59,910][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-01-29T11:38:00,505][INFO ][logstash.agent ] Pipelines running {:count=>2, :pipelines=>[".monitoring-logstash", "main"]}
[2018-01-29T11:38:00,803][INFO ][logstash.inputs.metrics ] Monitoring License OK
[2018-01-29T11:38:00,804][INFO ][logstash.inputs.metrics ] Monitoring License OK


(Mark Walkom) #14

Everything looks fine.

What does _cat/indices?v show? Is there anything in the Elasticsearch logs?


(Denis) #15

green open .monitoring-logstash-6-2018.02.06 sxTKr0EvRIm8nX29aviGqQ 1 1 46592 0 17.6mb 11.8mb


(Mark Walkom) #16

Then there's definitely data in there. Are you still seeing nothing?


(Denis) #18


(system) #19

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.