Failed to create monitoring event {:message=>"For path: http_address. Map keys: [:jvm, :os, :stats]", :error=>"LogStash::Instrument::MetricStore::MetricNotFound"}

I am facing the below error while starting Logstash. Please note I am using docker-compose.

Below is the Logstash configuration.

logstash.yml

```yaml
## Default Logstash configuration from Logstash base image.
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

## X-Pack security credentials
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: changeme
```

elasticsearch.yml

```yaml
## Default Elasticsearch configuration from Elasticsearch base image.
cluster.name: "docker-cluster"
network.host: 0.0.0.0

## X-Pack settings
## see https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-xpack.html
xpack.license.self_generated.type: trial
xpack.security.enabled: false
xpack.monitoring.collection.enabled: true
```
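For context, the docker-compose wiring assumed by the configs above would look roughly like the sketch below. The image version is taken from the Logstash logs later in this thread (7.9.3); the service names match the `http://elasticsearch:9200` host used above, but the mount paths are illustrative assumptions, not copied from my actual file:

```yaml
version: "3.7"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.3
    volumes:
      # mount the elasticsearch.yml shown above
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"

  logstash:
    image: docker.elastic.co/logstash/logstash:7.9.3
    volumes:
      # mount the logstash.yml shown above, plus the pipeline directory
      - ./logstash.yml:/usr/share/logstash/config/logstash.yml:ro
      - ./pipeline:/usr/share/logstash/pipeline:ro
    depends_on:
      - elasticsearch
```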

logstash.conf

```conf
input {
  file {
    path => "/logstash/pipeline/input.csv"
    type => "ppp"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [type] == "ppp" {
    csv {

output {
  if [type] == "ppp" {
    elasticsearch {
      hosts => "http://elasticsearch:9200"
      index => "testindex"
      document_type => "sch"
    }
    stdout {}
```

Welcome to our community! :smiley:

What issues are you having here?

Currently facing issue with Logstash

Please format your code/logs/config using the </> button, or markdown-style backticks. It helps make things easy to read, which helps us help you :slight_smile:

Please also don't post pictures of text; they are difficult to read, impossible to search and replicate (if it's code), and some people may not even be able to see them.

I am facing the below error while starting Logstash. Please note I am using docker-compose.

**logstash.yml**

```yaml
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

## X-Pack security credentials
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: changeme
```

**elasticsearch.yml**

```yaml
## Default Elasticsearch configuration from Elasticsearch base image.
cluster.name: "docker-cluster"
network.host: 0.0.0.0

## X-Pack settings
## see https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-xpack.html
xpack.license.self_generated.type: trial
xpack.security.enabled: false
xpack.monitoring.collection.enabled: true
```

**logstash.conf**

```conf
input {
  file {
    path => "./logstash/input.csv"
    type => "ppp"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    ignore_older => "0"
  }
}

filter {
  if [type] == "ppp" {
    csv {
      columns => [ "Year", "Month",
    }

    mutate {
      replace => { "reportMonth" => "%{Month}-%{Year}" }
    }
    date {
      match => [ "reportMonth", "MMM-YYYY", "ISO8601" ]
      target => "reportMonth"
    }
  }
}

output {
  if [type] == "ppp" {
    elasticsearch {
      hosts => ["localhost:9200"]
      user => "elastic"
      password => "changeme"
      index => "testindex"
      document_type => "sch"
    }
  }
  stdout {codec => rubydebug}
}
```

Logstash logs

```
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/tmp/jruby-1/jruby4938547143818179365jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2020-11-19T11:37:25,740][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.3", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10-LTS on 11.0.8+10-LTS +indy +jit [linux-x86_64]"}
[2020-11-19T11:37:25,793][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2020-11-19T11:37:25,806][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2020-11-19T11:37:26,336][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"1007cb96-a993-464c-9710-c6d9136b09f4", :path=>"/usr/share/logstash/data/uuid"}
[2020-11-19T11:37:26,946][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
Please configure Metricbeat to monitor Logstash. Documentation can be found at:
https://www.elastic.co/guide/en/logstash/current/monitoring-with-metricbeat.html
[2020-11-19T11:37:27,772][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
[2020-11-19T11:37:28,076][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
[2020-11-19T11:37:28,155][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>7}
[2020-11-19T11:37:28,158][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-11-19T11:37:28,323][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2020-11-19T11:37:28,324][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2020-11-19T11:37:29,582][INFO ][org.reflections.Reflections] Reflections took 44 ms to scan 1 urls, producing 22 keys and 45 values
[2020-11-19T11:37:29,788][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
[2020-11-19T11:37:29,811][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
[2020-11-19T11:37:29,822][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] ES Output version determined {:es_version=>7}
[2020-11-19T11:37:29,823][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-11-19T11:37:29,893][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elasticsearch:9200"]}
[2020-11-19T11:37:29,917][WARN ][logstash.javapipeline    ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2020-11-19T11:37:30,005][INFO ][logstash.javapipeline    ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x49e32b4f@/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:54 run>"}
[2020-11-19T11:37:30,904][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.9}
[2020-11-19T11:37:30,937][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2020-11-19T11:37:34,090][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"input\", \"filter\", \"output\" at line 2, column 1 (byte 2) after ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:183:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:357:in `block in converge_state'"]}
[2020-11-19T11:37:35,039][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-11-19T11:37:36,007][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[2020-11-19T11:37:36,991][INFO ][logstash.runner          ] Logstash shut down.
docker-elg_logstash_1 exited with code 0
```

I do not think you are running with the logstash.conf that you think you are.

Note that in `path => "./logstash/input.csv"` the path option must be absolute -- you cannot use a relative path.

Note also that in Filebeat, setting `ignore_older` to zero disables age-based filtering of files. Setting it to zero in Logstash causes the input to ignore any file whose timestamp is more than zero seconds old, which means it ignores everything.
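Putting both points together, a file input along these lines should work. The container path below is an assumption about where the CSV is mounted inside the Logstash container -- adjust it to match your docker-compose volume:

```conf
input {
  file {
    # must be an absolute path inside the container, not relative
    path => "/usr/share/logstash/pipeline/input.csv"
    type => "ppp"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # no ignore_older here: in the logstash file input, 0 means
    # "skip files older than zero seconds", i.e. skip everything
  }
}
```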


As suggested, I have changed the path to an absolute path, and now Logstash is running but with the below error. I am using docker-compose. Could you please help with this?

```
logstash_1       | [2020-11-21T09:50:00,936][ERROR][logstash.inputs.metrics  ] Failed to create monitoring event {:message=>"For path: http_address. Map keys: [:jvm, :stats, :os]", :error=>"LogStash::Instrument::MetricStore::MetricNotFound"}
logstash_1       | [2020-11-21T09:50:01,256][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
logstash_1       | [2020-11-21T09:50:01,865][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x76dc8da8 run>"}
logstash_1       | [2020-11-21T09:50:12,221][ERROR][logstash.inputs.metrics  ] Failed to create monitoring event {:message=>"For path: http_address. Map keys: [:jvm, :stats, :os]", :error=>"LogStash::Instrument::MetricStore::MetricNotFound"}
logstash_1       | [2020-11-21T09:50:21,873][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>19.98}
logstash_1       | [2020-11-21T09:50:22,263][ERROR][logstash.inputs.metrics  ] Failed to create monitoring event {:message=>"For path: http_address. Map keys: [:jvm, :stats, :os]", :error=>"LogStash::Instrument::MetricStore::MetricNotFound"}
logstash_1       | [2020-11-21T09:50:22,722][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
logstash_1       | [2020-11-21T09:50:22,927][INFO ][logstash.agent           ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}
logstash_1       | [2020-11-21T09:50:23,105][INFO ][filewatch.observingtail  ][main][0431188fdd933f9945cdd10eb34d275fd2e986d62dd6edbfb145dfea28cee6cb] START, creating Discoverer, Watch with file and sincedb collections
logstash_1       | [2020-11-21T09:50:27,215][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
```

Now Logstash is working, but it is not picking up the custom logstash.conf file.
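If the container is still running the stock pipeline, the usual cause is that the custom config is not mounted over the image's pipeline directory, `/usr/share/logstash/pipeline/`. A minimal sketch of the volume mount -- the host-side path here is an assumption about your project layout:

```yaml
services:
  logstash:
    volumes:
      # replaces the image's default pipeline with your logstash.conf
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
```

Your own startup log above shows which file was actually loaded, in `"pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"]`, so you can confirm the mount worked by checking that line.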