Failed to execute action : Logstash : Pipeline action create

Hi,
I have the following setup:
CentOS 7 machine
Logstash version: 6.2.4
Elasticsearch version: 6.2.4-1
Kibana: 6.2.4

All the services (Logstash, Kibana, and Elasticsearch) were working fine.
The following file is the input configuration for the Logstash service:

input {
file {
path => "/opt/apache-tomcat-8.5.31/logs/petclinic.log"
type => "log"
add_field => [ "app-name", "petclinic"]
codec => multiline {
pattern => "^%{TIMESTAMP_ISO8601}"
negate => true
what => previous
}
}

jmx {
    path => "/etc/logstash/conf.d/"
    polling_frequency => 15
    nb_thread => 4
    type => "jmx"
    add_field => [ "app-name", "petclinic"]
}

filter {
if [type] == "log" {
grok {
match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel}\s? [%{JAVACLASS:class}] - %{GREEDYDATA:logmessage}" }
}
}

date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss,SSS" ]
}

}

output {
elasticsearch {
hosts => "http://localhost:9200/"
index => "pet"
}
}
}

and I am running the Logstash command as follows:

logstash -f /etc/logstash/conf.d/logstash_input.conf --path.settings /etc/logstash/

but I get the following error:

[2018-06-06T16:29:43,941][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-06-06T16:29:43,961][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-06-06T16:29:44,679][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-06-06T16:29:44,862][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-06-06T16:29:45,078][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 23, column 8 (byte 495) after input {\n file {\n path => "/opt/apache-tomcat-8.5.31/logs/petclinic.log"\n type => "log"\n add_field => [ "app-name", "petclinic"]\n codec => multiline {\n pattern => "^%{TIMESTAMP_ISO8601}"\n negate => true\n what => previous\n }\n }\n\n jmx {\n path => "/etc/logstash/conf.d/"\n polling_frequency => 15\n nb_thread => 4\n type => "jmx"\n add_field => [ "app-name", "petclinic"]\n }\n\n\nfilter {\n if ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", 
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

I can't figure out what the issue is or which configuration is missing.
Please suggest.

Regards,

Your config has

input {
    file {
    }
}
    jmx {
    }
filter {

which should be

input {
    file {
    }
    jmx {
    }
}
filter {

I am very surprised it does not complain earlier in the file.
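As an aside, Logstash can syntax-check a configuration file without starting the pipeline, which would have flagged the misplaced brace up front:

```
logstash -f /etc/logstash/conf.d/logstash_input.conf --config.test_and_exit
```

It prints "Configuration OK" on success, or the same parse error you saw, without touching any inputs or outputs.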


Thanks. I made the change and it resolved the issue. However, I am now getting the following errors in the Logstash log.

They are related to jmx:

[2018-06-06T17:56:34,025][ERROR][logstash.inputs.jmx ] Failed to retrieve RMIServer stub: javax.naming.CommunicationException [Root exception is java.rmi.ConnectIOException: non-JRMP server at remote endpoint]
[2018-06-06T17:56:34,022][WARN ][logstash.inputs.jmx ] Issue loading configuration from file {:file=>"/etc/logstash/conf.d/logstash_input.conf.copy", :exception=>"Unrecognized token 'input': was expecting ('true', 'false' or 'null')\n at [Source: (byte)"input {\n file {\n path => "/opt/apache-tomcat-8.5.31/logs/petclinic.log"\n type => "log"\n add_field => [ "app-name", "petclinic"]\n codec => multiline {\n pattern => "^%{TIMESTAMP_ISO8601}"\n negate => true\n what => previous\n }\n }\n\n jmx {\n path => "/etc/logstash/conf.d/"\n polling_frequency => 15\n nb_thread => 4\n type => "jmx"\n add_field => [ "app-name", "petclinic"]\n }\n\n\nfilter {\n if [type]"[truncated 377 bytes]; line: 1, column: 7]", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/json.rb:17:in jruby_load'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:332:in block in run'", "org/jruby/RubyDir.java:383:in foreach'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:326:in run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:514:in inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:507:in block in start_input'"]}
[2018-06-06T17:56:34,025][ERROR][logstash.inputs.jmx ] javax.management.remote.rmi.RMIConnector.connect(javax/management/remote/rmi/RMIConnector.java:369)
javax.management.remote.JMXConnectorFactory.connect(javax/management/remote/JMXConnectorFactory.java:270)
java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)
org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:468)
org.jruby.javasupport.JavaMethod.invokeStaticDirect(org/jruby/javasupport/JavaMethod.java:370)
RUBY.create_connection(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jmx4r-0.1.4/lib/jmx4r.rb:220)
RUBY.connection(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jmx4r-0.1.4/lib/jmx4r.rb:137)
RUBY.thread_jmx(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:217)
RUBY.block in run(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:321)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)
java.lang.Thread.run(java/lang/Thread.java:748)
[2018-06-06T17:56:34,029][WARN ][logstash.inputs.jmx ] Issue loading configuration from file {:file=>"/etc/logstash/conf.d/logstash_input.conf", :exception=>"Unrecognized token 'input': was expecting ('true', 'false' or 'null')\n at [Source: (byte)"input {\n file {\n path => "/opt/apache-tomcat-8.5.31/logs/petclinic.log"\n type => "log"\n add_field => [ "app-name", "petclinic"]\n codec => multiline {\n pattern => "^%{TIMESTAMP_ISO8601}"\n negate => true\n what => previous\n }\n }\n\n\n jmx {\n path => "/etc/logstash/conf.d/"\n polling_frequency => 15\n nb_thread => 4\n type => "jmx"\n add_field => [ "app-name", "petclinic"]\n }\n\n}\nfilter {\n gro"[truncated 347 bytes]; line: 1, column: 7]", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/json.rb:17:in jruby_load'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:332:in block in run'", "org/jruby/RubyDir.java:383:in foreach'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:326:in run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:514:in inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:507:in block in start_input'"]}

Tomcat is running on port 8080. Do I need to configure JMX to listen on port 8080 as well?
I am not sure whether that is the issue.

Please suggest.


You have told the jmx input to read its configuration, which should be JSON, from that directory. It's reading your logstash configuration and failing to parse it. Put your logstash and jmx configurations in different directories.

Thanks. I made the following changes.

The /etc/logstash/conf.d directory now contains the file:

/etc/logstash/conf.d/logstash_input.conf

which has the following content:

input {
file {
path => "/opt/apache-tomcat-8.5.31/logs/petclinic.log"
type => "log"
add_field => [ "app-name", "petclinic"]
codec => multiline {
pattern => "^%{TIMESTAMP_ISO8601}"
negate => true
what => previous
}
}

jmx {
    path => "/etc/logstash/jmx/"
    polling_frequency => 15
    nb_thread => 4
    type => "jmx"
    add_field => [ "app-name", "petclinic"]
}

}
filter {
if [type] == "log" {
grok {
match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel}\s? [%{JAVACLASS:class}] - %{GREEDYDATA:logmessage}" }
}
}
date {
match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss,SSS" ]
}
}

output {
elasticsearch {
hosts => "http://localhost:9200/"
index => "pet"
}
}

I created a jmx directory here:

/etc/logstash/jmx

which has the following two files:
-rw-r--r-- 1 root root 959 Jun 6 14:58 filea.json
-rw-r--r-- 1 root root 205 Jun 6 14:59 fileb.json

Contents of filea.json:
{
"host": "localhost",
"port": 8080,
"alias": "jvm",
"queries": [
{
"object_name": "java.lang:type=OperatingSystem",
"object_alias": "${type}"
}, {
"object_name": "java.lang:type=Memory",
"object_alias": "${type}"
}, {
"object_name": "java.lang:type=Runtime",
"attributes": [ "Uptime", "StartTime" ],
"object_alias": "${type}"
}, {
"object_name": "java.lang:type=GarbageCollector,name=*",
"attributes": [ "CollectionCount", "CollectionTime" ],
"object_alias": "${type}.${name}"
}, {
"object_name": "java.lang:type=MemoryPool,name=*",
"attributes": [ "Usage", "PeakUsage" ],
"object_alias": "${type}.${name}"
}, {
"object_name": "java.nio:type=BufferPool,name=*",
"object_alias": "${type}.${name}"
}
]
}

Contents of fileb.json:

{
"host": "localhost",
"port": 8080,
"alias": "app",
"queries": [
{
"object_name": "petclinic:type=CallMonitor",
"object_alias": "${type}"
}
]
}

but now I get the following error:

java.lang.Thread.run(java/lang/Thread.java:748)
[2018-06-06T18:30:23,422][INFO ][logstash.inputs.jmx ] Loading configuration files in path {:path=>"/etc/logstash/jmx/"}
[2018-06-06T18:30:23,429][ERROR][logstash.inputs.jmx ] Failed to retrieve RMIServer stub: javax.naming.CommunicationException [Root exception is java.rmi.ConnectIOException: non-JRMP server at remote endpoint]
[2018-06-06T18:30:23,429][ERROR][logstash.inputs.jmx ] javax.management.remote.rmi.RMIConnector.connect(javax/management/remote/rmi/RMIConnector.java:369)
javax.management.remote.JMXConnectorFactory.connect(javax/management/remote/JMXConnectorFactory.java:270)
java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)
org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:468)
org.jruby.javasupport.JavaMethod.invokeStaticDirect(org/jruby/javasupport/JavaMethod.java:370)
RUBY.create_connection(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jmx4r-0.1.4/lib/jmx4r.rb:220)
RUBY.connection(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jmx4r-0.1.4/lib/jmx4r.rb:137)
RUBY.thread_jmx(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:217)
RUBY.block in run(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:321)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)
java.lang.Thread.run(java/lang/Thread.java:748)
[2018-06-06T18:30:23,429][ERROR][logstash.inputs.jmx ] Failed to retrieve RMIServer stub: javax.naming.CommunicationException [Root exception is java.rmi.ConnectIOException: non-JRMP server at remote endpoint]
[2018-06-06T18:30:23,429][ERROR][logstash.inputs.jmx ] javax.management.remote.rmi.RMIConnector.connect(javax/management/remote/rmi/RMIConnector.java:369)
javax.management.remote.JMXConnectorFactory.connect(javax/management/remote/JMXConnectorFactory.java:270)
java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)
org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:468)
org.jruby.javasupport.JavaMethod.invokeStaticDirect(org/jruby/javasupport/JavaMethod.java:370)
RUBY.create_connection(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jmx4r-0.1.4/lib/jmx4r.rb:220)
RUBY.connection(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jmx4r-0.1.4/lib/jmx4r.rb:137)
RUBY.thread_jmx(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:217)
RUBY.block in run(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jmx-3.0.6/lib/logstash/inputs/jmx.rb:321)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)
java.lang.Thread.run(java/lang/Thread.java:748)

Again, is some configuration missing? Please advise.

Is it a configuration error? Maybe. logstash has (I think) established a TCP connection to the remote server, but when it tried to talk JRMP to the remote endpoint it failed.

That could be caused by a lot of things. It would happen if the remote server did not talk JMX. And I believe it would happen if the remote server required SSL :slight_smile:

The underlying JMX::MBean in jmx4r does not appear to support SSL.
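For what it's worth, a "non-JRMP server at remote endpoint" error is what you see when the jmx input connects to a port that is not speaking the JMX/RMI protocol — Tomcat's port 8080 is its HTTP connector, not a JMX endpoint. A sketch of how JMX remoting could be enabled on a dedicated port (9010 is an arbitrary choice; any free port works), assuming a setenv.sh-based Tomcat setup:

```shell
# Hypothetical $CATALINA_HOME/bin/setenv.sh fragment: expose a JMX/RMI
# endpoint on port 9010, without SSL or authentication -- fine for local
# experiments, not for production.
export CATALINA_OPTS="$CATALINA_OPTS \
 -Dcom.sun.management.jmxremote \
 -Dcom.sun.management.jmxremote.port=9010 \
 -Dcom.sun.management.jmxremote.rmi.port=9010 \
 -Dcom.sun.management.jmxremote.ssl=false \
 -Dcom.sun.management.jmxremote.authenticate=false"
```

The jmx input's JSON files would then point "port" at 9010 instead of 8080.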

Thanks. Actually, I am learning the ELK stack. Can you suggest a sample web application that integrates with ELK?
Currently I am using the Spring PetClinic application, but it uses JMX, which is too complex a thing to learn at this point.

I just want to get an idea of how an application logs messages, and how Logstash sends them to Elasticsearch for filtering and indexing, and finally visualization in Kibana.

Please let me know if you can suggest any application on GitHub.

I would not use an app unless you are very much locked in at the devops end of ELK use cases. I would start with a dataset (any dataset) provided you are interested in it. The UK government has a website that lets you find datasets. So does New York City. Steve Ballmer's USA Facts has a mass of US government datasets. The World Bank has data from all over the world. That's the kind of data I tend to look at; it may not help you directly. Find a dataset you are interested in from any available source.

There are many, many other sources for datasets. Take a dataset you are interested in. Run it through logstash with 'output { stdout { codec => rubydebug } }'. Do not even start ingesting into elasticsearch until you think enough entries are being parsed well. Then load it with the default index name (i.e. do not specify an index; use the default of "logstash-%{+YYYY.MM.dd}"). There will be time to learn about index templates later.
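For example, a minimal exploration config along those lines (the file path is a placeholder) could look like:

```
input {
  file {
    path => "/path/to/dataset.csv"   # placeholder: your dataset
    start_position => "beginning"
    sincedb_path => "/dev/null"      # re-read from the top on each run
  }
}
output {
  stdout { codec => rubydebug }      # inspect parsed events before indexing
}
```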

Then Discover the data in Kibana and see if you like the look of it. If not, tune logstash. Then start trying to Visualize the data in Kibana. Do not be surprised if this is harder than you expect.

Visualization is sometimes "I know the story I want to tell, what visualization does that?". That tends to be easier than "What visualization tells me what the story is?".

Thanks, that's correct. I am very much locked in at the DevOps end of ELK use cases. I know that by using the different options for finding datasets which you provided, I can learn a lot and get to know how the end-to-end workflow works. However, that is not my immediate bread and butter.

Please let me know if there is any app I can use to get things working first; then I will go back to these datasets for deeper study.
Regards,

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.