Logstash InfluxDB output configuration problem

InfluxDB server IP = 192.168.30.2
Logstash server IP = 192.168.30.3
Logstash version: 6.5.1
InfluxDB shell version: 1.6.4

cat /etc/logstash/conf.d/02-beats-input.conf

input {
  beats {
    port => 5044
    ssl => true
    type => "json"
    ssl_certificate => "/etc/pki/tls/certs/onlyoffice.crt"
    ssl_key => "/etc/pki/tls/private/onlyoffice.key"
  }
}

cat /etc/logstash/conf.d/30-influxdb-output.conf
output {
  influxdb {
    host => ["192.168.30.2:8086"]
    user => "admin"
    password => "admin"
    db => "dac_demo"
    measurement => "logstash"
    allow_time_override => "true"
    time_precision => "s"
    use_event_fields_for_data_points => "true"
    exclude_fields => ["@version","@timestamp","type","host","command","sar_time","series_name"]
  }
  stdout {
    codec => rubydebug
  }
}
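
The dac_demo database has to exist on the InfluxDB side before writes can succeed. For example, the stock influx 1.x CLI can create it like this (host and credentials as in the config above):

influx -host 192.168.30.2 -username admin -password admin -execute 'CREATE DATABASE dac_demo'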

I also have some grok filter config files in /etc/logstash/conf.d/.

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/ --config.test_and_exit
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2018-12-05 17:21:33.747 [LogStash::Runner] runner - Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/
[INFO ] 2018-12-05 17:23:14.881 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.5.1"}
[[main]-pipeline-manager] pipeline - Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash:: .............................
[Converge PipelineAction::Create] agent - Failed to execute action {:id=>:main, :action_type=>LogStash
[Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

But it is not listening on port 5044.
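For example, a quick check for the Beats listener (a standard ss invocation, added here for illustration: TCP listeners, numeric, with owning process) comes back empty:

sudo ss -tlnp | grep 5044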
I want all event fields written into InfluxDB.

What am I doing wrong?
Please help me.

Would you be able to post a little more of the error message? It looks like it's cut off.

Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::

[2018-12-07T21:59:38,031][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-12-07T22:00:09,710][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.1"}
[2018-12-07T22:00:55,758][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-07T22:00:56,631][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-12-07T22:00:56,878][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-12-07T22:00:57,073][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-12-07T22:00:57,246][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x50a2d218 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="8241400b966d259ede6710e282b5da9372d969260f2c62608fc0fb914c5f22d1", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x1cfb045c>, @filter=#<LogStash::Filters::Grok match=>{"message"=>"^%{POSTFIX_ANVIL}$"}, add_tag=>["_grok_postfix_success"], id=>"8241400b966d259ede6710e282b5da9372d969260f2c62608fc0fb914c5f22d1", patterns_dir=>["/etc/logstash/patterns.d"], tag_on_failure=>["_grok_postfix_anvil_nomatch"], enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{POSTFIX_ANVIL} not defined", :thread=>"#<Thread:0x75ca5732 run>"}
[2018-12-07T22:00:58,437][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{POSTFIX_ANVIL} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:270:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:595:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x75ca5732 run>"}
[2018-12-07T22:00:58,451][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

There is your error!
That grok pattern does not exist 🙂
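
A pattern referenced as %{POSTFIX_ANVIL} has to be defined in one of the files under your patterns_dir (/etc/logstash/patterns.d in your grok config). As a rough sketch only (the regex below is illustrative, not the actual postfix pattern set your filter was written against), a pattern file looks like this:

# /etc/logstash/patterns.d/postfix  (illustrative example)
# Format: NAME pattern, one definition per line
POSTFIX_ANVIL statistics: %{GREEDYDATA:postfix_anvil_message}

The community postfix grok pattern collections ship complete files of these definitions; dropping one into /etc/logstash/patterns.d would also clear the "pattern not defined" error.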

Problem solved.

Is your InfluxDB running? It looks like Logstash cannot connect to it.
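A quick way to check, assuming InfluxDB 1.x on its default port, is the /ping endpoint; a healthy instance answers with HTTP 204:

curl -i http://192.168.30.2:8086/ping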
