I've been trying to get a home monitoring system up and running and I've fallen flat.
My goal was to get MQTT and other messages into Filebeat, through Logstash, and into Kibana to build a dashboard.
I've followed a few guides on how to set up an Ubuntu host. Previously I was able to get Filebeat logs into Kibana, but they seem to be bypassing Logstash and I'm not sure why. The reason I want to use Logstash is so that I can parse the MQTT messages.
I'm not sure what information is needed to help me here, so I'm posting everything I can.
user@ELK:~$ sudo filebeat -e -c filebeat.yml -d "publish"
[sudo] password for user:
{"log.level":"info","@timestamp":"2024-02-03T12:11:13.094+0700","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/cmd/instance.(*Beat).configure","file.name":"instance/beat.go","file.line":811},"message":"Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-02-03T12:11:13.094+0700","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/cmd/instance.(*Beat).configure","file.name":"instance/beat.go","file.line":819},"message":"Beat ID: a574dab3-2a5b-4b87-a747-3b1075bc661d","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-02-03T12:11:16.097+0700","log.logger":"add_cloud_metadata","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).init.func1","file.name":"add_cloud_metadata/add_cloud_metadata.go","file.line":100},"message":"add_cloud_metadata: hosting provider type not detected.","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-02-03T12:11:17.703+0700","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/cmd/instance.(*Beat).launch","file.name":"instance/beat.go","file.line":430},"message":"filebeat stopped.","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-02-03T12:11:17.703+0700","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/cmd/instance.handleError","file.name":"instance/beat.go","file.line":1312},"message":"Exiting: /var/lib/filebeat/filebeat.lock: data path already locked by another beat. Please make sure that multiple beats are not sharing the same data path (path.data)","service.name":"filebeat","ecs.version":"1.6.0"}
Exiting: /var/lib/filebeat/filebeat.lock: data path already locked by another beat. Please make sure that multiple beats are not sharing the same data path (path.data)
user@ELK:~$
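As far as I can tell, this lock error just means the Filebeat systemd service was already running when I started a second copy in the foreground, so both instances were fighting over /var/lib/filebeat. If that's right, stopping the service first (or giving the debug run its own data path) should clear it, something like:

sudo systemctl stop filebeat
sudo filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"
# or, leaving the service running, point the debug run at a throwaway data path:
sudo filebeat -e -c /etc/filebeat/filebeat.yml -d "publish" --path.data /tmp/filebeat-debug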
From the Logstash log file: "Invalid version of beats protocol: 69". (I had commented out parts of filebeat.yml for troubleshooting.)
[2024-02-03T11:20:07,778][INFO ][org.logstash.beats.BeatsHandler][main][0584ea2ca64206b366d49f9cec829e66bb9e36e24135690c55dc57f3ad28d327] [local: 127.0.0.1:5044, remote: 12>
[2024-02-03T11:20:07,778][WARN ][io.netty.channel.DefaultChannelPipeline][main][0584ea2ca64206b366d49f9cec829e66bb9e36e24135690c55dc57f3ad28d327] An exceptionCaught() event>
io.netty.handler.codec.DecoderException: org.logstash.beats.InvalidFrameProtocolException: Invalid version of beats protocol: 69
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:499) ~[netty-codec-4.1.100.Final.jar:4.1.100.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:426) ~[netty-codec-4.1.100.Final.jar:4.1.100.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:393) ~[netty-codec-4.1.100.Final.jar:4.1.100.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:376) ~[netty-codec-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:305) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.AbstractChannelHandlerContext.access$300(AbstractChannelHandlerContext.java:61) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
at io.netty.channel.AbstractChannelHandlerContext$4.run(AbstractChannelHandlerContext.java:286) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.DefaultEventExecutor.run(DefaultEventExecutor.java:66) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.100.Final.jar:4.1.100.Final]
at java.lang.Thread.run(Thread.java:840) [?:?]
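From what I've pieced together, this exception means whatever connected to port 5044 wasn't speaking the Beats (Lumberjack) protocol at all: Logstash treats the first byte of the connection as the protocol version, so "69" is just ASCII 'E' from some plain-text client. It seems easy to reproduce; hitting the port with curl (my own experiment, not from the guides) logs the same kind of error with version 71, the 'G' in GET:

curl http://192.168.21.102:5044
# Logstash then logs: Invalid version of beats protocol: 71

So I'm guessing either something other than Filebeat was poking the port, or there's a TLS/plain-text mismatch between my Filebeat output and the beats input.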
Here is my filebeat.yml:
filebeat.inputs:
#- type: filestream
#  id: logify-id
#  tags: ["logify"]
#  enabled: true
#  paths:
#    - /var/log/logify/app.log
#- type: filestream
#  enabled: true
#  id: auth-id
#  tags: ["system"]
#  paths:
#    - /var/log/auth.log
#- type: filestream
#  enabled: true
#  id: speedtest-id
#  tags: ["speedtest"]
#  paths:
#    - /var/log/speedtests.log
- type: mqtt
  enabled: true
  id: mqtt-sensor-id
  tags: ["mqtt"]
  hosts:
    - tcp://127.0.0.1:1883
  username: sensor
  password: sensorMQTT
  topics:
    - '#'
    - /GV/Outdoor/Sonoff-OutdoorLights/stat/RESULT

setup.kibana:
  host: "localhost:5601"

output.logstash:
  # The Logstash hosts
  hosts: ["192.168.21.102:5044"]
Here is my Beats input config, /etc/logstash/conf.d/02-beats-input.conf:
input {
  beats {
    port => 5044
  }
}
Here is 30-elasticsearch-output.conf:
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}
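Since parsing the MQTT messages is the whole reason for routing through Logstash, this is the kind of filter I plan to add once events actually flow. It's only a sketch: I'm assuming the Filebeat mqtt input puts the raw payload in the message field, that the RESULT payloads from the Sonoff are JSON, and the filename 10-mqtt-filter.conf is just my own choice:

filter {
  if "mqtt" in [tags] {
    # parse the JSON payload into its own object instead of one opaque string
    json {
      source => "message"
      target => "mqtt_payload"
      tag_on_failure => ["_mqtt_json_parse_failure"]
    }
  }
}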
Here is what happens when I run the Logstash config test:
user@ELK:/var/log/logstash$ sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t
Using bundled JDK: /usr/share/logstash/jdk
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2024-02-03T12:22:05,222][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-02-03T12:22:05,243][WARN ][logstash.runner ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2024-02-03T12:22:05,245][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.12.0", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.9+9 on 17.0.9+9 +indy +jit [x86_64-linux]"}
[2024-02-03T12:22:05,248][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2024-02-03T12:22:05,252][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-02-03T12:22:05,252][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-02-03T12:22:06,730][INFO ][org.reflections.Reflections] Reflections took 118 ms to scan 1 urls, producing 132 keys and 468 values
[2024-02-03T12:22:07,261][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
Configuration OK
[2024-02-03T12:22:07,262][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
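The config test passes, so I suppose the next thing to verify is that Logstash is actually listening on 5044 on an interface Filebeat can reach (the BeatsHandler line above shows local: 127.0.0.1:5044, while Filebeat targets 192.168.21.102). Something like:

sudo ss -tlnp | grep 5044
# expect a java (logstash) process bound to 0.0.0.0:5044 or [::]:5044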
Lastly, I did notice this message (it's from Filebeat) in the logs on the Kibana dashboard:
{"log.level":"error","@timestamp":"2024-02-03T12:53:39.237+0700","log.logger":"publisher_pipeline_output","log.origin":{"function":"github.com/elastic/beats/v7/libbeat/publisher/pipeline.
(*netClientWorker).publishBatch","file.name":"pipeline/client_worker.go","file.line":174},"message":"failed to publish events: write tcp 192.168.21.102:53734->192.168.21.102:5044: write: connection reset by peer","service.name":"filebeat","ecs.version":"1.6.0"}
I'm a bit overwhelmed by all of this as a first-time user; in over my head, as they say. Any pointers on why events aren't making it through Logstash would be much appreciated.