Connect Filebeat to Logstash

Hi,

I am trying to set up Filebeat to ship logs to Logstash and get the errors below at the Filebeat and Logstash ends:

Filebeat version: 7.7.0
Logstash version: 7.8.0

  1. Modified /etc/filebeat/filebeat.yml:
    set enabled: true under the log input and configured paths:
    commented out output.elasticsearch
    uncommented output.logstash and added hosts: ["hostname:5044"]
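For reference, after these changes the relevant parts of filebeat.yml look roughly like this (the input path below is a placeholder, not my actual path):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log   # placeholder path

# output.elasticsearch is fully commented out:
#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  hosts: ["hostname:5044"]
```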

  2. Modified /etc/logstash/conf.d/beats_elasticsearch.conf:

    input {
      beats {
        port => 5044
      }
    }

    #filter {
    #}

    output {
      elasticsearch {
        hosts => ["hostname:9200"]
      }
    }
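(Before starting Logstash, the pipeline file can be sanity-checked; --config.test_and_exit parses the configuration, reports any errors, and exits without running the pipeline:)

```shell
# Parse /etc/logstash/conf.d/beats_elasticsearch.conf and exit without running it
sudo /usr/share/logstash/bin/logstash \
  -f /etc/logstash/conf.d/beats_elasticsearch.conf \
  --config.test_and_exit
```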

I started Filebeat and got the error below:

2020-07-06T08:51:23.912-0700 ERROR [publisher_pipeline_output] pipeline/output.go:106 Failed to connect to backoff(elasticsearch(http://hostname:5044)): Get http://hostname:5044: dial tcp ip_address:5044: connect: connection refused

Started Logstash; its log is below:

[INFO ] 2020-07-06 09:00:20.562 [[main]<beats] Server - Starting server on port: 5044
[INFO ] 2020-07-06 09:00:20.835 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2020-07-06 09:00:45.266 [defaultEventExecutorGroup-4-1] BeatsHandler - [local: x.x.x.x:5044, remote: x.x.x.x:53628] Handling exception: org.logstash.beats.InvalidFrameProtocolException: Invalid version of beats protocol: 71
[WARN ] 2020-07-06 09:00:45.267 [nioEventLoopGroup-2-2] DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
io.netty.handler.codec.DecoderException: org.logstash.beats.InvalidFrameProtocolException: Invalid version of beats protocol: 71

Please explain what else I should do.

Started filebeat and logstash as:
sudo /usr/share/filebeat/bin/filebeat -e -c /etc/filebeat/filebeat.yml
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/beats_elasticsearch.conf
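(I notice the Filebeat log shows Home path and Config path defaulting to /usr/share/filebeat/bin when the binary is run directly; if that matters, the path settings can be pinned explicitly. A sketch, assuming the standard package layout:)

```shell
# Run Filebeat with explicit path settings so the packaged config, module,
# and data directories are used instead of defaulting to /usr/share/filebeat/bin
sudo /usr/share/filebeat/bin/filebeat -e \
  -c /etc/filebeat/filebeat.yml \
  --path.home /usr/share/filebeat \
  --path.config /etc/filebeat \
  --path.data /var/lib/filebeat \
  --path.logs /var/log/filebeat
```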

Thanks

Failed to connect to elasticsearch? Are you sure you commented out the elasticsearch output and uncommented the logstash output in the filebeat configuration?
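A hint that this is what's happening: the first byte of a beats (Lumberjack) frame is the protocol version, and Logstash read 71 instead. 71 is the ASCII code of 'G', the first byte of an HTTP GET request, which is exactly what Filebeat's elasticsearch output would send to that port. You can check the arithmetic yourself:

```shell
# Print the decimal value of the first byte of an HTTP request line ("GET ...");
# od -An -tu1 shows it as 71, matching "Invalid version of beats protocol: 71".
printf 'GET / HTTP/1.1\r\n' | head -c 1 | od -An -tu1
```

So the "invalid protocol version" is really an HTTP request arriving on the beats port, i.e. Filebeat is still talking to port 5044 with its elasticsearch output.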

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

I have commented out these lines in /etc/filebeat/filebeat.yml. Is there any other file that I need to modify? I don't know why the log still shows elasticsearch.

Filebeat log:

2020-07-06T08:51:17.846-0700 INFO instance/beat.go:621 Home path: [/usr/share/filebeat/bin] Config path: [/usr/share/filebeat/bin] Data path: [/usr/share/filebeat/bin/data] Logs path: [/usr/share/filebeat/bin/logs]
2020-07-06T08:51:18.004-0700 INFO instance/beat.go:629 Beat ID: b66a32ee-cd70-4b8e-a9bc-aa01963856c5
2020-07-06T08:51:18.009-0700 INFO [seccomp] seccomp/seccomp.go:101 Syscall filter could not be installed because the kernel does not support seccomp
2020-07-06T08:51:18.009-0700 INFO [beat] instance/beat.go:957 Beat info {"system_info": {"beat": {"path": {"config": "/usr/share/filebeat/bin", "data": "/usr/share/filebeat/bin/data", "home": "/usr/share/filebeat/bin", "logs": "/usr/share/filebeat/bin/logs"}, "type": "filebeat", "uuid": "b66a32ee-cd70-4b8e-a9bc-aa01963856c5"}}}
2020-07-06T08:51:18.009-0700 INFO [beat] instance/beat.go:966 Build info {"system_info": {"build": {"commit": "5e69e25b920e3d93bec76a09a31da3ab35a55607", "libbeat": "7.7.0", "time": "2020-05-12T00:53:16.000Z", "version": "7.7.0"}}}
2020-07-06T08:51:18.009-0700 INFO [beat] instance/beat.go:969 Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":4,"version":"go1.13.9"}}}
2020-07-06T08:51:18.010-0700 INFO [beat] instance/beat.go:973 Host info {"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-06-09T04:40:03-07:00","containerized":false,"name":"","ip":["127.0.0.1/8","10.102.153.239/20"],"kernel_version":"2.6.32-754.28.1.el6.x86_64","mac":["00:50:56:a0:1f:61"],"os":{"family":"","platform":"ol","name":"Oracle Linux Server","version":"6.10","major":6,"minor":10,"patch":0},"timezone":"PDT","timezone_offset_sec":-25200,"id":"6cd1ecbb634d20276a322ec00000000f"}}}
2020-07-06T08:51:18.010-0700 INFO [beat] instance/beat.go:1002 Process info {"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend","audit_read","38","39","40","41","42","43","44","45","46","47","48","49","50","51","52","53","54","55","56","57","58","59","60","61","62","63"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend","audit_read","38","39","40","41","42","43","44","45","46","47","48","49","50","51","52","53","54","55","56","57","58","59","60","61","62","63"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend","audit_read","38","39","40","41","42","43","44","45","46","47","48","49","50","51","52","53","54","55","56","57","58","59","60","61","62","63"],"ambient":null}, "cwd": "/var/log/filebeat", "exe": "/usr/share/filebeat/bin/filebeat", "name": "filebeat", "pid": 8941, "ppid": 8940, "seccomp": {"mode":""}, "start_time": "2020-07-06T08:51:17.070-0700"}}}
2020-07-06T08:51:18.011-0700 INFO instance/beat.go:297 Setup Beat: filebeat; Version: 7.7.0
2020-07-06T08:51:18.011-0700 INFO [index-management] idxmgmt/std.go:182 Set output.elasticsearch.index to 'filebeat-7.7.0' as ILM is enabled.
2020-07-06T08:51:18.011-0700 INFO eslegclient/connection.go:84 elasticsearch url: :5044
2020-07-06T08:51:18.011-0700 INFO [publisher] pipeline/module.go:110 Beat name:
2020-07-06T08:51:18.011-0700 ERROR fileset/modules.go:125 Not loading modules. Module directory not found: /usr/share/filebeat/bin/module
2020-07-06T08:51:18.011-0700 INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
2020-07-06T08:51:18.011-0700 INFO instance/beat.go:438 filebeat start running.
2020-07-06T08:51:18.012-0700 INFO registrar/migrate.go:104 No registry home found. Create: /usr/share/filebeat/bin/data/registry/filebeat
2020-07-06T08:51:18.012-0700 INFO registrar/migrate.go:112 Initialize registry meta file
2020-07-06T08:51:18.022-0700 INFO registrar/registrar.go:108 No registry file found under: /usr/share/filebeat/bin/data/registry/filebeat/data.json. Creating a new registry file.
2020-07-06T08:51:18.039-0700 INFO registrar/registrar.go:145 Loading registrar data from /usr/share/filebeat/bin/data/registry/filebeat/data.json
2020-07-06T08:51:18.039-0700 INFO registrar/registrar.go:152 States Loaded from registrar: 0
2020-07-06T08:51:18.039-0700 INFO beater/crawler.go:73 Loading Inputs: 1
2020-07-06T08:51:18.040-0700 INFO log/input.go:152 Configured paths: []
2020-07-06T08:51:18.040-0700 INFO input/input.go:114 Starting input of type: log; ID: 13172526264431030025
2020-07-06T08:51:18.040-0700 INFO beater/crawler.go:105 Loading and starting Inputs completed. Enabled inputs: 1
2020-07-06T08:51:18.040-0700 INFO cfgfile/reload.go:175 Config reloader started
2020-07-06T08:51:18.040-0700 INFO cfgfile/reload.go:235 Loading of config files completed.
2020-07-06T08:51:18.044-0700 INFO log/harvester.go:297 Harvester started for file:
2020-07-06T08:51:21.008-0700 INFO [add_cloud_metadata] add_cloud_metadata/add_cloud_metadata.go:89 add_cloud_metadata: hosting provider type not detected.
2020-07-06T08:51:22.009-0700 INFO [publisher_pipeline_output] pipeline/output.go:101 Connecting to backoff(elasticsearch(hostname:5044))
2020-07-06T08:51:23.912-0700 ERROR [publisher_pipeline_output] pipeline/output.go:106 Failed to connect to backoff(elasticsearch(hostname:5044)): Get x.x.x.x:5044: dial tcp x.x.x.x:5044: connect: connection refused
2020-07-06T08:51:45.927-0700 INFO [publisher] pipeline/retry.go:196 retryer: send unwait-signal to consumer
2020-07-06T08:51:45.927-0700 INFO [publisher] pipeline/retry.go:198 done
2020-07-06T08:51:45.927-0700 INFO [publisher] pipeline/retry.go:173 retryer: send wait signal to consumer
2020-07-06T08:51:45.927-0700 INFO [publisher] pipeline/retry.go:175 done
2020-07-06T08:51:48.013-0700 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":40,"time":{"ms":47}},"total":{"ticks":150,"time":{"ms":165},"value":150},"user":{"ticks":110,"time":{"ms":118}}},"handles":{"limit":{"hard":32768,"soft":32768},"open":11},"info":{"ephemeral_id":"119506af-cd7d-4c96-b0cf-7818ead71d00","uptime":{"ms":30243}},"memstats":{"gc_next":15232448,"memory_alloc":10777104,"memory_total":22170408,"rss":44703744},"runtime":{"goroutines":27}},"filebeat":{"events":{"active":1103,"added":1104,"done":1},"harvester":{"files":{"d68421b0-d1e7-4204-9819-89899f28259f":{"last_event_published_time":"2020-07-06T08:51:21.033Z","last_event_timestamp":"2020-07-06T08:51:21.033Z","name":"","read_offset":121278,"size":121278,"start_time":"2020-07-06T08:51:18.040Z"}},"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0},"reloads":1,"scans":1},"output":{"type":"elasticsearch"},"pipeline":{"clients":1,"events":{"active":1070,"filtered":34,"published":1070,"retry":100,"total":1104}}},"registrar":{"states":{"current":1,"update":1},"writes":{"success":2,"total":2}},"system":{"cpu":{"cores":4},"load":{"1":1.09,"15":1.01,"5":1.06,"norm":{"1":0.2725,"15":0.2525,"5":0.265}}}}}}
2020-07-06T08:52:12.514-0700 ERROR [publisher_pipeline_output] pipeline/output.go:106 Failed to connect to backoff(elasticsearch(http://:5044)): Get http://:5044: dial tcp 10.102.130.7:5044: connect: connection refused
2020-07-06T08:52:12.514-0700 INFO [publisher_pipeline_output] pipeline/output.go:99 Attempting to reconnect to backoff(elasticsearch(http://:5044)) with 5 reconnect attempt(s)

Logstash log:

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2020-07-06 09:00:16.453 [main] writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[INFO ] 2020-07-06 09:00:16.471 [main] writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[WARN ] 2020-07-06 09:00:16.754 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2020-07-06 09:00:16.758 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.8.0", "jruby.version"=>"jruby 9.2.11.1 (2.5.7) 2020-03-25 b1f55b1a40 Java HotSpot(TM) 64-Bit Server VM 25.212-b10 on 1.8.0_212-b10 +indy +jit [linux-x86_64]"}
[INFO ] 2020-07-06 09:00:16.777 [LogStash::Runner] agent - No persistent UUID file found. Generating new UUID {:uuid=>"bb727ea0-db3d-42f6-9533-e542f4789276", :path=>"/usr/share/logstash/data/uuid"}
[INFO ] 2020-07-06 09:00:18.282 [Converge PipelineAction::Create] Reflections - Reflections took 39 ms to scan 1 urls, producing 21 keys and 41 values
[INFO ] 2020-07-06 09:00:18.972 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>, :added=>[http://:9200/, http://:9200/, http:// .net:9200/]}}
[WARN ] 2020-07-06 09:00:19.125 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http:// :9200/"}
[INFO ] 2020-07-06 09:00:19.288 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2020-07-06 09:00:19.290 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[WARN ] 2020-07-06 09:00:19.339 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http:// :9200/"}
[WARN ] 2020-07-06 09:00:19.399 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http:// :9200/"}
[INFO ] 2020-07-06 09:00:19.460 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//:9200", "//:9200", "//:9200"]}
[INFO ] 2020-07-06 09:00:19.498 [Ruby-0-Thread-6: :1] elasticsearch - Using default mapping template
[INFO ] 2020-07-06 09:00:19.552 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/beats_elasticsearch.conf"], :thread=>"#<Thread:0x18b6891f run>"}
[INFO ] 2020-07-06 09:00:19.598 [Ruby-0-Thread-6: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2020-07-06 09:00:20.358 [[main]-pipeline-manager] beats - Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[INFO ] 2020-07-06 09:00:20.386 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2020-07-06 09:00:20.466 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
[INFO ] 2020-07-06 09:00:20.562 [[main]<beats] Server - Starting server on port: 5044
[INFO ] 2020-07-06 09:00:20.835 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2020-07-06 09:00:45.266 [defaultEventExecutorGroup-4-1] BeatsHandler - [local: x.x.x.x:5044, remote: x.x.x.x:53628] Handling exception: org.logstash.beats.InvalidFrameProtocolException: Invalid version of beats protocol: 71
[WARN ] 2020-07-06 09:00:45.267 [nioEventLoopGroup-2-2] DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
io.netty.handler.codec.DecoderException: org.logstash.beats.InvalidFrameProtocolException: Invalid version of beats protocol: 71
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:472) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.AbstractChannelHandlerContext.access$600(AbstractChannelHandlerContext.java:38) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.channel.AbstractChannelHandlerContext$7.run(AbstractChannelHandlerContext.java:353) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.util.concurrent.DefaultEventExecutor.run(DefaultEventExecutor.java:66) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897) [netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-all-4.1.30.Final.jar:4.1.30.Final]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: org.logstash.beats.InvalidFrameProtocolException: Invalid version of beats protocol: 71
at org.logstash.beats.Protocol.version(Protocol.java:22) ~[logstash-input-beats-6.0.9.jar:?]
at org.logstash.beats.BeatsParser.decode(BeatsParser.java:62) ~[logstash-input-beats-6.0.9.jar:?]
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
... 8 more

Hi,

1] Please check the versions of both Logstash and Filebeat and confirm they are compatible.
2] If it's a Linux VM you are trying this on, also check the firewall rules, and check whether the port can be reached using telnet.

Hi Neha,

nc -zv hostname 5044 shows success, and Filebeat 7.7.0 is compatible with Logstash 7.8.0.
Checking the firewall now.

It worked fine after upgrading Filebeat to 7.8.0.

Great!