Filebeat not sending logs to Logstash (port 5044)

My Filebeat is reading the log file, but it's not sending anything to Logstash.
Here is my filebeat.yml:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

output.console:
  pretty: true

And here are the logs it's giving me:

2021-02-16T14:45:08.181+0100    DEBUG   [input] input/input.go:139      Run input
2021-02-16T14:45:08.321+0100    DEBUG   [input] log/input.go:205        Start next scan
2021-02-16T14:45:08.339+0100    DEBUG   [input] log/input.go:439        Check file for harvesting: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log
2021-02-16T14:45:08.342+0100    DEBUG   [input] log/input.go:530        Update existing file for harvesting: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log, offset: 0
2021-02-16T14:45:08.363+0100    DEBUG   [input] log/input.go:582        Harvester for file is still running: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log
2021-02-16T14:45:08.374+0100    DEBUG   [input] log/input.go:226        input states cleaned up. Before: 1, After: 1, Pending: 0
2021-02-16T14:45:11.981+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log; Backoff now.
2021-02-16T14:45:18.498+0100    DEBUG   [input] input/input.go:139      Run input
2021-02-16T14:45:18.518+0100    DEBUG   [input] log/input.go:205        Start next scan
2021-02-16T14:45:18.639+0100    DEBUG   [input] log/input.go:439        Check file for harvesting: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log
2021-02-16T14:45:18.639+0100    DEBUG   [input] log/input.go:530        Update existing file for harvesting: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log, offset: 0
2021-02-16T14:45:18.639+0100    DEBUG   [input] log/input.go:582        Harvester for file is still running: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log
2021-02-16T14:45:18.639+0100    DEBUG   [input] log/input.go:226        input states cleaned up. Before: 1, After: 1, Pending: 0
2021-02-16T14:45:22.322+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log; Backoff now.
2021-02-16T14:45:26.701+0100    INFO    [monitoring]    log/log.go:145  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":125},"total":{"ticks":250,"time":{"ms":16},"value":250},"user":{"ticks":125,"time":{"ms":16}}},"handles":{"open":199},"info":{"ephemeral_id":"65318393-d82f-4e79-80a1-026b7282beb8","uptime":{"ms":60286}},"memstats":{"gc_next":17307248,"memory_alloc":9931160,"memory_total":44680864,"rss":8192},"runtime":{"goroutines":31}},"filebeat":{"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":1}}}}}

And here is my logstash.conf (a simple one, just for testing):

input {
  beats { port => 5044 }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}

Welcome to our community! :smiley:

Looks like you may want to delete the registry entry for the file, or try putting more data into it.
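If you want to try the second option (putting more data into the file), a minimal sketch; `test.log` here is a relative stand-in for the full path configured in your filebeat.yml:

```python
# Append a fresh, unique line to the watched file so Filebeat's
# harvester (which is sitting at end-of-file, offset past all old
# data) has new content to pick up and ship.
from datetime import datetime

with open("test.log", "a") as f:
    f.write(f"filebeat delivery test {datetime.now().isoformat()}\n")
```

If the harvester is healthy, the new line should show up in the output within its scan/backoff interval.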

Thank you for your answer.
Filebeat is reading the log file successfully (it's shown in the console), but it's not sending it to Logstash.
And here are the logs Logstash is giving me:

"Using bundled JDK: ""
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Terminate batch job (Y/N)? y

C:\elasticstack\logstash-7.10.2-windows-x86_64\logstash-7.10.2\bin>logstash.bat -f logstash.conf
"Using bundled JDK: ""
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/C:/Users/SChelly/AppData/Local/Temp/2/jruby-2356/jruby9069926537661472641jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/elasticstack/logstash-7.10.2-windows-x86_64/logstash-7.10.2/logs which is now configured via log4j2.properties
[2021-02-17T09:06:15,056][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.10.2", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10 on 11.0.8+10 +indy +jit [mswin32-x86_64]"}
[2021-02-17T09:06:15,611][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-02-17T09:06:18,474][INFO ][org.reflections.Reflections] Reflections took 94 ms to scan 1 urls, producing 23 keys and 47 values
[2021-02-17T09:06:20,555][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2021-02-17T09:06:20,919][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2021-02-17T09:06:21,004][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2021-02-17T09:06:21,050][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-02-17T09:06:21,365][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2021-02-17T09:06:21,513][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-02-17T09:06:21,606][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["C:/elasticstack/logstash-7.10.2-windows-x86_64/logstash-7.10.2/bin/logstash.conf"], :thread=>"#<Thread:0x37dd9005 run>"}
[2021-02-17T09:06:21,838][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2021-02-17T09:06:23,469][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.8}
[2021-02-17T09:06:23,519][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2021-02-17T09:06:23,577][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-02-17T09:06:23,915][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-02-17T09:06:24,222][INFO ][org.logstash.beats.Server][main][eb2132b2fc632f70483a4bcae12da26fefe384cd4206b5502bc8eba991315aea] Starting server on port: 5044
[2021-02-17T09:06:24,521][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

@warkolm

Do you have the firewall disabled for that port?

It's not disabled.
How can I verify that?

I'm not sure how to do this on Windows, since I mostly work with Linux. I think you should turn the firewall off completely for a second to check whether Logstash works after that. If it does, you know what to do (allow traffic on port 5044 through the firewall).
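One quick, cross-platform way to check whether anything is blocking the port (a sketch, not from the thread — the host/port values are just the ones used above) is a plain TCP connection test:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# While Logstash is running with the Beats input on 5044, False here
# points at a firewall (or bind) problem rather than a Filebeat problem.
print(port_reachable("localhost", 5044))
```

On Windows you can run the same check from any Python install; if it returns True with the firewall on, the firewall is not the issue.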

I see this message in the Logstash logs.
Is this a problem?

java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
        at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31) ~[netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:233) ~[netty-all-4.1.49.Final.jar:4.1.49.Final]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:227) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:289) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:92) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoop.newTaskQueue0(NioEventLoop.java:279) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoop.newTaskQueue(NioEventLoop.java:150) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:146) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:52) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:96) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:91) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:72) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:52) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:44) [netty-all-4.1.49.Final.jar:4.1.49.Final]
        at org.logstash.beats.Server.listen(Server.java:48) [logstash-input-beats-6.0.12.jar:?]
        at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
        at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
        at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
        at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:426) [jruby-complete-9.2.13.0.jar:?]
        at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:293) [jruby-complete-9.2.13.0.jar:?]
        at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:24) [jruby-complete-9.2.13.0.jar:?]
        at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:86) [jruby-complete-9.2.13.0.jar:?]

I've had Logstash logs similar to this. Do you get your logs from an external Elastic cluster, and can I see your Filebeat input configuration?

This is my filebeat.yml:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

Can you please answer this?

Sorry, I don't understand your question! Can you explain what you mean exactly?

Is this where you want the logs to come from, or do you have an external ELK stack?

In my case, the logs that you have in Logstash meant that the logs of the external cluster were being forwarded to port 5044. This does not work. When I tried forwarding to another port, e.g. 9201, and changed the filebeat.yml input to:

- type: tcp
  host: "localhost:9201"

everything worked fine. I hope I made it clear enough, and ALL of this does not apply if you DO NOT have an external cluster.

Note that if you try this configuration, you have to comment out the - type: log input and everything associated with it in the previous configuration.

I'm asking this question because I don't find any flaws in your configuration files. By the way, what did you change to get these Logstash logs? Earlier you didn't have them.

Yes, I want the logs to come from this path:

C:\elasticstack\filebeat-7.10.2-windows-x86_64\filebeat-7.10.2-windows-x86_64\logs\test.log

I don't have an external ELK stack.
I added --debug to the command to see these logs. I still don't know how to resolve the problem.
Thank you

I'm very sorry, I can't seem to find your issue. I really hope someone smarter than me can help you solve this problem.


Thank you for replying anyway :slight_smile:

@syrine_chelly you mentioned that the log is successfully shown in the console, but not sent to Logstash.

Did you remove your data directory after the test with the console output?

Filebeat can only have one output enabled at a time, so when it reads the file to print it to the console, it surely doesn't send it to Logstash.

When Filebeat reads a file, it stores the last read position in a registry in the data directory, so it doesn't read the same lines twice, even if it is restarted. So if Filebeat did read the file when the console output was configured, it wouldn't read the file again unless the registry is removed.

Try to stop Filebeat, remove the data directory in the Filebeat installation directory, and start Filebeat again with only the Logstash output configured.
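Applied to the filebeat.yml posted above, that means keeping a single active output section; a sketch of the relevant part:

```yaml
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

# Filebeat allows only one output at a time, so comment this out
# while shipping to Logstash:
#output.console:
#  pretty: true
```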

Thank you! It works.
