Beats (Nomad cluster) > Logstash > Elasticsearch > Graylog

I am new to Elasticsearch. I have Filebeat running as a system job on a Nomad cluster, and I need to collect logs and ship them to a centralized server that runs Logstash, Elasticsearch, and Graylog via docker-compose. Somehow my Filebeat is not able to ship the logs to Logstash, maybe because I am not pointing it at the correct ports.
This is my filebeat.hcl file:

job "filebeat" {
  datacenters = ["abc"]

  type = "system"
  update {
    min_healthy_time = "10s"
    healthy_deadline = "5m"
    progress_deadline = "10m"
    auto_revert = true
  }
  
  group "filebeat" {
    task "filebeat" {
      driver = "docker"

      config {
        image = "docker.elastic.co/beats/filebeat:7.10.0"
        args = [
          "-c", "/local/filebeat.yml",
          "--path.data", "/alloc/data/filebeat",
          "--path.logs", "/alloc/logs/filebeat",
        ]
        mount {
          type     = "bind"
          source   = "local/filebeat.yml"
          target   = "/usr/share/filebeat/filebeat.yml"
          readonly = true
        }
      } 
      template {
        data = <<template
  filebeat.inputs:
    -
      paths:
        - /alloc/logs/*.stdout.[0-9]*
      exclude_files: ['\.fifo$']  
      type: filestream
      scan_frequency: 1s
      fields_under_root: true
      fields:
        app: ${NOMAD_JOB_NAME}
    
    -
      paths:
        - /alloc/logs/*.stderr.[0-9]*
      exclude_files: ['\.fifo$']  
      type: filestream
      scan_frequency: 1s
      fields_under_root: true
      fields:
        app: ${NOMAD_JOB_NAME}
   
  output.logstash:
    hosts: ["IP:5044"]
    tls: disable 

  logging.level: debug
    
      template
      destination = "local/filebeat.yml"
      } 
    }
  }
}
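
One thing I noticed while writing this up: the input globs themselves seem to work (for example, /alloc/logs/*.stdout.[0-9]* matches /alloc/logs/filebeat.stdout.0, which the debug logs below confirm is picked up). I am less sure about the tls: disable line; from what I can tell in the Filebeat reference the Logstash output's TLS switch is spelled ssl.enabled, so a minimal output section would presumably look like this sketch (IP stands in for the Logstash host):

output.logstash:
  hosts: ["IP:5044"]    # Logstash beats input address, reachable from the Nomad clients
  ssl.enabled: false    # plain TCP; ssl.* options would be needed if Logstash expected TLS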

These are my Filebeat logs:

2022-03-20T20:05:25.067Z	INFO	instance/beat.go:645	Home path: [/usr/share/filebeat] Config path: [/usr/share/filebeat] Data path: [/alloc/data/filebeat] Logs path: [/alloc/logs/filebeat]
2022-03-20T20:05:25.068Z	DEBUG	[beat]	instance/beat.go:697	Beat metadata path: /alloc/data/filebeat/meta.json
2022-03-20T20:05:25.078Z	INFO	instance/beat.go:653	Beat ID: 0c22006b-243c-488e-bd1d-0fdf5e2aef7e
2022-03-20T20:05:25.078Z	DEBUG	[seccomp]	seccomp/seccomp.go:117	Loading syscall filter	{"seccomp_filter": {"no_new_privs":true,"flag":"tsync","policy":{"default_action":"errno","syscalls":[{"names":["accept","accept4","access","arch_prctl","bind","brk","chmod","chown","clock_gettime","clone","close","connect","dup","dup2","epoll_create","epoll_create1","epoll_ctl","epoll_pwait","epoll_wait","exit","exit_group","fchdir","fchmod","fchmodat","fchown","fchownat","fcntl","fdatasync","flock","fstat","fstatfs","fsync","ftruncate","futex","getcwd","getdents","getdents64","geteuid","getgid","getpeername","getpid","getppid","getrandom","getrlimit","getrusage","getsockname","getsockopt","gettid","gettimeofday","getuid","inotify_add_watch","inotify_init1","inotify_rm_watch","ioctl","kill","listen","lseek","lstat","madvise","mincore","mkdirat","mmap","mprotect","munmap","nanosleep","newfstatat","open","openat","pipe","pipe2","poll","ppoll","pread64","pselect6","pwrite64","read","readlink","readlinkat","recvfrom","recvmmsg","recvmsg","rename","renameat","rt_sigaction","rt_sigprocmask","rt_sigreturn","sched_getaffinity","sched_yield","sendfile","sendmmsg","sendmsg","sendto","set_robust_list","setitimer","setsockopt","shutdown","sigaltstack","socket","splice","stat","statfs","sysinfo","tgkill","time","tkill","uname","unlink","unlinkat","wait4","waitid","write","writev"],"action":"allow"}]}}}
2022-03-20T20:05:25.078Z	INFO	[seccomp]	seccomp/seccomp.go:124	Syscall filter successfully installed
2022-03-20T20:05:25.078Z	INFO	[beat]	instance/beat.go:981	Beat info	{"system_info": {"beat": {"path": {"config": "/usr/share/filebeat", "data": "/alloc/data/filebeat", "home": "/usr/share/filebeat", "logs": "/alloc/logs/filebeat"}, "type": "filebeat", "uuid": "0c22006b-243c-488e-bd1d-0fdf5e2aef7e"}}}
2022-03-20T20:05:25.078Z	INFO	[beat]	instance/beat.go:990	Build info	{"system_info": {"build": {"commit": "1428d58cf2ed945441fb2ed03961cafa9e4ad3eb", "libbeat": "7.10.0", "time": "2020-11-09T19:57:04.000Z", "version": "7.10.0"}}}
2022-03-20T20:05:25.078Z	INFO	[beat]	instance/beat.go:993	Go runtime info	{"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":2,"version":"go1.14.7"}}}
2022-03-20T20:05:25.079Z	INFO	[beat]	instance/beat.go:997	Host info	{"system_info": {"host": {"architecture":"x86_64","boot_time":"2022-03-11T14:44:18Z","containerized":true,"name":"51e75a9014f2","ip":["127.0.0.1/8","172.17.0.3/16"],"kernel_version":"4.15.0-171-generic","mac":["02:42:ac:11:00:03"],"os":{"family":"redhat","platform":"centos","name":"CentOS Linux","version":"7 (Core)","major":7,"minor":8,"patch":2003,"codename":"Core"},"timezone":"UTC","timezone_offset_sec":0}}}
2022-03-20T20:05:25.079Z	INFO	[beat]	instance/beat.go:1026	Process info	{"system_info": {"process": {"capabilities": {"inheritable":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"permitted":null,"effective":null,"bounding":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"ambient":null}, "cwd": "/usr/share/filebeat", "exe": "/usr/share/filebeat/filebeat", "name": "filebeat", "pid": 1, "ppid": 0, "seccomp": {"mode":"filter","no_new_privs":true}, "start_time": "2022-03-20T20:05:23.850Z"}}}
2022-03-20T20:05:25.079Z	INFO	instance/beat.go:299	Setup Beat: filebeat; Version: 7.10.0
2022-03-20T20:05:25.079Z	DEBUG	[beat]	instance/beat.go:325	Initializing output plugins
2022-03-20T20:05:25.080Z	DEBUG	[publisher]	pipeline/consumer.go:148	start pipeline event consumer
2022-03-20T20:05:25.080Z	INFO	[publisher]	pipeline/module.go:113	Beat name: 51e75a9014f2
2022-03-20T20:05:25.100Z	WARN	beater/filebeat.go:178	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2022-03-20T20:05:25.100Z	INFO	instance/beat.go:455	filebeat start running.
2022-03-20T20:05:25.100Z	INFO	[monitoring]	log/log.go:118	Starting metrics logging every 30s
2022-03-20T20:05:25.100Z	DEBUG	[test]	registrar/migrate.go:304	isFile(/alloc/data/filebeat/registry) -> false
2022-03-20T20:05:25.108Z	DEBUG	[test]	registrar/migrate.go:304	isFile() -> false
2022-03-20T20:05:25.108Z	DEBUG	[test]	registrar/migrate.go:297	isDir(/alloc/data/filebeat/registry/filebeat) -> false
2022-03-20T20:05:25.108Z	DEBUG	[registrar]	registrar/migrate.go:84	Registry type '' found
2022-03-20T20:05:25.109Z	DEBUG	[test]	registrar/migrate.go:304	isFile(.bak) -> false
2022-03-20T20:05:25.118Z	INFO	memlog/store.go:119	Loading data file of '/alloc/data/filebeat/registry/filebeat' succeeded. Active transaction id=0
2022-03-20T20:05:25.119Z	INFO	memlog/store.go:124	Finished loading transaction log file for '/alloc/data/filebeat/registry/filebeat'. Active transaction id=0
2022-03-20T20:05:25.119Z	WARN	beater/filebeat.go:381	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2022-03-20T20:05:25.120Z	INFO	[registrar]	registrar/registrar.go:109	States Loaded from registrar: 0
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:71	Loading Inputs: 8
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 17965352755492436510)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 11064732163641410031)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 4045421839173168800)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 6774760737201568283)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 17201009571870859417)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 35512087561084779)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 3609320465969190629)
2022-03-20T20:05:25.120Z	WARN	[input]	v2/loader.go:102	EXPERIMENTAL: The filestream input is experimental	{"input": "filestream", "stability": "Experimental", "deprecated": false}
2022-03-20T20:05:25.120Z	DEBUG	[scanner]	filestream/fswatch.go:253	recursive glob enabled
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 2615655676442875789)
2022-03-20T20:05:25.120Z	INFO	[crawler]	beater/crawler.go:108	Loading and starting Inputs completed. Enabled inputs: 8
2022-03-20T20:05:25.121Z	DEBUG	[registrar]	registrar/registrar.go:140	Starting Registrar
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:25.121Z	INFO	[input.filestream]	compat/compat.go:110	Input filestream starting
2022-03-20T20:05:25.121Z	DEBUG	[input.filestream]	filestream/prospector.go:77	Starting prospector	{"prospector": "file_prospector"}
2022-03-20T20:05:35.121Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.121Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 0 paths
2022-03-20T20:05:35.121Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.121Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 1 paths
2022-03-20T20:05:35.121Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.121Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 0 paths
2022-03-20T20:05:35.121Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.122Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 1 paths
2022-03-20T20:05:35.121Z	DEBUG	[input.filestream]	filestream/prospector.go:105	A new file /alloc/logs/filebeat.stdout.0 has been found	{"prospector": "file_prospector"}
2022-03-20T20:05:35.122Z	DEBUG	[input.filestream]	input-logfile/harvester.go:63	Starting harvester for file	{"source": "filestream::native::262992-64769"}
2022-03-20T20:05:35.122Z	DEBUG	[input.filestream]	filestream/input.go:176	newLogFileReader with config.MaxBytes:10485760	{"path": "/alloc/logs/filebeat.stdout.0", "state-id": "filestream::native::262992-64769"}
2022-03-20T20:05:35.122Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stdout.0; Backoff now.	{"path": "/alloc/logs/filebeat.stdout.0", "state-id": "filestream::native::262992-64769"}
2022-03-20T20:05:35.122Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.122Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 0 paths
2022-03-20T20:05:35.122Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.122Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 0 paths
2022-03-20T20:05:35.122Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.122Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 0 paths
2022-03-20T20:05:35.122Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:35.122Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	Found 0 paths
2022-03-20T20:05:35.122Z	DEBUG	[input.filestream]	filestream/prospector.go:105	A new file /alloc/logs/filebeat.stderr.0 has been found	{"prospector": "file_prospector"}
2022-03-20T20:05:35.123Z	DEBUG	[input.filestream]	input-logfile/harvester.go:63	Starting harvester for file	{"source": "filestream::native::262994-64769"}
2022-03-20T20:05:35.123Z	DEBUG	[input.filestream]	filestream/input.go:176	newLogFileReader with config.MaxBytes:10485760	{"path": "/alloc/logs/filebeat.stderr.0", "state-id": "filestream::native::262994-64769"}
2022-03-20T20:05:35.123Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stderr.0; Backoff now.	{"path": "/alloc/logs/filebeat.stderr.0", "state-id": "filestream::native::262994-64769"}
2022-03-20T20:05:36.122Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stdout.0; Backoff now.	{"path": "/alloc/logs/filebeat.stdout.0", "state-id": "filestream::native::262992-64769"}
2022-03-20T20:05:36.123Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stderr.0; Backoff now.	{"path": "/alloc/logs/filebeat.stderr.0", "state-id": "filestream::native::262994-64769"}
2022-03-20T20:05:38.123Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stdout.0; Backoff now.	{"path": "/alloc/logs/filebeat.stdout.0", "state-id": "filestream::native::262992-64769"}
2022-03-20T20:05:38.123Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stderr.0; Backoff now.	{"path": "/alloc/logs/filebeat.stderr.0", "state-id": "filestream::native::262994-64769"}
2022-03-20T20:05:42.123Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stdout.0; Backoff now.	{"path": "/alloc/logs/filebeat.stdout.0", "state-id": "filestream::native::262992-64769"}
2022-03-20T20:05:42.124Z	DEBUG	[input.filestream]	filestream/filestream.go:133	End of file reached: /alloc/logs/filebeat.stderr.0; Backoff now.	{"path": "/alloc/logs/filebeat.stderr.0", "state-id": "filestream::native::262994-64769"}
2022-03-20T20:05:45.121Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:45.121Z	INFO	[file_watcher]	filestream/fswatch.go:131	Start next scan
2022-03-20T20:05:45.121Z	DEBUG	[file_watcher]	filestream/fswatch.go:190	

These are my Logstash logs:

logstash_1       | [2022-03-20T23:08:40,053][INFO ][org.logstash.beats.Server][main][6410db8447a1895e2ed1c329f1fd3b2e6a1d69f7c761ccdcd979384a19040ac9] Starting server on port: 5044
logstash_1       | [2022-03-20T23:08:46,084][ERROR][logstash.javapipeline    ][main][6410db8447a1895e2ed1c329f1fd3b2e6a1d69f7c761ccdcd979384a19040ac9] A plugin had an unrecoverable error. Will restart this plugin.
logstash_1       |   Pipeline_id:main
logstash_1       |   Plugin: <LogStash::Inputs::Beats port=>5044, id=>"6410db8447a1895e2ed1c329f1fd3b2e6a1d69f7c761ccdcd979384a19040ac9", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_d9d0638b-4c98-40b3-b420-0a4bb69c4e8f", enable_metric=>true, charset=>"UTF-8">, host=>"0.0.0.0", ssl=>false, add_hostname=>false, ssl_verify_mode=>"none", ssl_peer_metadata=>false, include_codec_tag=>true, ssl_handshake_timeout=>10000, tls_min_version=>1, tls_max_version=>1.2, cipher_suites=>["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"], client_inactivity_timeout=>60, executor_threads=>2>
logstash_1       |   Error: Address already in use
logstash_1       |   Exception: Java::JavaNet::BindException
logstash_1       |   Stack: sun.nio.ch.Net.bind0(Native Method)
logstash_1       | sun.nio.ch.Net.bind(sun/nio/ch/Net.java:455)
logstash_1       | sun.nio.ch.Net.bind(sun/nio/ch/Net.java:447)
logstash_1       | sun.nio.ch.ServerSocketChannelImpl.bind(sun/nio/ch/ServerSocketChannelImpl.java:227)
logstash_1       | io.netty.channel.socket.nio.NioServerSocketChannel.doBind(io/netty/channel/socket/nio/NioServerSocketChannel.java:134)
logstash_1       | io.netty.channel.AbstractChannel$AbstractUnsafe.bind(io/netty/channel/AbstractChannel.java:550)
logstash_1       | io.netty.channel.DefaultChannelPipeline$HeadContext.bind(io/netty/channel/DefaultChannelPipeline.java:1334)
logstash_1       | io.netty.channel.AbstractChannelHandlerContext.invokeBind(io/netty/channel/AbstractChannelHandlerContext.java:506)
logstash_1       | io.netty.channel.AbstractChannelHandlerContext.bind(io/netty/channel/AbstractChannelHandlerContext.java:491)
logstash_1       | io.netty.channel.DefaultChannelPipeline.bind(io/netty/channel/DefaultChannelPipeline.java:973)
logstash_1       | io.netty.channel.AbstractChannel.bind(io/netty/channel/AbstractChannel.java:248)
logstash_1       | io.netty.bootstrap.AbstractBootstrap$2.run(io/netty/bootstrap/AbstractBootstrap.java:356)
logstash_1       | io.netty.util.concurrent.AbstractEventExecutor.safeExecute(io/netty/util/concurrent/AbstractEventExecutor.java:164)
logstash_1       | io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(io/netty/util/concurrent/SingleThreadEventExecutor.java:472)
logstash_1       | io.netty.channel.nio.NioEventLoop.run(io/netty/channel/nio/NioEventLoop.java:500)
logstash_1       | io.netty.util.concurrent.SingleThreadEventExecutor$4.run(io/netty/util/concurrent/SingleThreadEventExecutor.java:989)
logstash_1       | io.netty.util.internal.ThreadExecutorMap$2.run(io/netty/util/internal/ThreadExecutorMap.java:74)
logstash_1       | io.netty.util.concurrent.FastThreadLocalRunnable.run(io/netty/util/concurrent/FastThreadLocalRunnable.java:30)
logstash_1       | java.lang.Thread.run(java/lang/Thread.java:834)
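
If I am reading this right, the beats input cannot bind because something inside the Logstash container already holds port 5044 (possibly an earlier instance of the same plugin), so the plugin keeps crashing and restarting.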

This is my docker-compose file:

version: '3.7'
services:
  mongo:
    image: mongo:3
    user: root
    networks:
      - graylog
    ports:
      - 27017:27017
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.0
    ports:
      - 9200:9200
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    volumes:
      - type: bind
        source: /srv/docker/elasticsearch/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml

      - type: bind
        source: /srv/docker/elasticsearch/limits.conf
        target: /etc/security/limits.conf
    networks:
      - graylog
    ulimits:
      memlock:
        soft: -1
        hard: -1
    mem_limit: 1g
  logstash:
    image: docker.elastic.co/logstash/logstash:7.10.0
    user: root
    volumes:
      - type: bind
        source: /srv/docker/logstash/logstash.conf
        target: /usr/share/logstash/pipeline/logstash.conf
        read_only: true

      - type: bind
        source: /srv/docker/logstash/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true

#      - type: bind
#        source: /srv/docker/logstash/pipelines.yml
#        target: /usr/share/logstash/config/pipelines.yml:ro
    ports:
      - 5000:5000/tcp
      - 5000:5000/udp
    environment:
      - "LS_JAVA_OPTS=-Xmx1024m -Xms512m"
      - XPACK_MONITORING_ENABLED=false
      - xpack.monitoring.enabled=false
    networks:
      - graylog
    depends_on:
      - elasticsearch

  graylog:
    image: graylog/graylog:4.2.6
    user: root
    volumes:
      - type: bind
        source: /srv/docker/graylog/graylog.conf
        target: /usr/share/graylog/data/config/graylog.conf
    environment:
      - GRAYLOG_ELASTICSEARCH_VERSION=7
      - GRAYLOG_REST_LISTEN_URI=https://0.0.0.0:9000/api/
    networks:
      - graylog
    links:
      - mongo:mongo
      - elasticsearch
      - logstash
    depends_on:
      - mongo
      - elasticsearch
      - logstash
    ports:
      # Graylog web interface and REST API
      - "9000:9000"
      #Beats
      - 5044:5044
networks:
  graylog:
    driver: bridge

This is my logstash.conf:

input {
    beats {
        port => "5044"
        host => "0.0.0.0"
        ssl => false
    }
}

filter {}

output {

    elasticsearch {
        hosts => ["IP:9200"]
        ilm_enabled => false
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }

    stdout {
        codec => rubydebug
    }
}
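
If I understand the index pattern correctly, an event shipped by this Filebeat (7.10.0) on 2022-03-20 should land in an index named filebeat-7.10.0-2022.03.20, assuming the @metadata fields are populated as usual.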

I am publishing port 5044 on the Graylog container, but I guessed that is how it is supposed to be; otherwise, how would Graylog ingest the logs? I am sorry if this is incorrect, because I am extremely new to this.
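
For comparison, if Logstash (rather than Graylog) is what should receive the Beats traffic, I assume the 5044 mapping would have to be published on the logstash service instead, something like this fragment:

  logstash:
    ports:
      - 5044:5044   # publish the Beats port so Filebeat on the Nomad nodes can reach Logstash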

Summary: Filebeat sends logs to port 5044 > Logstash listens on 5044 (per logstash.conf) > Logstash outputs to Elasticsearch (9200). What I am missing is how Graylog is supposed to ingest the logs in this chain. Any help is appreciated.
