esbjs:/var/log/logstash # tail -50 logstash-plain.log
[2018-06-20T14:43:05,031][INFO ][logstash.agent ] No config files found in path {:path=>"/etc/logstash/conf.d/"}
[2018-06-20T14:43:05,033][ERROR][logstash.agent ] failed to fetch pipeline configuration {:message=>"No config files found: /etc/logstash/conf.d. Can you make sure this path is a logstash config file?"}
[2018-06-20T14:43:14,409][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-06-20T14:43:14,412][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-06-20T14:43:14,495][INFO ][logstash.agent ] No config files found in path {:path=>"/etc/logstash/conf.d/"}
[2018-06-20T14:43:14,497][ERROR][logstash.agent ] failed to fetch pipeline configuration {:message=>"No config files found: /etc/logstash/conf.d. Can you make sure this path is a logstash config file?"}
[2018-06-20T14:43:23,604][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-06-20T14:43:23,607][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-06-20T14:43:23,683][INFO ][logstash.agent ] No config files found in path {:path=>"/etc/logstash/conf.d/"}
[2018-06-20T14:43:23,687][ERROR][logstash.agent ] failed to fetch pipeline configuration {:message=>"No config files found: /etc/logstash/conf.d. Can you make sure this path is a logstash config file?"}
[2018-06-20T14:43:33,296][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-06-20T14:43:33,299][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-06-20T14:43:33,372][INFO ][logstash.agent ] No config files found in path {:path=>"/etc/logstash/conf.d/"}
[2018-06-20T14:43:33,374][ERROR][logstash.agent ] failed to fetch pipeline configuration {:message=>"No config files found: /etc/logstash/conf.d. Can you make sure this path is a logstash config file?"}
[2018-06-20T14:43:43,245][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-06-20T14:43:43,248][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-06-20T14:43:43,327][INFO ][logstash.agent ] No config files found in path {:path=>"/etc/logstash/conf.d/*"}
[2018-06-20T14:43:43,330][ERROR][logstash.agent ] failed to fetch pipeline configuration {:message=>"No config files found: /etc/logstash/conf.d. Can you make sure this path is a logstash config file?"}
[2018-06-20T14:43:52,422][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
I have kept the conf file in the /etc/logstash/conf.d directory, but I am still not able to create an index for Filebeat. There are no Logstash logs for today; the logs I sent you are from yesterday.
I have checked those logs and given the queue directory the required permissions. That error is resolved, but now I am getting a different error. I am sending you the new logs, please check them.
Thank you in advance.
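For context on the June errors above: Logstash is reporting that it found no pipeline files under /etc/logstash/conf.d. A minimal sketch of what would get past that error, assuming a Beats input on port 5045 (the port the later log shows) and an Elasticsearch output on localhost:9200 (an assumption, not stated anywhere in the thread, as is the index name):

# Create a minimal pipeline file in the directory the error points at
cat > /etc/logstash/conf.d/beats.conf <<'EOF'
input {
  beats {
    port => 5045
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]          # assumption: Elasticsearch on the same host
    index => "filebeat-%{+YYYY.MM.dd}"   # assumption: daily Filebeat index name
  }
}
EOF

# Validate the configuration before restarting the service
/usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit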
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_PSK_WITH_AES_256_CBC_SHA => PSK-AES256-CBC-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_PSK_WITH_AES_256_CBC_SHA => PSK-AES256-CBC-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_3DES_EDE_CBC_SHA => DES-CBC3-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_3DES_EDE_CBC_SHA => DES-CBC3-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_RC4_128_SHA => RC4-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_RC4_128_SHA => RC4-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_PSK_WITH_RC4_128_SHA => PSK-RC4-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_PSK_WITH_RC4_128_SHA => PSK-RC4-SHA
[2018-07-16T12:15:25,092][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_RC4_128_MD5 => RC4-MD5
[2018-07-16T12:15:25,093][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_RC4_128_MD5 => RC4-MD5
[2018-07-16T12:15:25,094][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5045"}
[2018-07-16T12:15:25,102][DEBUG][io.netty.channel.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 4
[2018-07-16T12:15:25,125][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.noKeySetOptimization: false
[2018-07-16T12:15:25,125][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
[2018-07-16T12:15:25,136][INFO ][logstash.pipeline ] Pipeline main started
[2018-07-16T12:15:25,146][INFO ][org.logstash.beats.Server] Starting server on port: 5045
[2018-07-16T12:15:25,173][DEBUG][logstash.agent ] Starting puma
[2018-07-16T12:15:25,174][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.processId: 53531 (auto-detected)
[2018-07-16T12:15:25,177][DEBUG][io.netty.util.NetUtil ] Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
[2018-07-16T12:15:25,178][DEBUG][io.netty.util.NetUtil ] /proc/sys/net/core/somaxconn: 128
[2018-07-16T12:15:25,178][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.machineId: 00:15:5d:ff:fe:76:0d:15 (auto-detected)
[2018-07-16T12:15:25,182][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2018-07-16T12:15:25,195][DEBUG][logstash.api.service ] [api-service] start
[2018-07-16T12:15:25,216][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9601}
[2018-07-16T12:15:25,216][DEBUG][logstash.api.service ] [api-service] start
[2018-07-16T12:15:25,219][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
[2018-07-16T12:15:30,136][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2018-07-16T12:15:35,137][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
It seems you have configured the cipher suite TLS_RSA_WITH_RC4_128_MD5 in your config file. Please check and share that config file so that I can review it and get back to you.
I have kept Logstash in debug mode, but I am not using any encryption algorithms such as the cipher suite TLS_RSA_WITH_RC4_128_MD5. What does this message mean: [2018-07-16T17:47:16,495][DEBUG][logstash.pipeline ] Pushing flush onto pipeline? I am sending you the logstash.conf file.
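As an aside: the cipher suite lines are Netty listing its built-in mappings at DEBUG level, and "Pushing flush onto pipeline" is the pipeline's periodic flush (note the 5-second cadence above), not an error. A rough way to check whether events are actually flowing, using the API port 9601 from the log above and assuming Elasticsearch is reachable on localhost:9200 (not stated in the thread):

# Pipeline event counters from the Logstash monitoring API
curl -s 'localhost:9601/_node/stats?pretty'

# Has a Filebeat index been created? (assumes Elasticsearch on localhost:9200)
curl -s 'localhost:9200/_cat/indices/filebeat-*?v'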
I've checked it and it seems OK; you haven't configured cipher suites anywhere.
Can you please check /var/log/messages while starting Filebeat and Logstash? If there is an indentation problem in filebeat.yml, the messages log will show that information. Please check and let me know.
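A sketch of those checks, assuming a Filebeat release that ships the test subcommands and the default config path /etc/filebeat/filebeat.yml:

# Syntax/indentation check of filebeat.yml (Filebeat 6.x; older releases use filebeat -configtest)
filebeat test config -c /etc/filebeat/filebeat.yml

# Can Filebeat reach the configured Logstash output?
filebeat test output -c /etc/filebeat/filebeat.yml

# Anything Filebeat or Logstash logged to syslog while starting
grep -iE 'filebeat|logstash' /var/log/messages | tail -n 50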
Today I restarted Logstash and the following new messages were generated:
2018-07-17T10:46:46.243370+05:30 esbjs liblogging-stdlog: -- MARK --
2018-07-17T10:49:13.042966+05:30 esbjs systemd[1]: Stopping logstash...
2018-07-17T10:49:19.462226+05:30 esbjs systemd[1]: Stopped logstash.
2018-07-17T10:49:19.478100+05:30 esbjs systemd[1]: Started logstash.
2018-07-17T10:49:30.116567+05:30 esbjs logstash[35617]: Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
2018-07-17T11:00:01.757300+05:30 esbjs cron[36009]: pam_unix(crond:session): session opened for user root by (uid=0)
2018-07-17T11:00:01.772059+05:30 esbjs systemd[1]: Started Session 8536 of user root.
2018-07-17T11:00:01.793062+05:30 esbjs CRON[36009]: pam_unix(crond:session): session closed for user root
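Those messages only show systemd stopping and starting the service; whether the pipeline actually came up is in the Logstash log itself. A quick post-restart check, with the ports taken from the earlier log (5045 for Beats, 9600/9601 for the API endpoint):

# Service state and recent journal entries
systemctl status logstash

# Beats listener and API endpoint should be listening
ss -ltnp | grep -E ':(5045|9600|9601)'

# Follow the pipeline log that systemd pointed at
tail -f /var/log/logstash/logstash-plain.log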