Sending logs from Filebeat to Redis and then Logstash

I am trying to send my logs from Filebeat into Redis, and then from Redis into Logstash and Elasticsearch.
My log file is in JSON format. What configuration needs to be made in my filebeat.yml so that the output is Redis? I am getting this error:

Failed to RPUSH to redis list with ERR wrong number of arguments for 'rpush' command
2018-05-22T11:15:43.063+0530 ERROR pipeline/output.go:92
Failed to publish events: ERR wrong number of arguments for 'rpush' command

Also, what configuration needs to be made on the Logstash side when my input is Redis?
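
For background, Filebeat's Redis output publishes each batch of events as a single multi-value RPUSH, and a Redis server that does not support the variadic form of RPUSH rejects that with exactly this "wrong number of arguments" error. As a rough illustration (not Filebeat's actual code), this sketch shows what such a command looks like on the wire in Redis's RESP protocol:

```python
def resp_command(*parts):
    """Encode a Redis command in RESP wire format (an array of bulk strings)."""
    out = "*%d\r\n" % len(parts)
    for p in parts:
        out += "$%d\r\n%s\r\n" % (len(p), p)
    return out

# One batch of two JSON events becomes a single four-argument RPUSH:
wire = resp_command("RPUSH", "logs", '{"event":1}', '{"event":2}')
```

A server that only accepts the two-argument form (RPUSH key value) replies with ERR wrong number of arguments for 'rpush' command to such a batch.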

Hi,

Could you please describe your use case (why you are implementing this, i.e. what the requirement is), and paste your pipeline here so that we can look into the issue.

Regards,

I want to build visualizations in Kibana from my logs. The pipeline is as follows:
Filebeat - Redis - Logstash - Elasticsearch - Kibana.

The reason I have included Redis in my pipeline is because it acts as a buffer, so in case of data loss we can recover the events from Redis. But I am unable to send data from Filebeat to Redis itself.

The pipeline Filebeat - Logstash - Elasticsearch - Kibana is working fine. But when I use Redis, I am unable to load data from Filebeat into Redis.

I have attached the log file which I am trying to send through the pipeline. Please provide a solution.

2018-05-18 09:29:24,387 [main] INFO org.mule.module.launcher.application.DefaultMuleApplication -
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ Initializing app 'RealtimeFlow'                          +
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
2018-05-18 09:29:24,543 [main] INFO org.mule.lifecycle.AbstractLifecycleManager - Initialising RegistryBroker
2018-05-18 09:29:24,636 [main] INFO org.mule.module.extension.internal.manager.DefaultExtensionManager - Starting discovery of extensions
2018-05-18 09:29:24,745 [main] INFO org.mule.module.extension.internal.manager.DefaultExtensionManager - Discovered 1 extensions
2018-05-18 09:29:24,745 [main] INFO org.mule.module.extension.internal.manager.DefaultExtensionManager - Registering extension validation (version 3.8)
2018-05-18 09:29:24,809 [main] INFO org.mule.config.spring.MuleArtifactContext - Refreshing org.mule.config.spring.MuleArtifactContext@9573b3b: startup date [Fri May 18 09:29:24 IST 2018]; root of context hierarchy
2018-05-18 09:29:26,353 [main] INFO org.mule.config.spring.processors.NoDevkitInjectorProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
2018-05-18 09:29:31,942 [main] INFO org.mule.module.management.agent.WrapperManagerAgent - This JVM hasn't been launched by the wrapper, the agent will not run.
2018-05-18 09:29:31,973 [main] INFO org.mule.DefaultMuleContext -
**********************************************************
* Application: RealtimeFlow                              *
* OS encoding: , Mule encoding: UTF-8                    *
*                                                        *
* Agents Running:                                        *
*   JMX Agent                                            *
*   DevKit Extension Information                         *
*   Batch module default engine                          *
*   Wrapper Manager                                      *
**********************************************************

2018-05-18 09:29:41,140 [[RealtimeFlow].HTTP_Listener_Configuration.worker.01] INFO com.mulesoft.weave.mule.utils.MuleWeaveFactory$ - MimeType was not resolved '/' delegating to Java.
2018-05-18 09:29:41,141 [[RealtimeFlow].HTTP_Listener_Configuration.worker.01] INFO org.mule.api.processor.LoggerMessageProcessor -
{"versionId":"1.0","environment":{"organizationId":"ejgje","businessUnitId":"dfsf","projectId":"dvbfsd","environmentId":"DEV","serverId":"10.45.43.223"},"interface":{"interfaceId":"REALTIME_SAP_interface","sourceId":"SFDC","targetId":"SAP"},"transactionDetail":{"correlationId":"e425d140-5a4f-11e8-81ec-320920524153","businessId":null,"executionTime":"2018-05-18 09:29:41:140","executionPoint":"Execution staring point of REALTIME_SAP_interface ","executionMessage":"RealTime Main Flow has been Started and setting the flow vars for logging purpose","executionStatus":"INPROCESS","payload":{}},"error":{"errorType":null,"errorCode":null,"errorText":null}}
2018-05-18 09:29:41,157 [[RealtimeFlow].HTTP_Listener_Configuration.worker.01] INFO com.mulesoft.weave.mule.utils.MuleWeaveFactory$ - MimeType was not resolved '/' delegating to Java.
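
One thing worth noting about this file: it mixes plain-text Mule log lines (the banner and timestamped lines) with one-line JSON documents (the LoggerMessageProcessor payload), so if the pipeline is expected to decode JSON, only some of the lines will parse. A small sketch to distinguish the two kinds of lines:

```python
import json

def try_parse_json(line):
    """Return the decoded object for a one-line JSON document, else None."""
    try:
        doc = json.loads(line)
    except ValueError:
        return None
    # Only treat full JSON objects as structured events, not bare numbers etc.
    return doc if isinstance(doc, dict) else None

# Banner and timestamped Mule lines return None; the payload line with
# "versionId" returns a dict of fields.
```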

Hi,

Could you please paste your filebeat.yml and Redis config file here for further investigation.

Thanks,

Hi,

Below is the configuration of filebeat.yml:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - C:\Users\adfgdhchg\Documents\My Received Files\json.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

filebeat.modules:
- module: redis
  log:
    enabled: true
  slowlog:
    enabled: true

#------------------------------- Redis output ----------------------------------
output.redis:
  hosts: ["localhost:6379"]
  key: "logs"
  db: 0
  timeout: 5

In Redis I did not change any configuration.

In Logstash I have given the input as:

input {
  redis {
    host => "localhost"
    port => 6379
    data_type => "list"
    key => "logs"
  }
}

(Note: the redis input takes separate host and port options rather than a single hosts string.)

and the output as:

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "finalmule_logs"
    user => "elastic"
    password => "elastic"
  }
  stdout { }
}

Thank you

Hi,

I hope you have checked the version compatibility; if not, please check the link below.

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-module-redis.html#_compatibility_14

Also, please paste the Filebeat logs from when you start the service.

Regards,

Hi,
My Redis version is 2.4.5 and my Filebeat/Elasticsearch/Logstash/Kibana version is 6.2.4. I followed the steps at the provided link but I am still getting the same error (Failed to RPUSH to redis list). Please provide me a solution.

Hi,

Please let me know the following details.

Are you able to connect to the Redis server from the Filebeat host (please check with telnet)?

Could you please try increasing the timeout parameter in the Redis output?

If you are still getting the error, please paste your debug logs here for further investigation.
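
If telnet is not available on the Windows host, a minimal raw-socket check does the same job. This sketch assumes Redis is listening on localhost:6379 (adjust host and port to your setup); it sends an inline PING and accepts only a +PONG reply:

```python
import socket

def redis_ping(host="localhost", port=6379, timeout=5.0):
    """Send an inline PING over a raw TCP socket; True only on a +PONG reply."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as conn:
            conn.sendall(b"PING\r\n")
            return conn.recv(64).startswith(b"+PONG")
    except OSError:
        return False

# redis_ping() returning True means the Filebeat host can reach Redis on
# that port; False means a connectivity problem rather than an RPUSH problem.
```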

Regards,

Hi,
I increased the timeout parameter value but I am still getting the same error:

Failed to rpush to redis list.
Failed to publish events.

Please help me out with this.
Thanks.

Hi,

Have you checked the details above? Please check them if you have not.

Regards,

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.