Hi all,
I have a Logstash installation that reads JSON data from a Redis cache and sends the events to an Elasticsearch cluster.
The configuration is very simple:
```
input {
  redis {
    host => "my-redis"
    data_type => "list"
    port => "6380"
    key => "my-redis-key"
    password => "my-password"
    ssl => true
  }
}
output {
  elasticsearch {
    index => "my-redis-log-%{+YYYY.MM.dd}"
    hosts => ["elastic-1:9200","elastic-2:9200","elastic-3:9200"]
  }
}
```
The default codec for the redis input is json, which is good for me because the messages are JSON.
But the code that generates the JSON messages in Redis has a bug, and a fix is not simple at this time. A correct event looks like this:
```
{
  "@timestamp": "2019-01-02T23:18:19.766+00:00",
  "@version": "1",
  "message": "Correct JSON",
  "logger_name": "com.redis.correct.controller.Message",
  "thread_name": "http-nio-8080-exec-12",
  "version": "7.2.3",
  "ip": "127.0.0.1",
  "request_id": "djqfqtjlz3vcz5k92ve3njdmg",
  "user": "local-user-id",
  "step": "input",
  "payload": {
    "number": "2",
    "statusName": "full"
  },
  "name": "JSON",
  "env": "prod"
}
```
The value of payload is normally a nested JSON object, so for our logic this is correct.
But in some events the payload field is written as a string:
```
{
  "@timestamp": "2019-01-02T22:18:19.766+00:00",
  "message": "Correct JSON",
  "logger_name": "com.redis.correct.controller.Message",
  "thread_name": "http-nio-8080-exec-12",
  "version": "7.2.3",
  "ip": "127.0.0.1",
  "request_id": "my-reuqest-id",
  "user": "local-user-id",
  "step": "input",
  "payload": "{\"number\":\"0\",\"statusName\":\"null\"}",
  "name": "JSON",
  "env": "prod"
}
```
These events are rejected by Elasticsearch during indexing, because payload is sometimes an object and sometimes a string.
I have not found a way to mutate the payload value when it's written as a string. Can you give me some hints to solve this problem and correctly index all documents? (As mentioned, fixing the bug on the producer side is not possible right now.)
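For reference, the direction I was considering is a ruby filter that re-parses payload only when it arrives as a string, leaving already-nested events untouched. This is a rough, untested sketch, and the `_payload_parse_failure` tag name is just something I made up:

```
filter {
  ruby {
    code => '
      payload = event.get("payload")
      # Only re-parse when the producer bug serialized payload as a string
      if payload.is_a?(String)
        begin
          event.set("payload", LogStash::Json.load(payload))
        rescue => e
          # Keep the event but mark it so bad JSON can be found later
          event.tag("_payload_parse_failure")
        end
      end
    '
  }
}
```

I'm not sure whether this is idiomatic, or whether the json filter can be made to handle the mixed case more cleanly.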