I have a visualization of the term deposits (TD, also known as fixed deposits) of a bank. The maturity period of a TD account falls into one of the following buckets:
"7 days to 14 days"
"15 days to 29 days"
"30 days to 45 days"
"46 days to 60 days"
"61 days to 184 days"
"185 days to 289 days"
"290 days to 364 days"
"390 days to 17 months 29 days"
"18 months to 2 years"
"2 years 1 day to 3 years"
"3 years 1 day to 5 years"
"5 years 1 day to 10 years"
I cannot maintain the above order in the visualization.
The maturity period in the log data may be specified in months, days, or both. For example, 734 days works out to 2 years and 14 days under the filter's 30-day-month arithmetic (2 years = 720 days), so it falls into the "2 years 1 day to 3 years" bucket.
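To make that arithmetic concrete, here is a minimal plain-Ruby sketch of the period conversion, assuming the same 30-day month the filter below uses:

```ruby
# Convert a (months, days) maturity period to milliseconds,
# assuming a 30-day month as the Logstash filter does.
ONE_DAY_MS = 24 * 60 * 60 * 1000

def period_ms(months, days)
  (months * 30 + days) * ONE_DAY_MS
end

# 734 days is past the 720-day (2-year) mark, so it lands in
# the "2 years 1 day to 3 years" bucket.
puts period_ms(0, 734) # => 63417600000
```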
I have a ruby filter in logstash which runs custom ruby code and assigns the bucket.
However, in Kibana, when I try to visualize the total TD amount (i.e. the sum) against each bucket, the bucket order comes out jumbled...
Can you post a screenshot of your visualization (with any sensitive data removed)? What are you using as the sort field for the visualization?
"buckets" end up jumbled. The green bar is the sum of amount values. Its huge only because we simulated the data - created a TD account every second randomly.
"bucket" is not part of the original log data. I add it using a ruby filter in logstash. Below is the ruby code that does the same:
def register(params)
end

def filter(event)
  buckets = {
    "7 days to 14 days" => { "start" => 604800000, "end" => 1209600000 },
    "15 days to 29 days" => { "start" => 1296000000, "end" => 2505600000 },
    "30 days to 45 days" => { "start" => 2592000000, "end" => 3888000000 },
    "46 days to 60 days" => { "start" => 3974400000, "end" => 5184000000 },
    "61 days to 184 days" => { "start" => 5270400000, "end" => 15897600000 },
    "185 days to 289 days" => { "start" => 15984000000, "end" => 24969600000 },
    "290 days to 364 days" => { "start" => 25056000000, "end" => 31449600000 },
    "390 days to 17 months 29 days" => { "start" => 33696000000, "end" => 46569600000 },
    "18 months to 2 years" => { "start" => 46656000000, "end" => 62208000000 },
    "2 years 1 day to 3 years" => { "start" => 62294400000, "end" => 93312000000 },
    "3 years 1 day to 5 years" => { "start" => 93398400000, "end" => 155520000000 },
    "5 years 1 day to 10 years" => { "start" => 155606400000, "end" => 311040000000 }
  }

  # Guard against missing fields so the arithmetic below never hits nil.
  mths = event.get("[data][deposit_period_mths]") || 0
  days = event.get("[data][deposit_period_days]") || 0
  one_day = 24 * 60 * 60 * 1000
  period_length = (mths * 30 + days) * one_day

  # The buckets are listed in ascending order (Ruby hashes preserve
  # insertion order), so the first "end" the period does not exceed
  # is the right bucket. Periods beyond 10 years fall through with
  # no bucket set.
  buckets.each do |key, value|
    if period_length <= value["end"]
      event.set("bucket", key)
      break
    end
  end

  [event]
end
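One workaround worth trying for the ordering problem (the field name bucket_order is my own, not from the original logs): emit a numeric index alongside the label, and order the Kibana terms aggregation by the minimum of that field instead of alphabetically. A self-contained sketch of the idea:

```ruby
# Sketch: pair each bucket label with its position so a numeric
# "bucket_order" field (a hypothetical name) can drive the sort in Kibana.
ONE_DAY_MS = 24 * 60 * 60 * 1000

# Bucket upper bounds in days, in the same ascending order as the filter.
BUCKET_ENDS = {
  "7 days to 14 days"  => 14,
  "15 days to 29 days" => 29,
  # ... remaining buckets in the same ascending order ...
  "5 years 1 day to 10 years" => 3600
}

def bucket_for(period_ms)
  BUCKET_ENDS.each_with_index do |(label, end_days), i|
    return [label, i] if period_ms <= end_days * ONE_DAY_MS
  end
  nil # beyond the last bucket
end

label, order = bucket_for(20 * ONE_DAY_MS)
# In the filter you would then do both:
#   event.set("bucket", label)
#   event.set("bucket_order", order)
```

In the visualization, the terms aggregation on "bucket" can then be ordered by a custom metric such as Min of bucket_order, which restores the intended sequence.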