How to avoid BigDecimal scientific notation format


(Georgiana Ciuchi) #1

I am using Logstash 1.5.5 with Kafka as input using the JSON codec. Logstash receives a structured message:
{
"app_id": "myapp",
......
"log_message": {
"id": "1234",
"code": "whatever",
"amount": 284.84
}
}

I am converting log_message to a String using JSON.dump like this:
string_message = JSON.dump(event["log_message"])

The obtained string_message value is {"id": "1234", "code": "whatever","amount": "0.28484E3"}

Is there a way to avoid this conversion of BigDecimal values to scientific notation in string_message? I need to preserve the amount value as 284.84.

The structure of the log_message JSON is variable: decimal values could be present at several nesting levels, with varying attribute names.

Is there a global setting to control which decimal format is used?
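One workaround, since the structure is variable, is to walk the parsed value recursively and convert every BigDecimal to a Float before calling JSON.dump, so the generator emits plain decimal notation. This is a sketch (the `deep_to_float` helper name is illustrative, and it has not been tested against the Ruby version bundled with Logstash 1.5.5):

```ruby
require "bigdecimal"
require "json"

# Recursively replace BigDecimal values with Floats so that JSON.dump
# emits 284.84 instead of the string "0.28484E3".
# Hypothetical helper; adapt the name/placement to your ruby filter.
def deep_to_float(obj)
  case obj
  when BigDecimal
    obj.to_f
  when Hash
    obj.each_with_object({}) { |(k, v), h| h[k] = deep_to_float(v) }
  when Array
    obj.map { |v| deep_to_float(v) }
  else
    obj
  end
end

message = {
  "id"     => "1234",
  "code"   => "whatever",
  "amount" => BigDecimal("284.84")
}
string_message = JSON.dump(deep_to_float(message))
puts string_message
# => {"id":"1234","code":"whatever","amount":284.84}
```

Note that converting to Float trades exact decimal representation for readable output; if exact decimals matter, emitting the value as a string (e.g. `obj.to_s("F")`) is an alternative.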


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.