Logstash is replacing : with =>. How can I get : in the output?
My input file:
{
  "customfield_10150": [
    {
      "key": "caanyimi",
      "displayName": "Anyimi, Charles"
    }
  ]
}
Output:
"name""=>""caanyimi"",
""emailAddress""=>""charles.anyimi@intel.com"
Hi @Raj_Sekhar,
Could you please explain a bit more about your use case: what you are trying to do and what you want to achieve?
Regards,
Harsh Bajaj
Hi Harsh,
I have a nested array of values inside the JSON.
When I try to write it to a CSV file, the colons get replaced with =>.
"customfield_10150": [
{
"key": "caanyimi",
"displayName": "Anyimi, Charles",
"self": "https://nsg-jira.intel.com/rest/api/2/user?username=caanyimi",
"avatarUrls": {
"16x16": "https://nsg-jira.intel.com/secure/useravatar?size=xsmall&ownerId=caanyimi&avatarId=18136",
"48x48": "https://nsg-jira.intel.com/secure/useravatar?ownerId=caanyimi&avatarId=18136",
"32x32": "https://nsg-jira.intel.com/secure/useravatar?size=medium&ownerId=caanyimi&avatarId=18136",
"24x24": "https://nsg-jira.intel.com/secure/useravatar?size=small&ownerId=caanyimi&avatarId=18136"
},
"active": true,
"name": "caanyimi",
"timeZone": "US/Pacific",
"emailAddress": "charles.anyimi@intel.com"
},
{
"key": "dablunde",
"displayName": "Blunden, David",
"self": "https://nsg-jira.intel.com/rest/api/2/user?username=dablunde",
"avatarUrls": {
"16x16": "https://www.gravatar.com/avatar/28dd9333e6b5dc333179817530ded97e?d=mm&s=16",
"48x48": "https://www.gravatar.com/avatar/28dd9333e6b5dc333179817530ded97e?d=mm&s=48",
"32x32": "https://www.gravatar.com/avatar/28dd9333e6b5dc333179817530ded97e?d=mm&s=32",
"24x24": "https://www.gravatar.com/avatar/28dd9333e6b5dc333179817530ded97e?d=mm&s=24"
},
"active": true,
"name": "dablunde",
"timeZone": "US/Pacific",
"emailAddress": "david.blunden@intel.com"
}
]
Hi @Raj_Sekhar,
I understood your point. For this, please try adding the line below in the filter section.
ruby {code => 'open("/tmp/test.json", "w") { |file| file.write(event.get("json").to_json) }' }
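Note that event.get("json") only returns a value if the parsed document was stored under a [json] field. If your json codec parsed the fields into the event root instead, a variant like the one below (an untested sketch, same path as above) writes the whole event:

ruby { code => 'File.open("/tmp/test.json", "w") { |file| file.write(event.to_json) }' }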
Please let me know if you are still not able to achieve this.
Regards,
Harsh Bajaj
Hi @harshbajaj16, I tried it, but got the same result.
May I know where the file is being written in your command?
Hi @Raj_Sekhar,
You need to add this in your Logstash configuration file, which is in the /conf.d/ directory.
There are three sections in a conf file: input, filter, and output. You need to add this in the filter section.
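For orientation, a minimal conf file with the filter in place would look roughly like this (the stdin input and the paths are placeholders; keep your own input and output sections):

input {
  stdin { codec => "json" }
}
filter {
  ruby { code => 'open("/tmp/test.json", "w") { |file| file.write(event.get("json").to_json) }' }
}
output {
  stdout { codec => rubydebug }
}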
Please find the documentation link for the ruby filter plugin.
Also, I found a discussion about a similar problem; you can look into it to get a better idea of the filter plugin.
The file in the command is /tmp/test.json.
Please do let me know if you need more help in this regard.
Regards,
Harsh Bajaj
Hi @harshbajaj16,
Below is my conf file after adding what you suggested.
input {
  stdin {
    codec => "json"
  }
}
filter {
  if "_jsonparsefailure" in [tags] { drop { } } # for the last record, which comes with the text "Impala query scan limit reached"
  ruby { code => 'open("/tmp/abc.json", "w") { |file| file.write(event.get("json").to_json) }' }
  mutate {
    rename => {"[priority][name]" => "priority_name"}
    rename => {"[priority][id]" => "priority_id"}
    rename => {"[priority][self]" => "priority_self"}
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
  stdout { codec => dots }
  file {
    codec => "json"
    path => "/tmp/logstash_output/test/%{key}.json"
    write_behavior => "overwrite"
  }
  csv {
    fields => ["priority_name", "priority_id", "priority_self"]
    path => "/tmp/output_csv/%{key}.csv"
    write_behavior => "overwrite"
  }
}
rubydebug always displays data using => but that should not affect the format written to your file or csv output.
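One detail worth adding: if a field still holds a hash or an array when it reaches the csv output, it is serialized with Ruby's default to_s, which renders hashes with => (the doubled quotes in the output above are then just CSV quote escaping). A quick Ruby illustration:

require 'json'

h = { "name" => "caanyimi" }
puts h.to_s    # {"name"=>"caanyimi"}  -- Ruby hash notation, what ends up in the CSV
puts h.to_json # {"name":"caanyimi"}   -- JSON notation, with colons

A way around this, sketched under the assumption that the nested users live in the customfield_10150 array and the field names match the documents above, is to flatten the array into scalar top-level fields before the csv output:

filter {
  # one event per element of the nested array (assumed field name)
  split { field => "customfield_10150" }
  # promote nested values to top-level fields so the csv output only sees scalars
  mutate {
    rename => {
      "[customfield_10150][key]"          => "key"
      "[customfield_10150][name]"         => "name"
      "[customfield_10150][emailAddress]" => "emailAddress"
    }
  }
}
output {
  csv {
    fields => ["key", "name", "emailAddress"]
    path => "/tmp/output_csv/users.csv"   # hypothetical path
  }
}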