adxalex
(Adxalex)
May 6, 2022, 4:46pm
1
Hi community,
I'm trying to remove some fields from a JSON document. The JSON is:
{"recode":"VZ##","response-code":"4000","response":{"result":[{"DetailsPageURL":"idsource":"0""Attribute":[{"DISPLAYNAME":"Tiempo de respuesta server oasu","Value":"2305"}]}]}}
I just need the fields DISPLAYNAME and Value. I tried two different configuration files:
input {
  file {
    path => "/mnt/f/user/input/archivo.log"
    type => "json"
    codec => "json"
  }
}
filter {
  prune {
    whitelist_names => ["Attribute"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
I receive just {}, an empty event.
input {
  file {
    path => "/mnt/f/user/input/archivo.log"
    type => "json"
    codec => "json"
  }
}
filter {
  json {
    source => "result"
    remove_field => [ "foo_%{DetailsPageURL}", "idsource" ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
and this config doesn't delete anything.
Thanks for your help.
As the documentation says, prune only operates on top-level fields, so you cannot use it on fields inside [response][result].
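As a rough sketch of why the first configuration produced an empty event (my reading, not something the docs spell out): none of the event's top-level fields is named Attribute, so the whitelist matches nothing and every field is dropped. At the top level you could only whitelist the whole nested object, for example:
filter {
  # whitelist_names matches top-level field names only, so keeping the entire
  # "response" object is possible, but keeping just the nested "Attribute" is not
  prune { whitelist_names => ["^response$"] }
}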
Your message is not valid JSON. If I change it to be valid then I can do this:
input { generator { count => 1 lines => [ '{"recode":"VZ##","response-code":"4000","response":{"result":[{"DetailsPageURL":"foo","idsource":"0","Attribute":[{"DISPLAYNAME":"Tiempo de respuesta server oasu","Value":"2305"}]}]}}' ] } }
filter {
  json { source => "message" remove_field => [ "message" ] }
  mutate {
    add_field => {
      "DISPLAYNAME" => "%{[response][result][0][Attribute][0][DISPLAYNAME]}"
      "Value" => "%{[response][result][0][Attribute][0][Value]}"
    }
    remove_field => [ "response" ]
  }
}
output { stdout { codec => rubydebug { metadata => false } } }
which produces
"response-code" => "4000",
"Value" => "2305",
"recode" => "VZ##",
"DISPLAYNAME" => "Tiempo de respuesta server oasu"
You may want to replace the remove_field with a prune using whitelist_names.
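A minimal sketch of that variant (untested; the anchored regexes are my addition, since whitelist_names entries are treated as regular expressions):
filter {
  json { source => "message" remove_field => [ "message" ] }
  mutate {
    add_field => {
      "DISPLAYNAME" => "%{[response][result][0][Attribute][0][DISPLAYNAME]}"
      "Value" => "%{[response][result][0][Attribute][0][Value]}"
    }
  }
  # prune drops every top-level field that does not match one of these regexes,
  # so recode and response-code would disappear as well
  prune { whitelist_names => [ "^DISPLAYNAME$", "^Value$" ] }
}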
If you want to keep those two fields in place then you would need a custom ruby filter. I do not think there is a generic solution to do it that way.
adxalex
(Adxalex)
May 6, 2022, 8:11pm
3
Great @Badger, I appreciate your answer, that works perfectly.
Just one additional question:
I want to export to CSV:
output {
  stdout {
    codec => rubydebug
  }
  csv {
    csv_options => {"headers" => true}
    fields => ["LASTPOLLEDTIME", "[DISPLAYNAME]", "[Value]"]
    fields => ["LASTPOLLEDTIME", "[DISPLAYNAME2]", "[Value2]"]
    path => "/mnt/f/user/output/archivo.csv"
  }
}
From a single JSON event I would like to insert the fields as different rows. I tried adding "fields" twice, but that doesn't work; it still puts everything in the same row.
How can I insert different fields into the CSV file as different rows?
Thanks
adxalex:
fields => ["LASTPOLLEDTIME", "[DISPLAYNAME]","[Value]"]
fields => ["LASTPOLLEDTIME", "[DISPLAYNAME2]","[Value2]"]
When you set an option that expects a hash or array more than once, logstash will merge them into a single hash or array.
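So, as far as I understand the merge, your two fields lines end up equivalent to a single setting along these lines:
csv {
  csv_options => {"headers" => true}
  # one merged array, hence still one row per event
  fields => ["LASTPOLLEDTIME", "[DISPLAYNAME]", "[Value]", "LASTPOLLEDTIME", "[DISPLAYNAME2]", "[Value2]"]
  path => "/mnt/f/user/output/archivo.csv"
}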
A csv output will always produce one line per event. The only way I can think of to get multiple lines is something like
input { generator { count => 1 lines => [ '{ "a": 1, "b":2, "c":3, "d":4 }' ] codec => json } }
filter {}
output { stdout { codec => line { format => "%{a},%{b}
%{c},%{d}" } } }
which produces
1,2
3,4
Obviously you would use a file output, not a stdout output.
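For your fields that might look something like this (a sketch only, untested; the field names are the ones from your csv output):
output {
  file {
    path => "/mnt/f/user/output/archivo.csv"
    # the literal newline in the format string turns one event into two CSV rows
    codec => line { format => "%{LASTPOLLEDTIME},%{DISPLAYNAME},%{Value}
%{LASTPOLLEDTIME},%{DISPLAYNAME2},%{Value2}" }
  }
}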
adxalex
(Adxalex)
May 9, 2022, 1:39pm
5
Hi @Badger, thanks for your help,
I took another path, and this works:
output {
  stdout {
    codec => rubydebug
  }
  csv {
    csv_options => {"headers" => true}
    fields => ["LASTPOLLEDTIME", "[DISPLAYNAME]", "[Value]"]
    path => "/mnt/f/user/output/archivo.csv"
  }
  csv {
    csv_options => {"headers" => true}
    fields => ["LASTPOLLEDTIME", "[DISPLAYNAME2]", "[Value2]"]
    path => "/mnt/f/user/output/archivo.csv"
  }
}
Thanks for your help, and let me know if you think this approach is OK.
I do not know if that will work. Whether two outputs writing to the same file will respect one another's positions in the file really depends on the operating system and filesystem. I would not expect them to maintain order; either one could write a batch of events first.