Define upload fields from JSON file

Hi,

In the conf file, how can I define which fields I want to upload to Elasticsearch?
My JSON file contains one row that needs to be uploaded to Elasticsearch, so after the upload I would like to see one row under the index with the matching fields and values.

For example, this is my conf file:
input {
  file {
    path => ["/tmp/y4.json"]
    type => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => [ 'message', '(?"TestName":.*"Agent":"[^"]+")' ]
  }
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    host => "XX.XX.XX.XXX"
    protocol => "http"
    index => "index_junit"
  }
}

I want the following fields from the JSON file:
queueId, timestamp, startTime, result, duration, charset

BR,
Chen

In the conf file, how can I define which fields I want to upload to Elasticsearch?

Elasticsearch gets the whole event (except the @metadata field). If you don't want to include particular fields you have to delete them, e.g. with the mutate or the prune filter.
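
For example, an untested sketch along these lines (field names taken from your list; the prune filter comes from the separate logstash-filter-prune plugin, and whitelist_names takes regular expressions):

filter {
  # Either delete specific unwanted fields (hypothetical names here) ...
  mutate {
    remove_field => ["host", "path"]
  }

  # ... or keep only a whitelist and drop everything else.
  # Anchoring the patterns avoids partial matches; @timestamp is
  # whitelisted too so the event keeps its timestamp.
  prune {
    whitelist_names => [
      "^@timestamp$", "^queueId$", "^timestamp$",
      "^startTime$", "^result$", "^duration$", "^charset$"
    ]
  }
}

Use one approach or the other, not both.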

Hi,

When I'm uploading a flat JSON file (all keys at level 1) I don't have a problem.
When I'm uploading a complex JSON file (keys nested more than one level deep), I do have a problem: I don't get the fields I need, only message, and in some cases the value of message is "{" or "}".

BR,
Chen

Please show an example so that it's possible to understand what's going on.

Coming back to your original post,

match => [ 'message', '(?"TestName":.*"Agent":"[^"]+")' ]

why are you using grok to parse JSON? You're already using a json filter for that.
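
In other words, your filter section could be reduced to just the json filter you already have:

filter {
  json {
    source => "message"
  }
}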

On Stack Overflow it's written that this won't work well with nested JSON structs and that I need a simple hash of key/value pairs.
Is there a way to upload nested JSON structs?

Kibana can't deal with arrays of objects, but objects containing other objects are fine.
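
For example, in a made-up document like this, the agent object is fine, while cases is the kind of array of objects Kibana has trouble with:

{
  "result": "SUCCESS",
  "agent": { "name": "node-1", "os": "linux" },
  "cases": [
    { "name": "t1", "status": "passed" },
    { "name": "t2", "status": "failed" }
  ]
}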

Thanks :slight_smile: