How to sync a custom template with incoming data from Filebeat?


(sukesh) #1

Hi, I am trying to index JSON data into Elasticsearch from Filebeat.

The data shipped by Filebeat looks like this:

    {"first name":"abc","last name":"efg","age":26,"city":"newyork","country":"USA","zipcode":"10001"}
    {"first name":"xyz","last name":"lmn","age":28,"city":"herndon","country":"USA","zipcode":"20170"}
    {"first name":"abc","last name":"pqr","age":27,"city":"chantilly","country":"USA","zipcode":"20152"}

filebeat.yml contains the config below:

    filebeat.prospectors:
    - type: log
      enabled: true
      paths:
        - /home/sukesh/Downloads/json/*.json
      exclude_files: ['.csv$', '.xml$', '.txt$', '.gz$']

    #Elasticsearch template settings
    setup.template.name: "Tname"
    setup.template.pattern: "Tname-*"
    #setup.template.fields: "/home/sukesh/Desktop/fields.yml"
    #setup.template.overwrite: true
    setup.template.settings:
      index.number_of_shards: 2
      #index.codec: best_compression
      #_source.enabled: true

    #General
    tags: ["pattern1_json"]

    #Kibana
    setup.kibana:
      host: "localhost:5601"
      username: "elastic"
      password: "sukesh"

    #Outputs
    output.elasticsearch:
      hosts: ["localhost:9301"]
      index: "Iname"
      username: "elastic"
      password: "sukesh"

I created the template in Elasticsearch as follows:

    PUT _template/Tname
    {
      "index_patterns": ["Iname*"],
      "settings": {
        "number_of_shards": 1
      },
      "mappings": {
        "Idoc_type": {
          "properties": {
            "first name": {"type": "text"},
            "last name": {"type": "text"},
            "age": {"type": "integer"},
            "city": {"type": "text"},
            "country": {"type": "text"},
            "zipcode": {"type": "text"}
          }
        }
      }
    }
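As a quick sanity check that the mapping's field names actually cover the incoming documents, here is a small Python sketch (purely illustrative, not part of Filebeat or Elasticsearch):

```python
import json

# Field names declared in the mapping of the PUT _template/Tname request above.
template_fields = {"first name", "last name", "age", "city", "country", "zipcode"}

# One line of the data Filebeat ships.
sample = '{"first name":"abc","last name":"efg","age":26,"city":"newyork","country":"USA","zipcode":"10001"}'

doc_fields = set(json.loads(sample).keys())

# Any incoming field missing from the template would fall back to dynamic mapping.
missing = doc_fields - template_fields
print(sorted(missing))  # -> []
```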

I ran a config test and an output check, and both look fine.

I checked the data in Kibana, but the entire JSON shows up under the message field:

    "message": {"first name":"abc","last name":"efg","age":26,"city":"newyork","country":"USA","zipcode":"1001"}

This means my template is not being applied. I need all the JSON fields as individual fields in Kibana, not wrapped under message.

Please guide me on how I can sync my template with the incoming data. Do I need to use fields.yml?

Filebeat -> Elasticsearch -> Kibana


(Pier-Hugues Pellerin) #2

Hello @sukesh_nagaraja,

You can control the behavior of the JSON parser in Filebeat at the prospector level.

The option you want is json.keys_under_root: true

keys_under_root
By default, the decoded JSON is placed under a "json" key in the output document. If you enable this setting, the keys are copied to the top level of the output document. The default is false.
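Roughly, the difference between the two event shapes looks like this (a Python sketch of the behavior, not Filebeat's actual code):

```python
import json

def decode_event(line, keys_under_root=False):
    """Rough illustration of how the decoded JSON lands in the event."""
    decoded = json.loads(line)
    if keys_under_root:
        # Keys are copied to the top level of the output document.
        return decoded
    # Default: the decoded object sits under a "json" key.
    return {"json": decoded}

line = '{"first name":"abc","age":26}'
print(decode_event(line))                        # -> {'json': {'first name': 'abc', 'age': 26}}
print(decode_event(line, keys_under_root=True))  # -> {'first name': 'abc', 'age': 26}
```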


(sukesh) #3

Thanks for the response. I tried the config below and it works:

    filebeat.prospectors:
    - type: log
      enabled: true
      paths:
        - /home/sukesh/Downloads/json/*.json

      json.keys_under_root: true
      json.add_error_key: true

    setup.kibana:
      host: "localhost:5601"
      username: "elastic"
      password: "sukesh"

    output.elasticsearch:
      hosts: ["localhost:9301"]
      index: "freddie_test"
      username: "elastic"
      password: "sukesh"

The above config works for very simple JSON with no nested data and no multiline events.

To handle multiline input I have the config below, but all the content sits under one field called "log".

Sample data:

    {"nesting data":[{"category":"Audi","region":"East","monthlySales":[{"month":20130101,"sales":38},{"month":20130201,"sales":35},{"month":20130301,"sales":41},{"month":20130401,"sales":55},{"month":20130501,"sales":58},{"month":20130601,"sales":66},{"month":20130701,"sales":74},{"month":20130801,"sales":78},{"month":20130901,"sales":38},{"month":20131001,"sales":30},{"month":20131101,"sales":26},{"month":20131201,"sales":29}]},{"category":"Technology","region":"West","monthlySales":[{"month":20130101,"sales":54},{"month":20130201,"sales":66},{"month":20130301,"sales":77},{"month":20130401,"sales":70},{"month":20130501,"sales":60},{"month":20130601,"sales":63},{"month":20130701,"sales":55},{"month":20130801,"sales":47},{"month":20130901,"sales":55},{"month":20131001,"sales":30},{"month":20131101,"sales":22},{"month":20131201,"sales":77}]}]}

Config details:

    filebeat.prospectors:
    - type: log
      enabled: true
      paths:
        - /home/sukesh/Downloads/json/*.json

      json.message_key: log
      json.keys_under_root: true
      json.add_error_key: true

      multiline.pattern: '^{'
      multiline.negate: true
      multiline.match: after

    processors:
    - decode_json_fields:
        fields: ['message']
        target: json

    setup.kibana:
      host: "localhost:5601"
      username: "elastic"
      password: "sukesh"

    output.elasticsearch:
      hosts: ["localhost:9301"]
      index: "freddie_test"
      username: "elastic"
      password: "sukesh"

How can I get each key's value individually rather than everything wrapped up under one field?
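For illustration, here is a rough Python sketch (using a trimmed copy of the sample above) of the structure I am after, with each nested key addressable on its own instead of buried in one string field:

```python
import json

# Trimmed version of the nested sample, captured the way Filebeat currently
# ships it: the whole document as a single string under one field.
captured = {
    "log": '{"nesting data":[{"category":"Audi","region":"East",'
           '"monthlySales":[{"month":20130101,"sales":38}]}]}'
}

# Desired outcome: the string decoded so every key can be queried individually.
decoded = json.loads(captured["log"])

print(decoded["nesting data"][0]["category"])                  # -> Audi
print(decoded["nesting data"][0]["monthlySales"][0]["sales"])  # -> 38
```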


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.