How to parse JSON with nested field name "source"

Hi, I am new to ELK. I am trying to parse JSON strings that contain a field named "source".
Here is an example:
{"feed": {"name": "AlienVault", "accuracy": 100.0, "url": ""}, "classification": {"type": "spam"}, "time": {"observation": "2016-05-15T16:57:03+00:00"}, "source": {"geolocation": {"latitude": 51.5332984924, "longitude": 0.699999988079, "cc": "GB", "city": "Southend"}, "ip": ""}, "raw": "MTYzLjE3Mi4xOTguMjI3IzYjMiNTcGFtbWluZyNHQiNTb3V0aGVuZCM1MS41MzMyOTg0OTI0LDAuNjk5OTk5OTg4MDc5IzEy"}

In Elasticsearch/Kibana I do have a field named "source", but it contains the path of the txt file where the logs are stored. The output is:

"_source": {
    "raw": "MjIyLjEzNi43MS4xOSMzIzIjU2Nhbm5pbmcgSG9zdCNDTiNaaGVuZ3pob3UjMzQuNjgzNjAxMzc5NCwxMTMuNTMyNTAxMjIxIzEx",
    "source": "/opt/intelmq/var/lib/bots/file-output/events.txt",
    "classification": {
        "type": "scanner"
    }
}

Somehow in ES I am missing these fields and their values:

    "source": {
        "geolocation": {
            "latitude": 51.5332984924,
            "longitude": 0.699999988079,
            "cc": "GB",
            "city": "Southend"
        },
        "ip": ""
    }



How are you inserting the data into ES?
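Worth keeping in mind: an event can only hold one field per name, so your JSON's "source" object and the file path are competing for the same slot, and whichever write happens last wins. A quick Python analogy (hypothetical, just to show the mechanism, with the event modelled as a plain dict):

```python
import json

# Hypothetical minimal model of an event: Filebeat adds its own
# "source" field (the log file path) alongside the raw JSON line.
event = {
    "message": '{"source": {"geolocation": {"cc": "GB"}, "ip": ""}}',
    "source": "/opt/intelmq/var/lib/bots/file-output/events.txt",
}

# If the decoded JSON is merged over the event, the nested object
# replaces the file path...
decoded = json.loads(event["message"])
merged = {**event, **decoded}
assert isinstance(merged["source"], dict)

# ...but if the existing field wins the merge, the nested
# geolocation data is silently lost, which matches what you see:
merged_other_way = {**decoded, **event}
assert merged_other_way["source"] == "/opt/intelmq/var/lib/bots/file-output/events.txt"
```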

Hi Mark,
I use Filebeat to send the data to Logstash and from there to ES.
Is it possible that Filebeat is modifying the data before it reaches Logstash? The value of the "source" field matches the path I configured in filebeat.yml under filebeat -> prospectors -> paths.
If so, is there a way to stop Filebeat from adding this field to the events it sends to Logstash?
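Filebeat does add a `source` field with the file path to every event, and that clashes with the `source` object in your JSON. One way around it (a sketch, assuming Logstash does the JSON decoding and the raw line arrives in the `message` field; the name `log_path` is just an example) is to rename Filebeat's field before parsing:

```conf
filter {
  # Move Filebeat's file path out of the way so the JSON
  # field "source" is free to use.
  mutate {
    rename => { "source" => "log_path" }
  }
  # Decode the JSON line; the decoded fields are merged into the
  # event root, so source.geolocation etc. should now survive.
  json {
    source => "message"
  }
}
```

Alternatively, the `json` filter's `target` option would put the whole decoded document under a separate key, avoiding the clash without renaming anything.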