Logstash is unable to process a JSON file

Hi All,

We are trying to load a JSON file into Elasticsearch using Logstash. It is just a direct upload, but it is not happening, and I am not getting any errors either; it seems to be stuck somewhere and I am unable to break it down. Please help.

Here is my config file:

input {
  file {
    codec => "json"
    path => "/opt/installables/Output.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
#  elasticsearch {
#    hosts => ""
#    user => "*****"
#    password => "*****"
#    index => "jsonlogs"
#  }
  stdout { codec => rubydebug }
}

Here is a sample of the JSON file's contents:

[
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:15:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.26"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:20:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.254"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:25:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.261"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:30:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.302"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:35:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.265"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:40:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.262"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:45:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.265"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:50:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.261"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 13:55:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.265"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:00:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.265"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:05:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.265"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:10:00", "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.264"},
  {"Device_ID": "/api/device/17", "Device_Name": "III-PROD-JB-129-72", "Collection_Time": "01-05-2019 14:15:00", "unix_cpu_utilization": "0.966", "unix_mem_utilization": "91.286"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:20:00", "unix_cpu_utilization": "0.966", "unix_mem_utilization": "91.284"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:25:00", "unix_cpu_utilization": "0.966", "unix_mem_utilization": "91.284"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:30:00", "unix_cpu_utilization": "0.966", "unix_mem_utilization": "91.326"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:35:00", "unix_cpu_utilization": "0.966", "unix_mem_utilization": "91.284"},
  {"Device_ID": "/api/device/17", "Device_Name": "device73", "Collection_Time": "01-05-2019 14:40:00", "unix_cpu_utilization": "0.966", "unix_mem_utilization": "91.287"}
]

Please advise on what I am missing here.
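In case it matters: in the actual file the whole array sits on a single line. Here is a minimal Python sketch (paths and sample values are made up) of how I could split it into newline-delimited JSON, one object per line, if that is what the file input expects:

```python
import json
import os
import tempfile

# Two made-up records shaped like the sample data above.
records = [
    {"Device_ID": "/api/device/17", "Device_Name": "device73",
     "Collection_Time": "01-05-2019 13:15:00",
     "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.26"},
    {"Device_ID": "/api/device/17", "Device_Name": "device73",
     "Collection_Time": "01-05-2019 13:20:00",
     "unix_cpu_utilization": "0.967", "unix_mem_utilization": "91.254"},
]

src = os.path.join(tempfile.mkdtemp(), "Output.json")
dst = src.replace(".json", ".ndjson")

# Write the file the way it looks today: one big array on a single line.
with open(src, "w") as f:
    json.dump(records, f)

# Re-write it as newline-delimited JSON: one object per line, so each
# line becomes its own event when read line by line.
with open(src) as f:
    data = json.load(f)
with open(dst, "w") as f:
    for rec in data:
        f.write(json.dumps(rec) + "\n")

lines = open(dst).read().splitlines()
print(len(lines))  # one line per record
```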


If you check your logs you will see the error message "Parsed JSON object/hash requires a target configuration option". It cannot guess a field name to parse the array into. You have to set the target option on the json filter.
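For reference, the change being suggested is just adding a target to the json filter, along these lines (the target name here is my own choice, not something from your config):

```
filter {
  json {
    source => "message"
    target => "doc"   # the parsed array lands under the [doc] field
  }
}
```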

@Badger I have verified the logs; there are no errors as such. Here are the log file entries:

[2019-05-03T15:50:28,492][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.3"}
[2019-05-03T15:50:33,501][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-05-03T15:50:33,882][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x622c49e0 run>"}
[2019-05-03T15:50:33,972][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-03T15:50:34,124][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-03T15:50:34,621][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

I have set the target option as well, but still no progress; it gets stuck at the same place.


@Badger I think I found the issue. There is a field in the JSON for the device name, and the name of the device is given as an FQDN (like "device.test.com"); because of that ".", the document is not being processed.

If I manually replace the "." with "-" (like "device-test-com"), then the data upload happens.

I am not sure how to handle this "." and proceed; please advise.
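To avoid the manual find-and-replace, I was thinking a mutate filter with gsub could do the substitution inside the pipeline; a sketch, assuming the field ends up named Device_Name after the json filter:

```
filter {
  mutate {
    # Replace every "." in the device name with "-",
    # e.g. "device.test.com" -> "device-test-com".
    gsub => [ "Device_Name", "\.", "-" ]
  }
}
```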


I cannot think of an explanation of why that would happen.

@Badger Is there any way we can ignore special characters and ingest the data using Logstash?

I do not understand why that would cause an issue, so I cannot suggest a way to fix it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.