JSON filter is not working

hi,
I'm trying to parse a JSON file as simply as possible: field:value, without touching the JSON order.

When I run Logstash, it starts, but nothing happens...

This is the Logstash console while running; it just sits there waiting for something.
c:\Install\6.2.2\logstash-6.2.2\bin>logstash -f logstashPipeLine_RAWjson.conf
Sending Logstash's logs to c:/Install/6.2.2/logstash-6.2.2/logs which is now configured via log4j2.properties
[2018-03-18T10:56:22,900][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"c:/Install/6.2.2/logstash-6.2.2/modules/fb_apache/configuration"}
[2018-03-18T10:56:22,931][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"c:/Install/6.2.2/logstash-6.2.2/modules/netflow/configuration"}
[2018-03-18T10:56:23,197][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-18T10:56:24,322][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.2"}
[2018-03-18T10:56:25,212][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-18T10:56:30,307][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-03-18T10:56:31,088][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6b5fa22f run>"}
[2018-03-18T10:56:31,212][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

This is my Logstash conf file:

input {
  file {
    type => "json"
    path => "C:/jsonExample.json"
  }
}

filter {
  json {
    source => "message"
    target => "myroot"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

This is the JSON file content:

{"results":[{"NodeID":1,"IPAddress":"126.0.3.5","Caption":"Shob-Bsm-Probe"},{"NodeID":2,"IPAddress":"126.0.3.13","Caption":"NPIA1D095"},{"NodeID":3,"IPAddress":"126.0.3.108","Caption":"HP V1910 Switch"},{"NodeID":4,"IPAddress":"126.0.3.41","Caption":"MOSHEZ"},{"NodeID":5,"IPAddress":"126.0.3.2","Caption":"SHOB-CMDB"},{"NodeID":6,"IPAddress":"126.0.3.30","Caption":"MIRIAM"},{"NodeID":7,"IPAddress":"126.0.3.55","Caption":"MAZIB"},{"NodeID":8,"IPAddress":"126.0.3.73","Caption":"CTX3"},{"NodeID":9,"IPAddress":"126.0.3.84","Caption":"AMANUI6"},{"NodeID":10,"IPAddress":"126.0.3.104","Caption":"MEITALR"},{"NodeID":11,"IPAddress":"126.0.3.125","Caption":"CITRIXMASTER"},{"NodeID":12,"IPAddress":"126.0.3.96","Caption":"KMSTESTINGSRV"},{"NodeID":13,"IPAddress":"126.0.3.4","Caption":"SHOBDB1"},{"NodeID":14,"IPAddress":"126.0.3.12","Caption":"IRISBD"},{"NodeID":15,"IPAddress":"126.0.3.34","Caption":"RITASA"},{"NodeID":16,"IPAddress":"126.0.3.100","Caption":"DANADA"},{"NodeID":17,"IPAddress":"126.0.3.124","Caption":"LIATS-PC"},{"NodeID":18,"IPAddress":"126.0.3.3","Caption":"SHOB-DB"},{"NodeID":19,"IPAddress":"126.0.3.29","Caption":"CTX5"},{"NodeID":20,"IPAddress":"126.0.3.87","Caption":"CTX4"},{"NodeID":21,"IPAddress":"126.0.3.1","Caption":"SHOB-BSM"},{"NodeID":22,"IPAddress":"126.0.3.65","Caption":"MAZIB"},{"NodeID":23,"IPAddress":"126.0.3.129","Caption":"SHUKI"},{"NodeID":24,"IPAddress":"126.0.3.193","Caption":"ESTERD"},{"NodeID":25,"IPAddress":"126.0.3.211","Caption":"IN4MATICA01"},{"NodeID":26,"IPAddress":"126.0.0.13","Caption":"AMANMAIL"},{"NodeID":27,"IPAddress":"126.0.0.138","Caption":"AMANCMC"},{"NodeID":28,"IPAddress":"126.0.0.42","Caption":"CITRIX-XDC"},{"NodeID":29,"IPAddress":"126.0.0.180","Caption":"CITRIX-XDC2"},{"NodeID":30,"IPAddress":"126.0.0.62","Caption":"DEVINFO"},{"NodeID":31,"IPAddress":"126.0.0.3","Caption":"AMANNNM"},{"NodeID":32,"IPAddress":"126.0.0.74","Caption":"TS-SEC"},{"NodeID":33,"IPAddress":"126.0.0.101","Caption":"PRIORITYDEV"},{"NodeID":34,"IPAddress":"126.0.1.36","Caption":"DEVTFSSEC"},{"NodeID":35,"IPAddress":"126.0.0.173","Caption":"PRIORITYTEST"},{"NodeID":36,"IPAddress":"126.0.0.125","Caption":"PRIORITYPROD"},{"NodeID":37,"IPAddress":"126.0.0.157","Caption":"AMANID"},{"NodeID":38,"IPAddress":"126.0.0.181","Caption":"AMANMB02"},{"NodeID":39,"IPAddress":"126.0.0.227","Caption":"AMANMB03"},{"NodeID":40,"IPAddress":"126.0.0.129","Caption":"IN4MATICAVMPOC3"},{"NodeID":41,"IPAddress":"126.0.0.116","Caption":"AMANMB04"},{"NodeID":42,"IPAddress":"126.0.0.2","Caption":"FORMIT8"},{"NodeID":43,"IPAddress":"126.0.0.177","Caption":"AMANPRTG"},{"NodeID":44,"IPAddress":"126.0.0.199","Caption":"IN4MATICA02"},{"NodeID":45,"IPAddress":"126.0.0.198","Caption":"IN4MATICA03"},{"NodeID":46,"IPAddress":"126.0.0.197","Caption":"AMANEX1"},{"NodeID":47,"IPAddress":"126.0.0.195","Caption":"AMANEX2"},{"NodeID":76,"IPAddress":"192.168.10.247","Caption":"192.168.10.247"},{"NodeID":77,"IPAddress":"192.168.10.248","Caption":"192.168.10.248"},{"NodeID":78,"IPAddress":"192.168.10.249","Caption":"192.168.10.249"},{"NodeID":79,"IPAddress":"192.168.10.250","Caption":"192.168.10.250"}]}

Any ideas?

Thanks,
David

@David_Gidony A couple of things:

  1. The file input needs two additional parameters: sincedb_path and start_position.
  2. Since the `results` field in the JSON is an array, you might need to use the split filter to generate an event for each element of the array.

Something similar to the below config might do the trick:

input {  
	file {
		type => "json"
		path => "C:/jsonExample.json"
		sincedb_path => "c:/sincedb"
		start_position => "beginning"
	}
}

filter {
	split { field => "[results]" }
}

output {
	stdout {
		codec => rubydebug
	}
}

Is what you've posted on a single line in the file? Or does the file look exactly like what you posted? In the latter case you need to use a multiline codec to join the lines in the file into a single Logstash event that you can process with the json filter.
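For reference, an untested sketch of such a multiline codec (the pattern `^Spalanzani` is just an arbitrary string assumed never to occur in the file, so with `negate => true` and `what => "previous"` every line gets joined to the previous one; `auto_flush_interval` flushes the buffered event after one second of inactivity, since no further line will ever terminate the match):

```
input {
  file {
    path => "C:/jsonExample.json"
    start_position => "beginning"
    sincedb_path => "NUL"                # Windows equivalent of /dev/null
    codec => multiline {
      pattern => "^Spalanzani"           # assumed to match no real line
      negate => true
      what => "previous"                 # join every line to the previous one
      auto_flush_interval => 1           # emit the joined event after 1s of silence
    }
  }
}
```

You could then run the joined `message` through the json filter as in your original config.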

Hi, I opened the file with Notepad and copied/pasted its content...
Can you please post an example?

Thanks

Still not working.
The pipeline is starting but no data is being processed :frowning:

Any other ideas?

Examples of using the multiline codec to join all lines of a file (not necessarily a JSON file!) into a single event have been posted in the past. I don't have time to dig them up.

Managed to make it work with this:

input {
	file{
		path => ["/home/dudug/Desktop/output_CPU.json"]
		codec => "json"
		start_position => "beginning"
		sincedb_path => "/dev/null"
	}
}

filter {
	mutate {
		gsub => [ "message", "\},\{", "\r\n" ]
	}

	json {
		source => "message"
	}

	date {
		match => ["DateTime", "ISO8601"]
	}
}

output {
	stdout {
		codec => rubydebug
	}
}

Thanks for your help, guys.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.