Logstash freezing after starting API endpoint

Hi guys,
I am new to Logstash.
I am trying to load a small JSON file into Elasticsearch with Logstash. It reads my config and then freezes after "Successfully started Logstash API endpoint". No errors are shown.
Below is my config file:
input {
  file {
    path => "C:\Development\LogStashData\Products.json"
    codec => "json"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "products"
  }
  stdout { codec => rubydebug }
}

Below is the console output:
PS C:\Development\logstash-7.9.3> .\bin\logstash -f "C:\Development\LogStashData\logstash-simple.conf"
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/C:/Users/sdont/AppData/Local/Temp/jruby-25628/jruby17635428283013828612jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/Development/logstash-7.9.3/logs which is now configured via log4j2.properties
2020-10-28 22:52:48,440 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby
2020-10-28 22:52:48,462 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby
2020-10-28 22:52:48,627 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby
[2020-10-28T22:52:48,662][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.9.3", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 15.0.1+9-18 on 15.0.1+9-18 +indy +jit [mswin32-x86_64]"}
[2020-10-28T22:52:48,814][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-10-28T22:52:50,259][INFO ][org.reflections.Reflections] Reflections took 39 ms to scan 1 urls, producing 22 keys and 45 values
[2020-10-28T22:52:52,008][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-10-28T22:52:52,173][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-10-28T22:52:52,220][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-10-28T22:52:52,224][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-28T22:52:52,287][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2020-10-28T22:52:52,345][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-10-28T22:52:52,377][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Development/LogStashData/logstash-simple.conf"], :thread=>"#<Thread:0xab8ed61 run>"}
[2020-10-28T22:52:52,423][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-10-28T22:52:53,226][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.85}
[2020-10-28T22:52:53,894][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Development/logstash-7.9.3/data/plugins/inputs/file/.sincedb_411b5abf218957798fe4dde659f4ed0a", :path=>["C:\Development\LogStashData\Products.json"]}
[2020-10-28T22:52:53,920][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-10-28T22:52:53,967][INFO ][filewatch.observingtail ][main][c870fdd724f25b417c2c739307e35cee2c1b2828010362ee2b30ec177127786c] START, creating Discoverer, Watch with file and sincedb collections
[2020-10-28T22:52:53,978][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
[2020-10-28T22:52:54,430][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

After this I can't see any progress: no errors, no logs, nothing.
Thank you in advance.

Do not use backslashes in the path option of a file input; they are treated as escapes. Change them to forward slashes.

If that does not help then set log.level to trace and see what messages the filewatch code logs.
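
For example, the original input block rewritten with forward slashes (a sketch, other options unchanged):

input {
  file {
    # forward slashes instead of backslashes, which Logstash treats as escapes
    path => "C:/Development/LogStashData/Products.json"
    codec => "json"
    start_position => "beginning"
  }
}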

@Badger Thank you. Changing the backslashes to forward slashes worked.
Now I have another issue. I have a simple JSON file with the following data:

{
  "productidentifiers": [
    "9780799381832"
  ],
  "cop": "South Africa",
  "title": "Sacrificed"
}

Now the problems are:

  1. It is writing a separate entry in Elasticsearch for each JSON property; it recorded 7 documents for the above JSON file, where I was expecting one document.

  2. It is not writing any logs to the configured directory: C:/Development/logstash-7.9.3/logs

Below is my config file:

input {
  file {
    type => "json"
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "testindex"
  }
}

Please advise. Thank you for your time. Below is the on-screen output:

Error parsing json {:source=>"message", :raw=>"\t\"productidentifiers\": [", :exception=>#<LogStash::Json::ParserError: Unexpected character (':' (code 58)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: (byte[])"  "productidentifiers": ["; line: 1, column: 23]>}
{
          "path" => "C:/Development/LogStashData/sample.json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
          "host" => "SDontula-PC",
    "@timestamp" => 2020-10-31T14:06:02.459Z,
       "message" => "\t}{",
          "type" => "json"
}
{
          "path" => "C:/Development/LogStashData/sample.json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
          "host" => "SDontula-PC",
    "@timestamp" => 2020-10-31T14:06:02.463Z,
       "message" => "\t],",
          "type" => "json"
}
{
          "path" => "C:/Development/LogStashData/sample.json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
          "host" => "SDontula-PC",
    "@timestamp" => 2020-10-31T14:06:02.464Z,
       "message" => "\t\"title\": \"Sacrificed\"",
          "type" => "json"
}
{
          "path" => "C:/Development/LogStashData/sample.json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
          "host" => "SDontula-PC",
    "@timestamp" => 2020-10-31T14:06:02.463Z,
       "message" => "\t\t\"9780799381832\"",
          "type" => "json"
}
{
          "path" => "C:/Development/LogStashData/sample.json",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
          "host" => "SDontula-PC",
    "@timestamp" => 2020-10-31T14:06:02.462Z,
       "message" => "\t\"productidentifiers\": [",
          "type" => "json"
}

If your JSON files are pretty-printed then you will need to use a multiline codec on the file input to combine all the lines of a single JSON object into one event.

If you want to read the entire file as a single event you can use this configuration.
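
Roughly, that means a multiline codec whose pattern never matches any real line, so every line gets appended to the previous event. A minimal sketch (the pattern string is just a placeholder that must not occur in the data):

    codec => multiline {
      # a pattern that never matches, so no line ever starts a new event
      pattern => "^NeverMatchThis"
      negate => true
      what => "previous"
      # no later line will ever trigger a flush, so emit the accumulated event after 1 second
      auto_flush_interval => 1
    }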

Hi @Badger
Thank you for helping with this issue so far. I tried adding the multiline codec, but unfortunately it is still not writing everything as one document. Please review and suggest what I am doing wrong.
My config now looks like this:

input {
  file {
    type => "json"
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "json"
      what => "previous"
    }
  }
}

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "testindex"
  }
}

My input JSON file is:

[
  {
    "sku": "9781984806338",
    "name": "THE CHOSEN ONE",
    "price": 290.00,
    "publisher": "PENGUIN ADULT"
  },
  {
   "sku": "9781775847410",
    "name": "FIRST FIELD GUIDE TO MUSHROOMS OF SOUTHERN AFRICA",
    "price": 80.00,
    "publisher": "STRUIK"
  },
  {
    
   "sku": "9781784701994",
    "name": "WHEN BREATH BECOMES AIR",
    "price": 290.00,
    "publisher": "VINTAGE"
  },
  {
   "sku": "9780755322824",
    "name": "STARDUST",
    "price": 182.75,
    "publisher": "HEADLINE"
  }
]

Set negate => true on the codec and add a target option to the json filter (the input is an array, so the target option is required).
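
With negate => true, any line that does not match the pattern is joined to the previous line; target is needed because the parsed value is an array and cannot be merged into the root of the event. A sketch of just those two changes against the config above:

    codec => multiline {
      pattern => "json"
      # join lines that do NOT match the pattern onto the previous line
      negate => true
      what => "previous"
    }

filter {
  json {
    source => "message"
    # the parsed value is an array, so it has to be stored under a named field
    target => "parsedJson"
  }
}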

Hi @Badger
Thank you for your reply.
I tried the suggested options, but unfortunately it is freezing again: no logs are written and no errors are displayed on the console. Kindly advise what I am doing wrong.

input {
  file {
    type => "json"
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "json"
      what => "previous"
      negate => true
    }
  }
}

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
    target => "parsedJson"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "testindex"
  }
}

Following is the console output:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/C:/Users/sdont/AppData/Local/Temp/jruby-3444/jruby12365318988486887759jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/Development/logstash-7.9.3/logs which is now configured via log4j2.properties
2020-11-01 19:40:41,105 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby
2020-11-01 19:40:41,121 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby
2020-11-01 19:40:41,276 main ERROR No ScriptEngine found for language JavaScript. Available languages are: ruby, jruby
[2020-11-01T19:40:41,310][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.3", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 15.0.1+9-18 on 15.0.1+9-18 +indy +jit [mswin32-x86_64]"}
[2020-11-01T19:40:41,465][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-11-01T19:40:43,028][INFO ][org.reflections.Reflections] Reflections took 47 ms to scan 1 urls, producing 22 keys and 45 values
[2020-11-01T19:40:45,444][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-11-01T19:40:45,617][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-11-01T19:40:45,677][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-11-01T19:40:45,677][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-11-01T19:40:45,788][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-11-01T19:40:45,851][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-11-01T19:40:45,901][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Development/logstash-7.9.3/config/test.conf"], :thread=>"#<Thread:0x95829fa run>"}
[2020-11-01T19:40:45,933][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-11-01T19:40:46,903][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.0}
[2020-11-01T19:40:47,688][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Development/logstash-7.9.3/data/plugins/inputs/file/.sincedb_46a48097852ad9cac593d40f2d9e34d5", :path=>["C:/Development/LogStashData/sample.json"]}
[2020-11-01T19:40:47,717][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-11-01T19:40:47,779][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-11-01T19:40:47,786][INFO ][filewatch.observingtail  ][main][a2f899cb7c7b6cf8c203b81746c56a97db85c39cea469619164a5538f9f0044e] START, creating Discoverer, Watch with file and sincedb collections
[2020-11-01T19:40:48,279][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Thank you

Add auto_flush_interval => 1 to the multiline codec options.
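
Without it, the codec keeps waiting for a line that would start the next event before emitting the current one, so the last (here, the only) event is never flushed. A sketch of the codec with that option added:

    codec => multiline {
      pattern => "json"
      negate => true
      what => "previous"
      # emit whatever has accumulated once no new line has arrived for 1 second
      auto_flush_interval => 1
    }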

Hi @Badger
I have added it, but it is still freezing. Am I missing anything? Below is my config.

input {
  file {
    type => "json"
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "json"
      what => "previous"
      negate => true
      auto_flush_interval => 1
    }
  }
}

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
    target => "parsedJson"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "testindex"
  }
}

As I said before... set log.level to trace and see what messages the filewatch code logs.
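
For example, trace logging can be enabled for a single run from the command line (it can also be set as log.level in logstash.yml); using the config path shown in the startup logs above:

PS C:\Development\logstash-7.9.3> .\bin\logstash --log.level trace -f "C:\Development\logstash-7.9.3\config\test.conf"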

Hi @Badger, thank you.
I did set it to trace, but I can't find any useful info. Please see if you can spot the issue.
Just to mention: my environment is Windows 10, 64-bit.

[2020-11-02T19:49:12,885][TRACE][logstash.codecs.multiline] Registered multiline plugin {:type=>nil, :config=>{"pattern"=>"json", "what"=>"previous", "id"=>"f975f0bd-ed59-4325-bc28-8cd2c68a0417", "auto_flush_interval"=>1, "negate"=>true, "enable_metric"=>true, "patterns_dir"=>[], "charset"=>"UTF-8", "multiline_tag"=>"multiline", "max_lines"=>500, "max_bytes"=>10485760}}
[2020-11-02T19:49:12,938][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@start_position = "beginning"
[2020-11-02T19:49:12,939][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@path = ["C:/Development/LogStashData/sample.json"]
 9:13,611][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "testindex"
[2020-11-02T19:49:13,620][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [http://localhost:9200]

[2020-11-02T19:49:14,487][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-11-02T19:49:14,574][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-11-02T19:49:14,642][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-11-02T19:49:14,685][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Development/logstash-7.9.3/config/test.conf"], :thread=>"#<Thread:0xdb26e03 run>"}
[2020-11-02T19:49:14,723][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-11-02T19:49:14,749][DEBUG][logstash.outputs.elasticsearch][main] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2020-11-02T19:49:14,768][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-11-02T19:49:14,956][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2020-11-02T19:49:14,961][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
 
[2020-11-02T19:49:15,415][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled filter
 P[filter-json{"source"=>"message", "tag_on_failure"=>["_jsonparsefailure"], "target"=>"parsedJson"}|[file]C:/Development/logstash-7.9.3/config/test.conf:16:3:```
json {
    source => "message"
        tag_on_failure => [ "_jsonparsefailure" ]
        target => "parsedJson"
  }
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@53e5cb12
 
 
[2020-11-02T19:49:15,692][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.0}
[2020-11-02T19:49:16,398][TRACE][logstash.inputs.file     ][main] Registering file input {:path=>["C:/Development/LogStashData/sample.json"]}
[2020-11-02T19:49:16,438][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Development/logstash-7.9.3/data/plugins/inputs/file/.sincedb_46a48097852ad9cac593d40f2d9e34d5", :path=>["C:/Development/LogStashData/sample.json"]}
[2020-11-02T19:49:16,465][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-11-02T19:49:16,476][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-11-02T19:49:16,478][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0xdb26e03 run>"}
[2020-11-02T19:49:16,501][TRACE][logstash.agent           ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2020-11-02T19:49:16,522][INFO ][filewatch.observingtail  ][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] START, creating Discoverer, Watch with file and sincedb collections
[2020-11-02T19:49:16,539][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-11-02T19:49:16,557][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: reading from C:/Development/logstash-7.9.3/data/plugins/inputs/file/.sincedb_46a48097852ad9cac593d40f2d9e34d5
[2020-11-02T19:49:16,580][DEBUG][logstash.agent           ] Starting puma
[2020-11-02T19:49:16,591][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-11-02T19:49:16,595][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: importing ... '2330359544-569231-131072 0 0' => '507 1604339208.643 C:/Development/LogStashData/sample.json'
[2020-11-02T19:49:16,604][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: setting #<struct FileWatch::InodeStruct inode="2330359544-569231-131072", maj=0, min=0> to #<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339208.643, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=nil, @position=507>
[2020-11-02T19:49:16,608][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: count of keys read: 1
[2020-11-02T19:49:16,648][TRACE][filewatch.discoverer     ][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] discover_files {:count=>1}
[2020-11-02T19:49:16,670][DEBUG][logstash.api.service     ] [api-service] start
[2020-11-02T19:49:16,736][TRACE][filewatch.discoverer     ][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] handling: {:new_discovery=>true, :watched_file=>"<FileWatch::WatchedFile: @filename='sample.json', @state=:watched, @recent_states=[:watched], @bytes_read=0, @bytes_unread=0, current_size=507, last_stat_size=507, file_open?=false, @initial=true, sincedb_key='2330359544-569231-131072 0 0'>"}
[2020-11-02T19:49:16,763][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] associate: finding {:path=>"C:/Development/LogStashData/sample.json", :inode=>"2330359544-569231-131072"}
[2020-11-02T19:49:16,772][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] associate: found sincedb record {:filename=>"sample.json", :sincedb_key=>#<struct FileWatch::InodeStruct inode="2330359544-569231-131072", maj=0, min=0>, :sincedb_value=>#<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339208.643, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=nil, @position=507>}
[2020-11-02T19:49:16,795][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] handle_association fully read, ignoring..... {:watched_file=>"<FileWatch::WatchedFile: @filename='sample.json', @state=:ignored, @recent_states=[:watched, :watched], @bytes_read=507, @bytes_unread=0, current_size=507, last_stat_size=507, file_open?=false, @initial=false, sincedb_key='2330359544-569231-131072 0 0'>", :sincedb_value=>#<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339356.785, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=<FileWatch::WatchedFile: @filename='sample.json', @state=:ignored, current_size=507, sincedb_key='2330359544-569231-131072 0 0'>, @position=507>}
[2020-11-02T19:49:16,803][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] associate: inode and path matched
[2020-11-02T19:49:16,837][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:16,846][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_ignored
[2020-11-02T19:49:16,878][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_delayed_delete
[2020-11-02T19:49:16,887][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_restat_for_watched_and_active
[2020-11-02T19:49:16,896][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_rotation_in_progress
[2020-11-02T19:49:16,906][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_watched
[2020-11-02T19:49:16,915][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_active
[2020-11-02T19:49:17,084][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600} 
[2020-11-02T19:49:19,792][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-11-02T19:49:19,994][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:19,995][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_ignored
[2020-11-02T19:49:20,001][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_delayed_delete
[2020-11-02T19:49:20,003][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_restat_for_watched_and_active
[2020-11-02T19:49:20,005][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_rotation_in_progress
[2020-11-02T19:49:20,006][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_watched
[2020-11-02T19:49:20,007][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_active
[2020-11-02T19:49:20,030][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2020-11-02T19:49:20,032][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2020-11-02T19:49:21,017][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:21,018][TRACE][filewatch.tailmode.processor][main]
[2020-11-02T19:49:21,021][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_restat_for_watched_and_active
[2020-11-02T19:49:21,024][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_rotation_in_progress
[2020-11-02T19:49:21,025][TRACE][filewatch.tailmode.processor][main]
[2020-11-02T19:49:21,477][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-11-02T19:49:22,031][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:22,033][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_ignored

The sincedb contains a record that tells Logstash it has already read 507 bytes from the file:

[filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: importing ... '2330359544-569231-131072 0 0' => '507 1604339208.643 C:/Development/LogStashData/sample.json'

The current size of the file is 507 bytes, so the file input will wait for something to be appended to it before processing any data.

[filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] handle_association fully read, ignoring..... {:watched_file=>"<FileWatch::WatchedFile: @filename='sample.json', @state=:ignored, @recent_states=[:watched, :watched], @bytes_read=507, @bytes_unread=0, current_size=507, last_stat_size=507, file_open?=false, @initial=false, sincedb_key='2330359544-569231-131072 0 0'>", :sincedb_value=>#<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339356.785, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=<FileWatch::WatchedFile: @filename='sample.json', @state=:ignored, current_size=507, sincedb_key='2330359544-569231-131072 0 0'>, @position=507>}
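
Deleting the sincedb file forces a full re-read. For repeated testing, a common alternative (a sketch, assuming the Windows null device mentioned in the file input docs) is to stop persisting the read position at all:

  file {
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    # "NUL" is the Windows equivalent of /dev/null: the read position is never saved,
    # so the file is re-read from the beginning on every run (testing only)
    sincedb_path => "NUL"
    # remaining options (type, codec) as in the config above
  }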

@Badger
Thank you. Is 507 bytes too small for Logstash to process? Is there a minimum size requirement?

@Badger
Apologies; if I remove the old sincedb and run Logstash again, it creates a document. Thank you.

However, my input file contains an array of four items, but it wrote only one document (the first).

I am expecting four documents. Below is the input file:

[
  {
    "sku": "9781984806338",
    "name": "THE CHOSEN ONE",
    "price": 290.00,
    "publisher": "PENGUIN ADULT"
  },
  {
   "sku": "9781775847410",
    "name": "FIRST FIELD GUIDE TO MUSHROOMS OF SOUTHERN AFRICA",
    "price": 80.00,
    "publisher": "STRUIK"
  },
  {
    
   "sku": "9781784701994",
    "name": "WHEN BREATH BECOMES AIR",
    "price": 290.00,
    "publisher": "VINTAGE"
  },
  {
   "sku": "9780755322824",
    "name": "STARDUST",
    "price": 182.75,
    "publisher": "HEADLINE"
  }
]

If you want four documents in elasticsearch then add

split { field=> "parsedJson" }

to your filter section, after the json filter.
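
split emits one copy of the event per element of the array field; sketched in context with the json filter above:

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
    target => "parsedJson"
  }
  # one event (and therefore one Elasticsearch document) per array element
  split { field => "parsedJson" }
}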

Hi @Badger

I tried that option, but no luck.

input {
  file {
    type => "json"
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "json"
      what => "previous"
      negate => true
      auto_flush_interval => 1
    }
  }
}

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
    target => "parsedJson"
  }
  split { field => "parsedJson" }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "testindex"
  }
}

In Kibana, on the Discover page, if you expand one of the events and switch to the JSON tab then what does the JSON look like?

Hi @Badger,
elasticsearch-head shows the following for my index, which is the first document from the JSON input:

{
  "_index": "testindex",
  "_type": "_doc",
  "_id": "uoN1inUBKkR_FoPHNjV1",
  "_version": 1,
  "_score": 1,
  "_source": {
    "tags": [
      "multiline",
      "_jsonparsefailure",
      "_split_type_failure"
    ],
    "@version": "1",
    "path": "C:/Development/LogStashData/sample.json",
    "message": "[ { "sku": "9781984806338", "name": "THE CHOSEN ONE", "price": 290.00, "publisher": "PENGUIN ADULT" }, { "sku": "9781775847410", "name": "FIRST FIELD GUIDE TO MUSHROOMS OF SOUTHERN AFRICA", "price": 80.00, "publisher": "STRUIK" }, { "sku": "9781784701994", "name": "WHEN BREATH BECOMES AIR", "price": 290.00, "publisher": "VINTAGE" }, { "sku": "9780755322824", "name": "STARDUST", "price": 182.75, "publisher": "HEADLINE" }",
    "host": "SDontula-PC",
    "type": "json",
    "@timestamp": "2020-11-02T19:35:38.175Z"
  }
}

The split failed because the json filter failed. The message field is not valid JSON. In this specific case you can fix it using

mutate { gsub => [ "message", "$", "]" ] }
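
For the gsub to help, it has to run before the json filter so the parser sees the repaired message; a sketch of the filter section in that order (same options as used above):

filter {
  # append the missing closing bracket before parsing
  mutate { gsub => [ "message", "$", "]" ] }
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
    target => "parsedJson"
  }
  # one event per element of the parsed array
  split { field => "parsedJson" }
}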

Hi @Badger, thank you.
But unfortunately it is still not writing separate documents. Below is the updated config. Please advise.

input {
  file {
    type => "json"
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "json"
      what => "previous"
      negate => true
      auto_flush_interval => 1
    }
  }
}

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
    target => "parsedJson"
  }
  mutate { gsub => [ "message", "$", "]" ] }
  split { field => "parsedJson" }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "testindex"
  }
}