Hi @Badger, thank you.
I did set the log level to trace, but I can't spot anything useful myself. Please see if you can find the issue in the log below.
Just to mention: my environment is Windows 10, 64-bit.
[2020-11-02T19:49:12,885][TRACE][logstash.codecs.multiline] Registered multiline plugin {:type=>nil, :config=>{"pattern"=>"json", "what"=>"previous", "id"=>"f975f0bd-ed59-4325-bc28-8cd2c68a0417", "auto_flush_interval"=>1, "negate"=>true, "enable_metric"=>true, "patterns_dir"=>[], "charset"=>"UTF-8", "multiline_tag"=>"multiline", "max_lines"=>500, "max_bytes"=>10485760}}
[2020-11-02T19:49:12,938][DEBUG][logstash.inputs.file ] config LogStash::Inputs::File/@start_position = "beginning"
[2020-11-02T19:49:12,939][DEBUG][logstash.inputs.file ] config LogStash::Inputs::File/@path = ["C:/Development/LogStashData/sample.json"]
9:13,611][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "testindex"
[2020-11-02T19:49:13,620][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [http://localhost:9200]
[2020-11-02T19:49:14,487][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-11-02T19:49:14,574][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-11-02T19:49:14,642][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-11-02T19:49:14,685][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Development/logstash-7.9.3/config/test.conf"], :thread=>"#<Thread:0xdb26e03 run>"}
[2020-11-02T19:49:14,723][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-11-02T19:49:14,749][DEBUG][logstash.outputs.elasticsearch][main] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2020-11-02T19:49:14,768][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-11-02T19:49:14,956][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2020-11-02T19:49:14,961][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2020-11-02T19:49:15,415][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled filter
P[filter-json{"source"=>"message", "tag_on_failure"=>["_jsonparsefailure"], "target"=>"parsedJson"}|[file]C:/Development/logstash-7.9.3/config/test.conf:16:3:```
json {
source => "message"
tag_on_failure => [ "_jsonparsefailure" ]
target => "parsedJson"
}
```]
into
org.logstash.config.ir.compiler.ComputeStepSyntaxElement@53e5cb12
[2020-11-02T19:49:15,692][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.0}
[2020-11-02T19:49:16,398][TRACE][logstash.inputs.file ][main] Registering file input {:path=>["C:/Development/LogStashData/sample.json"]}
[2020-11-02T19:49:16,438][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Development/logstash-7.9.3/data/plugins/inputs/file/.sincedb_46a48097852ad9cac593d40f2d9e34d5", :path=>["C:/Development/LogStashData/sample.json"]}
[2020-11-02T19:49:16,465][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-11-02T19:49:16,476][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-11-02T19:49:16,478][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0xdb26e03 run>"}
[2020-11-02T19:49:16,501][TRACE][logstash.agent ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2020-11-02T19:49:16,522][INFO ][filewatch.observingtail ][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] START, creating Discoverer, Watch with file and sincedb collections
[2020-11-02T19:49:16,539][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-11-02T19:49:16,557][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: reading from C:/Development/logstash-7.9.3/data/plugins/inputs/file/.sincedb_46a48097852ad9cac593d40f2d9e34d5
[2020-11-02T19:49:16,580][DEBUG][logstash.agent ] Starting puma
[2020-11-02T19:49:16,591][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2020-11-02T19:49:16,595][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: importing ... '2330359544-569231-131072 0 0' => '507 1604339208.643 C:/Development/LogStashData/sample.json'
[2020-11-02T19:49:16,604][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: setting #<struct FileWatch::InodeStruct inode="2330359544-569231-131072", maj=0, min=0> to #<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339208.643, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=nil, @position=507>
[2020-11-02T19:49:16,608][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] open: count of keys read: 1
[2020-11-02T19:49:16,648][TRACE][filewatch.discoverer ][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] discover_files {:count=>1}
[2020-11-02T19:49:16,670][DEBUG][logstash.api.service ] [api-service] start
[2020-11-02T19:49:16,736][TRACE][filewatch.discoverer ][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] handling: {:new_discovery=>true, :watched_file=>"<FileWatch::WatchedFile: @filename='sample.json', @state=:watched, @recent_states=[:watched], @bytes_read=0, @bytes_unread=0, current_size=507, last_stat_size=507, file_open?=false, @initial=true, sincedb_key='2330359544-569231-131072 0 0'>"}
[2020-11-02T19:49:16,763][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] associate: finding {:path=>"C:/Development/LogStashData/sample.json", :inode=>"2330359544-569231-131072"}
[2020-11-02T19:49:16,772][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] associate: found sincedb record {:filename=>"sample.json", :sincedb_key=>#<struct FileWatch::InodeStruct inode="2330359544-569231-131072", maj=0, min=0>, :sincedb_value=>#<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339208.643, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=nil, @position=507>}
[2020-11-02T19:49:16,795][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] handle_association fully read, ignoring..... {:watched_file=>"<FileWatch::WatchedFile: @filename='sample.json', @state=:ignored, @recent_states=[:watched, :watched], @bytes_read=507, @bytes_unread=0, current_size=507, last_stat_size=507, file_open?=false, @initial=false, sincedb_key='2330359544-569231-131072 0 0'>", :sincedb_value=>#<FileWatch::SincedbValue:0x2f6fa4c9 @last_changed_at=1604339356.785, @path_in_sincedb="C:/Development/LogStashData/sample.json", @watched_file=<FileWatch::WatchedFile: @filename='sample.json', @state=:ignored, current_size=507, sincedb_key='2330359544-569231-131072 0 0'>, @position=507>}
[2020-11-02T19:49:16,803][TRACE][filewatch.sincedbcollection][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] associate: inode and path matched
[2020-11-02T19:49:16,837][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:16,846][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_ignored
[2020-11-02T19:49:16,878][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_delayed_delete
[2020-11-02T19:49:16,887][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_restat_for_watched_and_active
[2020-11-02T19:49:16,896][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_rotation_in_progress
[2020-11-02T19:49:16,906][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_watched
[2020-11-02T19:49:16,915][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_active
[2020-11-02T19:49:17,084][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-11-02T19:49:19,792][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2020-11-02T19:49:19,994][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:19,995][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_ignored
[2020-11-02T19:49:20,001][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_delayed_delete
[2020-11-02T19:49:20,003][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_restat_for_watched_and_active
[2020-11-02T19:49:20,005][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_rotation_in_progress
[2020-11-02T19:49:20,006][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_watched
[2020-11-02T19:49:20,007][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_active
[2020-11-02T19:49:20,030][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2020-11-02T19:49:20,032][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2020-11-02T19:49:21,017][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:21,018][TRACE][filewatch.tailmode.processor][main]
[2020-11-02T19:49:21,021][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_restat_for_watched_and_active
[2020-11-02T19:49:21,024][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_rotation_in_progress
[2020-11-02T19:49:21,025][TRACE][filewatch.tailmode.processor][main]
[2020-11-02T19:49:21,477][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-11-02T19:49:22,031][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_closed
[2020-11-02T19:49:22,033][TRACE][filewatch.tailmode.processor][main][a87f64d14b57cf0ca455f5d5c4ba92d84a69557baa62fd6986a2ff5b4f535c04] process_ignored
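One thing the trace does show: the sincedb import records position 507 for sample.json, which equals current_size=507, so the file input treats the file as already fully read and ignores it ("handle_association fully read, ignoring"). A minimal sketch of an input block that forces a fresh read on every run (assuming it is acceptable to discard read-position tracking; "NUL" is the Windows null device, "/dev/null" on Linux), with the multiline codec settings taken from the registration line at the top of the log:

```
input {
  file {
    path => "C:/Development/LogStashData/sample.json"
    start_position => "beginning"
    # Discard the read-position bookkeeping so the file is re-read each run.
    sincedb_path => "NUL"
    codec => multiline {
      pattern => "json"
      negate => true
      what => "previous"
      auto_flush_interval => 1
    }
  }
}
```

Alternatively, deleting the generated .sincedb_46a48097852ad9cac593d40f2d9e34d5 file under data/plugins/inputs/file/ before restarting Logstash should have the same effect for a one-off test.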