I tried to interpret the logs: Logstash is actually processing the file, but I'm wondering why it isn't displaying any output. Please check whether the log below helps in finding the issue; my own reading of the last few trace lines follows after the log.
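For reference, a pipeline of roughly this shape would produce the trace below. The file path and the NUL sincedb are taken from the log itself; start_position and the stdout output are my assumptions about a minimal test setup:

input {
  file {
    path => "C:/Ashok/logstash-7.1.1/files/inputFile1.txt"
    sincedb_path => "NUL"           # matches "sincedb_write: to: NUL" in the trace
    start_position => "beginning"   # assumed; the trace shows reading from position 0
  }
}
output {
  stdout { codec => rubydebug }     # assumed output; nothing is printed either way
}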
org.logstash.config.ir.compiler.ComputeStepSyntaxElement@c96a6eaa
[2019-06-19T11:27:51,478][TRACE][logstash.inputs.file ] Registering file input {:path=>["C:/Ashok/logstash-7.1.1/files/inputFile1.txt"]}
[2019-06-19T11:27:51,529][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-19T11:27:51,538][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x790907fd run>"}
[2019-06-19T11:27:51,551][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2019-06-19T11:27:51,576][TRACE][logstash.agent ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2019-06-19T11:27:51,621][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-19T11:27:51,621][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-19T11:27:51,656][DEBUG][logstash.agent ] Starting puma
[2019-06-19T11:27:51,673][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2019-06-19T11:27:51,674][TRACE][filewatch.sincedbcollection] open: reading from NUL
[2019-06-19T11:27:51,682][TRACE][filewatch.sincedbcollection] open: count of keys read: 0
[2019-06-19T11:27:51,738][TRACE][filewatch.discoverer ] discover_files {"count"=>1}
[2019-06-19T11:27:51,743][DEBUG][logstash.api.service ] [api-service] start
[2019-06-19T11:27:51,839][TRACE][filewatch.discoverer ] discover_files handling: {"new discovery"=>true, "watched_file details"=>"<FileWatch::WatchedFile: @filename='inputFile1.txt', @state='watched', @recent_states='[:watched]', @bytes_read='0', @bytes_unread='0', current_size='24', last_stat_size='24', file_open?='false', @initial=true, @sincedb_key='2350254784-3395-1179648 0 0'>"}
[2019-06-19T11:27:51,844][TRACE][filewatch.sincedbcollection] associate: finding {"inode"=>"2350254784-3395-1179648", "path"=>"C:/Ashok/logstash-7.1.1/files/inputFile1.txt"}
[2019-06-19T11:27:51,847][TRACE][filewatch.sincedbcollection] associate: unmatched
[2019-06-19T11:27:51,909][TRACE][filewatch.tailmode.processor] Delayed Delete processing
[2019-06-19T11:27:51,917][TRACE][filewatch.tailmode.processor] Watched + Active restat processing
[2019-06-19T11:27:51,943][TRACE][filewatch.tailmode.processor] Rotation In Progress processing
[2019-06-19T11:27:51,950][TRACE][filewatch.tailmode.processor] Watched processing
[2019-06-19T11:27:51,964][TRACE][filewatch.tailmode.handlers.createinitial] handling: inputFile1.txt
[2019-06-19T11:27:51,980][TRACE][filewatch.tailmode.handlers.createinitial] opening inputFile1.txt
[2019-06-19T11:27:51,998][TRACE][filewatch.tailmode.handlers.createinitial] handle_specifically opened file handle: 100470, path: inputFile1.txt
[2019-06-19T11:27:52,021][TRACE][filewatch.tailmode.handlers.createinitial] add_new_value_sincedb_collection {"position"=>0, "watched_file details"=>"<FileWatch::WatchedFile: @filename='inputFile1.txt', @state='active', @recent_states='[:watched, :watched]', @bytes_read='0', @bytes_unread='24', current_size='24', last_stat_size='24', file_open?='true', @initial=true, @sincedb_key='2350254784-3395-1179648 0 0'>"}
[2019-06-19T11:27:52,038][TRACE][filewatch.tailmode.processor] Active - file grew: inputFile1.txt: new size is 24, bytes read 0
[2019-06-19T11:27:52,041][TRACE][filewatch.tailmode.handlers.grow] handling: inputFile1.txt
[2019-06-19T11:27:52,077][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-06-19T11:27:52,095][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>24, "filename"=>"inputFile1.txt"}
[2019-06-19T11:27:52,097][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-06-19T11:27:52,115][TRACE][filewatch.tailmode.handlers.grow] buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the `file_chunk_size` setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>24, "last_known_file_size"=>24, "file_path"=>"C:/Ashok/logstash-7.1.1/files/inputFile1.txt"}
[2019-06-19T11:27:52,128][DEBUG][filewatch.sincedbcollection] writing sincedb (delta since last write = 1560968872)
[2019-06-19T11:27:52,132][TRACE][filewatch.sincedbcollection] sincedb_write: to: NUL
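If I'm reading the trace correctly, the telling line is the buffer_extract one near the end: all 24 bytes of the file were read, but no "\n" delimiter was found in the chunk, so the tail reader keeps those bytes buffered and never emits an event. My guess is that inputFile1.txt simply has no trailing newline. If so, appending one should flush the pending line, e.g. from a cmd prompt:

echo.>> C:\Ashok\logstash-7.1.1\files\inputFile1.txt

(echo. appends an empty line, i.e. a CRLF, so the buffered 24 bytes should then come through as an event; the trailing \r may show up in the message field.) If the file intentionally uses a different line ending, the file input's delimiter option can be set to match instead.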