Log file is not being created with Logstash file output config

Hi All,

I am trying to create a log file (.txt) by configuring the Logstash output section with codec => json_lines. Logstash runs successfully, but the file is not being created. For now I am not applying any filters; my only scenario is that once Logstash starts, it should create a log file.

input {
  file {
    path => "C:\LogStash_Log_Input\LogStashInputLog.log"
    start_position => "beginning"
    codec => "plain"
  }
}

filter {
}

output {
  file {
    path => "C:\LogStash\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
    codec => json_lines
  }
}

I have verified the plugins; the following are already installed:

logstash-output-file and logstash-codec-json_lines
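
For reference, something like this should list the installed plugins from the bin directory, if anyone wants to double-check the same thing:

~~~
C:\LogStash\bin> logstash-plugin list | findstr "file json_lines"
~~~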

Could you please suggest what I have missed?

Thanks & Regards,

DG Murthy

It's quite possible that the File Input Plugin has already processed the files and remembers doing so via its sincedb. Can you find and delete the sincedb, and then try again? By default, it's in <path.data>/plugins/inputs/file.
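
If deleting the sincedb gets tedious, you can also point the input at a throwaway sincedb while testing. A minimal sketch, assuming Logstash on Windows, where NUL is the null device:

~~~
file {
  path => "C:/LogStash_Log_Input/LogStashInputLog.log"
  start_position => "beginning"
  sincedb_path => "NUL"  # Windows null device; read positions are not persisted across runs
}
~~~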

Otherwise, what are the logs from Logstash telling you? Can you start Logstash with --log.level=debug?

Thank you so much for your reply. I have verified the sincedb and it contains 0 0 0 0.

I have configured the logstash.yml and pipelines.yml files with:

path.config: "/Logstash/bin/LogFileCreate.conf"

My LogFileCreate.conf is below:

input {
  # Monitor a file. The path can be network or local.
  file {
    path => "C:\LogStash_Log_Input/LogStashInputLog.log"
    start_position => "beginning"
    codec => json_lines
  }
}

filter {
  grok {
    match => { "severity" => "warning" }
    add_tag => ["warning"]
  }
}

output {
  if "warning" in [tags] {
    file {
      path => "C:\LogStash\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
      codec => json_lines
    }
  }
  else {
    stdout {
      codec => rubydebug
    }
  }
}

When I run Logstash from the Windows command prompt with C:\LogStash\Bin\logstash -f LogFileCreate.conf, the log shows the following:

[2018-07-18T12:54:05,968][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/LogStash/modules/fb_apache/configuration"}
[2018-07-18T12:54:05,983][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/LogStash/modules/netflow/configuration"}
[2018-07-18T12:54:06,244][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-18T12:54:06,880][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-07-18T12:54:08,127][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-18T12:54:13,403][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"mylogs", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-07-18T12:54:14,629][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x11152fed run>"}
[2018-07-18T12:54:14,702][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["mylogs"]}
But the date-stamped log file is not being created in the output; the folder is empty.

Please kindly look into this and help me.

Thanks & Regards,
DG Murthy

If you could paste pre-formatted text like configs and log output in-between code fences (~~~), it would really help make your posts on this forum more readable. There is also a preview pane that helps you see what your post will look like before you send it.


Can you repeat with debug logging enabled? The command-line flag --log.level debug is useful here.
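
That would look something like this from the bin directory:

~~~
C:\LogStash\Bin> logstash -f LogFileCreate.conf --log.level debug
~~~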

I understand your concern; sorry for that, I am new to this forum and to Logstash. After enabling debug, please see the log info below.

[2018-07-19T10:30:49,034][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/LogStash/modules/fb_apache/configuration"}
[2018-07-19T10:30:49,043][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x13eb97d2 @module_name="fb_apache", @directory="C:/LogStash/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-07-19T10:30:49,053][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/LogStash/modules/netflow/configuration"}
[2018-07-19T10:30:49,054][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x1c45f4c @module_name="netflow", @directory="C:/LogStash/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-07-19T10:30:49,263][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------
[2018-07-19T10:30:49,264][DEBUG][logstash.runner ] *node.name: "test" (default: "RDR-VM572")
[2018-07-19T10:30:49,264][DEBUG][logstash.runner ] *path.config: "logfilecreate.conf"
[2018-07-19T10:30:49,264][DEBUG][logstash.runner ] path.data: "C:/LogStash/data"
[2018-07-19T10:30:49,265][DEBUG][logstash.runner ] modules.cli: []
[2018-07-19T10:30:49,265][DEBUG][logstash.runner ] modules: []
[2018-07-19T10:30:49,265][DEBUG][logstash.runner ] modules_setup: false
[2018-07-19T10:30:49,265][DEBUG][logstash.runner ] config.test_and_exit: false
[2018-07-19T10:30:49,266][DEBUG][logstash.runner ] config.reload.automatic: false
[2018-07-19T10:30:49,266][DEBUG][logstash.runner ] config.reload.interval: 3000000000
[2018-07-19T10:30:49,267][DEBUG][logstash.runner ] config.support_escapes: false
[2018-07-19T10:30:49,267][DEBUG][logstash.runner ] metric.collect: true
[2018-07-19T10:30:49,267][DEBUG][logstash.runner ] *pipeline.id: "mylogs" (default: "main")
[2018-07-19T10:30:49,268][DEBUG][logstash.runner ] pipeline.system: false
[2018-07-19T10:30:49,268][DEBUG][logstash.runner ] *pipeline.workers: 1 (default: 2)
[2018-07-19T10:30:49,268][DEBUG][logstash.runner ] pipeline.output.workers: 1
[2018-07-19T10:30:49,269][DEBUG][logstash.runner ] pipeline.batch.size: 125
[2018-07-19T10:30:49,269][DEBUG][logstash.runner ] pipeline.batch.delay: 50
[2018-07-19T10:30:49,269][DEBUG][logstash.runner ] pipeline.unsafe_shutdown: false
[2018-07-19T10:30:49,269][DEBUG][logstash.runner ] pipeline.java_execution: false
[2018-07-19T10:30:49,270][DEBUG][logstash.runner ] pipeline.reloadable: true
[2018-07-19T10:30:49,270][DEBUG][logstash.runner ] path.plugins: []
[2018-07-19T10:30:49,270][DEBUG][logstash.runner ] *config.debug: true (default: false)
[2018-07-19T10:30:49,270][DEBUG][logstash.runner ] *log.level: "debug" (default: "info")
[2018-07-19T10:30:49,271][DEBUG][logstash.runner ] version: false
[2018-07-19T10:30:49,271][DEBUG][logstash.runner ] help: false
[2018-07-19T10:30:49,271][DEBUG][logstash.runner ] log.format: "plain"
[2018-07-19T10:30:49,271][DEBUG][logstash.runner ] http.host: "127.0.0.1"
[2018-07-19T10:30:49,272][DEBUG][logstash.runner ] http.port: 9600..9700
[2018-07-19T10:30:49,272][DEBUG][logstash.runner ] http.environment: "production"
[2018-07-19T10:30:49,272][DEBUG][logstash.runner ] queue.type: "memory"
[2018-07-19T10:30:49,273][DEBUG][logstash.runner ] queue.drain: false
[2018-07-19T10:30:49,273][DEBUG][logstash.runner ] queue.page_capacity: 67108864
[2018-07-19T10:30:49,273][DEBUG][logstash.runner ] queue.max_bytes: 1073741824
[2018-07-19T10:30:49,273][DEBUG][logstash.runner ] queue.max_events: 0
[2018-07-19T10:30:49,274][DEBUG][logstash.runner ] queue.checkpoint.acks: 1024
[2018-07-19T10:30:49,274][DEBUG][logstash.runner ] queue.checkpoint.writes: 1024
[2018-07-19T10:30:49,274][DEBUG][logstash.runner ] queue.checkpoint.interval: 1000
[2018-07-19T10:30:49,274][DEBUG][logstash.runner ] dead_letter_queue.enable: false
[2018-07-19T10:30:49,275][DEBUG][logstash.runner ] dead_letter_queue.max_bytes: 1073741824
[2018-07-19T10:30:49,275][DEBUG][logstash.runner ] slowlog.threshold.warn: -1
[2018-07-19T10:30:49,275][DEBUG][logstash.runner ] slowlog.threshold.info: -1
[2018-07-19T10:30:49,276][DEBUG][logstash.runner ] slowlog.threshold.debug: -1
[2018-07-19T10:30:49,276][DEBUG][logstash.runner ] slowlog.threshold.trace: -1
[2018-07-19T10:30:49,276][DEBUG][logstash.runner ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2018-07-19T10:30:49,276][DEBUG][logstash.runner ] keystore.file: "C:/LogStash/config/logstash.keystore"
[2018-07-19T10:30:49,277][DEBUG][logstash.runner ] path.queue: "C:/LogStash/data/queue"
[2018-07-19T10:30:49,277][DEBUG][logstash.runner ] path.dead_letter_queue: "C:/LogStash/data/dead_letter_queue"
[2018-07-19T10:30:49,277][DEBUG][logstash.runner ] path.settings: "C:/LogStash/config"
[2018-07-19T10:30:49,278][DEBUG][logstash.runner ] path.logs: "C:/LogStash/logs"
[2018-07-19T10:30:49,278][DEBUG][logstash.runner ] --------------- Logstash Settings -------------------
[2018-07-19T10:30:49,326][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-19T10:30:49,404][DEBUG][logstash.agent ] Setting up metric collection
[2018-07-19T10:30:49,496][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-19T10:30:49,667][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-19T10:30:49,840][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:30:49,860][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:30:49,876][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-19T10:30:49,888][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-19T10:30:49,937][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}

[2018-07-19T10:30:50,366][DEBUG][logstash.config.pipelineconfig] Merged config
[2018-07-19T10:30:50,370][DEBUG][logstash.config.pipelineconfig]

input {
# Monitor a file. The path can be network or local
file {
path => "C:\LogStash_Log_Input/LogStashInputLog.log"
start_position => "beginning"
codec => json_lines

}

}
filter {
grok {
match => { "severity" => "warning" }
add_tag => ["warning"]
}
}

output {

if "warning" in [tags] {
file {

  		path => "C:\LogStash\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
		
	codec => json_lines
  }
 }
 else {
            stdout {
                    codec => rubydebug
            }
    }

}

[2018-07-19T10:30:50,424][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-19T10:30:50,435][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2018-07-19T10:30:50,494][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:mylogs}
[2018-07-19T10:30:53,334][DEBUG][logstash.pipeline ] Compiled pipeline code {:pipeline_id=>"mylogs", :code=>" @inputs = []\n @filters = []\n @outputs = []\n @periodic_flushers = []\n @shutdown_flushers = []\n @generated_objects = {}\n\n @generated_objects[:input_file_1] = plugin("input", "file", 4, 5, LogStash::Util.hash_merge_many({ "path" => ("C:\\LogStash_Log_Input/LogStashInputLog.log") }, { "start_position" => ("beginning") }, { "codec" => ("json_lines") }))\n\n @inputs << @generated_objects[:input_file_1]\n\n @generated_objects[:filter_grok_2] = plugin("filter", "grok", 13, 9, LogStash::Util.hash_merge_many({ "match" => {("severity") => ("warning")} }, { "add_tag" => [("warning")] }))\n\n @filters << @generated_objects[:filter_grok_2]\n\n @generated_objects[:filter_grok_2_flush] = lambda do |options, &block|\n @logger.debug? && @logger.debug("Flushing", :plugin => @generated_objects[:filter_grok_2])\n\n events = @generated_objects[:filter_grok_2].flush(options)\n\n return if events.nil? || events.empty?\n\n @logger.debug? && @logger.debug("Flushing", :plugin => @generated_objects[:filter_grok_2], :events => events.map { |x| x.to_hash })\n\n \n\n events.each{|e| block.call(e)}\n end\n\n if @generated_objects[:filter_grok_2].respond_to?(:flush)\n @periodic_flushers << @generated_objects[:filter_grok_2_flush] if @generated_objects[:filter_grok_2].periodic_flush\n @shutdown_flushers << @generated_objects[:filter_grok_2_flush]\n end\n\n @generated_objects[:output_file_3] = plugin("output", "file", 23, 8, LogStash::Util.hash_merge_many({ "path" => ("C:\\LogStash\\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log") }, { "codec" => ("json_lines") }))\n\n @outputs << @generated_objects[:output_file_3]\n\n @generated_objects[:output_stdout_4] = plugin("output", "stdout", 31, 17, LogStash::Util.hash_merge_many({ "codec" => ("rubydebug") }))\n\n @outputs << @generated_objects[:output_stdout_4]\n\n define_singleton_method :filter_func do |event|\n events = event\n @logger.debug? && events.each { |e| @logger.debug("filter received", "event" => e.to_hash)}\n events = @generated_objects[:filter_grok_2].multi_filter(events)\n \n events\n end\n define_singleton_method :output_func do |event|\n targeted_outputs = []\n @logger.debug? && @logger.debug("output received", "event" => event.to_hash)\n if (((x = event.get("[tags]"); x.respond_to?(:include?) && x.include?(("warning"))))) # if "warning" in [tags]\n targeted_outputs << @generated_objects[:output_file_3]\n \n else\n targeted_outputs << @generated_objects[:output_stdout_4]\n \n \n end\n \n targeted_outputs\n end"}
[2018-07-19T10:30:53,446][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"input", :class=>LogStash::Inputs::File}

[2018-07-19T10:31:21,497][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:21,498][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:23,151][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:26,503][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:26,504][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:27,449][DEBUG][logstash.inputs.file ] _globbed_files: C:\LogStash_Log_Input/LogStashInputLog.log: glob is: []
[2018-07-19T10:31:28,153][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:31,512][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:31,513][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:33,153][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:36,521][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:36,521][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:38,155][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:41,525][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:41,526][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:42,461][DEBUG][logstash.inputs.file ] _globbed_files: C:\LogStash_Log_Input/LogStashInputLog.log: glob is: []
[2018-07-19T10:31:43,155][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:46,529][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:46,530][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:48,156][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:51,534][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:51,535][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:53,156][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:31:56,549][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:31:56,550][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:31:57,475][DEBUG][logstash.inputs.file ] _globbed_files: C:\LogStash_Log_Input/LogStashInputLog.log: glob is: []
[2018-07-19T10:31:58,158][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}
[2018-07-19T10:32:01,553][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-19T10:32:01,554][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-19T10:32:03,160][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"mylogs", :thread=>"#<Thread:0x652baf36 sleep>"}

I faced the same issue, but it worked when I tried if "_grokparsefailure" in [tags] instead of "warning" and removed the filter block. Also, please ensure that the output directory has read and write permissions.
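
A sketch of what I mean, reusing the output paths from above:

~~~
output {
  if "_grokparsefailure" in [tags] {
    file {
      path => "C:\LogStash\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
      codec => json_lines
    }
  }
  else {
    stdout { codec => rubydebug }
  }
}
~~~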

Also, I am not able to post freely here; it is saying "sorry, you can put 10 users only", even though I am only replying to you.

Thanks a lot for your quick reply. But I am getting the following error after removing the filter; now I have no filters implemented and I am running with administrator privileges:
[2018-07-19T11:31:06,775][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@id = "3d2bb7323968593b5bb2c8863e3db059ede2b8e8493943eee41204488938bd0f"
[2018-07-19T11:31:06,775][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@enable_metric = true
[2018-07-19T11:31:06,776][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@workers = 1
[2018-07-19T11:31:06,777][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@flush_interval = 2
[2018-07-19T11:31:06,777][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@gzip = false
[2018-07-19T11:31:06,778][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"

Please find my config file.

input {
  stdin {}

  file {
    path => "/home/files/value.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

output {
  file {
    path => "/home/files/value1.log"
  }
}

It created a new file in the /home/files folder.

So your conf file would follow the same pattern:

input {
  stdin {}

  file {
    path => "C:\LogStash_Log_Input/LogStashInputLog.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

output {
  file {
    path => "C:\LogStash\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
  }
}
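
One caveat when copying this to Windows: /dev/null does not exist there, so the sincedb line may need the Windows null device instead:

~~~
sincedb_path => "NUL"
~~~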

Thank you so much for your quick reply. I did exactly what you sent me, but sorry, I am still getting the following error:

[2018-07-19T14:01:53,064][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@path = "C:\LogStash\LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
[2018-07-19T14:01:53,065][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@id = "4299277eca132b794d51a1bf4782e5cb1c142eeffb53cf29c8b22e1147e0ee4b"
[2018-07-19T14:01:53,065][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@enable_metric = true
[2018-07-19T14:01:53,066][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@codec = <LogStash::Codecs::JSONLines id=>"json_lines_800a41ca-fa01-440d-829b-469d5285215f", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2018-07-19T14:01:53,066][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@workers = 1
[2018-07-19T14:01:53,066][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@flush_interval = 2
[2018-07-19T14:01:53,066][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@gzip = false
[2018-07-19T14:01:53,066][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"

Please ensure that your file path is correct:

path => "C:\LogStash\LogstashPortalLogs\Roche_%{+YYYY-MM-dd}.log"

Please check the slashes; I think you have mixed backward and forward slashes in the path.
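
Alternatively, forward slashes throughout usually work on Windows as well, so a fully consistent path might also be worth a try:

~~~
path => "C:/LogStash/LogstashPortalLogs/Roche_%{+YYYY-MM-dd}.log"
~~~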

Thanks for your reply. I have now tried as per your suggestion, but sorry, it is the same issue again. I am trying on Windows 10; what is your OS?

I am doing this on Linux (CentOS). Has the file been created? Does the folder LogStash\LogstashPortalLogs exist?

Thanks for your information. Yes, the folder exists, and I tried with backward slashes, forward slashes, and a plain C: drive path, but the same issue persists in all scenarios.

OK. I only have my ELK on Linux; I don't have a Windows setup. Is the error the same, i.e. [2018-07-19T14:01:53,066][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"?

Can you send your full configuration file?

See the log info below:

[2018-07-20T07:52:18,778][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@path = "C:/LogStash/xyz_%{+YYYY-MM-dd}.log"
[2018-07-20T07:52:18,778][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@id = "82428c93704b399a3b742d774dafe46833c6062e02c200d7028dc7418a6f2db6"
[2018-07-20T07:52:18,779][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@enable_metric = true
[2018-07-20T07:52:18,779][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@codec = <LogStash::Codecs::JSONLines id=>"json_lines_d531a3cd-e272-4001-b82d-34b9a1f28a6c", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2018-07-20T07:52:18,788][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@workers = 1
[2018-07-20T07:52:18,789][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@flush_interval = 2
[2018-07-20T07:52:18,789][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@gzip = false
[2018-07-20T07:52:18,789][DEBUG][logstash.outputs.file ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"

Please see my complete config file below:

input {
  stdin {}

  file {
    path => "C:\LogStash_Log_Input/LogStashInputLog.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

output {
  file {
    path => "C:/LogStash/xyz_%{+YYYY-MM-dd}.log"
  }
}
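
One more observation from the debug log posted earlier: the repeated _globbed_files: C:\LogStash_Log_Input/LogStashInputLog.log: glob is: [] lines suggest that the input path never matched an existing file, so no events ever flow through the pipeline, and the file output only creates its file once the first event arrives. To test the output side in isolation, a stripped-down config driven purely by stdin might help (a sketch, reusing the output path above):

~~~
input {
  stdin {}
}

output {
  file {
    path => "C:/LogStash/xyz_%{+YYYY-MM-dd}.log"
  }
}
~~~

If typing a line into the console and pressing Enter does not produce the file within a few seconds (the output flushes every 2 seconds, per @flush_interval = 2 in the debug log), the problem is on the output side; if it does, the input path is the thing to fix.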