Logstash - fetching data from a file - not getting the output in the console

I am trying to use a file as the input source, but I am not getting the file content on stdout.

Command

./logstash -f ../conf/file.conf

Configuration File

#sample configuration 1
#file.conf

input {
  file {
    path => "C:\Ashok\logstash-7.1.1\files\example.txt"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

Logstash Log:

PS C:\Ashok\logstash-7.1.1\bin> ./logstash -f ../conf/file.conf
Sending Logstash logs to C:/Ashok/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-05-29T21:11:08,309][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-29T21:11:08,328][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-05-29T21:11:14,294][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0x52e14958 run>"}
[2019-05-29T21:11:14,998][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Ashok/logstash-7.1.1/data/plugins/inputs/file/.sincedb_85a3253599fdf570c96274a4a908b138", :path=>["C:\\Ashok\\logstash-7.1.1\\files\\example.txt"]}
[2019-05-29T21:11:15,034][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-05-29T21:11:15,121][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-29T21:11:15,127][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-29T21:11:15,491][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Hello there,

You need to specify start_position => "beginning".

Look at the file input documentation and also read about the sincedb file. This will help you in the future.

I added it to the conf file. Still not able to see the file content on stdout.

#sample configuration 1
#file.conf

input {
  file {
    path => "C:\Ashok\logstash-7.1.1\files\example.txt"
    start_position => "beginning"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

Add a line to the file while Logstash is running.

That's why you should read the docs about the sincedb file: it records the offset Logstash has reached in each file.

Your offset is at the end of the file because you first started Logstash without start_position, and the default is "end". Now you have started it with "beginning", BUT Logstash looks at the offset inside the sincedb file and recognizes that it is already at the end.

This is how Logstash avoids reading files/lines twice.

So either add a line at the end of the file, or close Logstash, delete the sincedb file, and restart Logstash.
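
For example, from PowerShell (a minimal sketch; the sincedb path is the one printed in your startup log above, and the hash in the filename will differ for other path settings):

# Stop Logstash first, then delete the sincedb file and restart
Remove-Item "C:/Ashok/logstash-7.1.1/data/plugins/inputs/file/.sincedb_85a3253599fdf570c96274a4a908b138"
./logstash -f ../conf/file.conf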

Thank you for the explanation about sincedb.

I removed the sincedb file and restarted Logstash; still not working (nothing gets displayed in the console).

Later, I tried to send it to Elasticsearch; that did not work either.

input {
  file {
    path => "C:\Ashok\logstash-7.1.1\files\example.txt"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    index => "Test"
    document_type => "File"
    hosts => "localhost:9200"
  }
}

Change your file path. Try using / instead of \.
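
For example, the same input as above with only the path separators changed:

input {
  file {
    path => "C:/Ashok/logstash-7.1.1/files/example.txt"
    start_position => "beginning"
  }
}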

Thank you Staale, by changing the path it worked.

Anything wrong with the conf below?

I am not seeing the output.

input {
  file {
    path => "C:/Demo/logstash-7.1.1/files/conf1.txt"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}

filter {
  grok {
    match => [ "message", "%{GREEDYDATA:data}" ]
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

That should be NUL, not NULL.
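
A sketch of the corrected input block (NUL is the Windows null device, so the offsets are discarded instead of persisted; on Linux/macOS you would use "/dev/null"):

input {
  file {
    path => "C:/Demo/logstash-7.1.1/files/conf1.txt"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}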

Tried it, but it didn't work.

PS C:\Ashok\logstash-7.1.1\bin> ./logstash -f ..\conf\conf1.conf
Sending Logstash logs to C:/Ashok/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-06-19T10:02:03,600][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-19T10:02:03,618][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-19T10:02:10,022][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0x7c210d6f run>"}
[2019-06-19T10:02:10,723][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-19T10:02:10,808][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-19T10:02:10,815][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-19T10:02:11,276][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Enable '--log.level trace' and see what filewatch has to say.
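
For example:

./logstash -f ..\conf\conf1.conf --log.level trace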

Here you go!

[2019-06-19T11:27:50,498][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-06-19T11:27:50,537][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-06-19T11:27:50,541][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-06-19T11:27:50,593][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_7be1faaa-b97b-4351-a155-569115b6f63b", enable_metric=>true, metadata=>false>
[2019-06-19T11:27:50,594][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "fa8cc5986c080021fbc9a8d561b69a464cf31e7a288c6ac8eb08606feee109cd"
[2019-06-19T11:27:50,594][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2019-06-19T11:27:50,595][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2019-06-19T11:27:50,631][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[2019-06-19T11:27:50,673][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0x790907fd run>"}
[2019-06-19T11:27:50,734][DEBUG][org.logstash.config.ir.CompiledPipeline] Compiled output
 P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:16:1:```
stdout{
                codec => rubydebug
        }
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@c96a6eaa
[2019-06-19T11:27:50,735][DEBUG][org.logstash.config.ir.CompiledPipeline] Compiled output
 P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:16:1:```
stdout{
                codec => rubydebug
        }
```]
 into

filewatch logs at trace level, not debug, so those are the messages you are looking for. If you are unable to interpret them yourself, they will be too large to post here, so use your favorite internet site that supports posting logs.

I tried to interpret the logs. It is actually processing the file, but I am wondering why it is not displaying anything.

Please check whether the log below helps in finding the issue.

 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@c96a6eaa
[2019-06-19T11:27:51,478][TRACE][logstash.inputs.file     ] Registering file input {:path=>["C:/Ashok/logstash-7.1.1/files/inputFile1.txt"]}
[2019-06-19T11:27:51,529][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-19T11:27:51,538][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x790907fd run>"}
[2019-06-19T11:27:51,551][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2019-06-19T11:27:51,576][TRACE][logstash.agent           ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2019-06-19T11:27:51,621][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-19T11:27:51,621][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-19T11:27:51,656][DEBUG][logstash.agent           ] Starting puma
[2019-06-19T11:27:51,673][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2019-06-19T11:27:51,674][TRACE][filewatch.sincedbcollection] open: reading from NUL
[2019-06-19T11:27:51,682][TRACE][filewatch.sincedbcollection] open: count of keys read: 0
[2019-06-19T11:27:51,738][TRACE][filewatch.discoverer     ] discover_files {"count"=>1}
[2019-06-19T11:27:51,743][DEBUG][logstash.api.service     ] [api-service] start
[2019-06-19T11:27:51,839][TRACE][filewatch.discoverer     ] discover_files handling: {"new discovery"=>true, "watched_file details"=>"<FileWatch::WatchedFile:
@filename='inputFile1.txt', @state='watched', @recent_states='[:watched]', @bytes_read='0', @bytes_unread='0', current_size='24', last_stat_size='24', file_open?='false', @initial=true, @sincedb_key='2350254784-3395-1179648 0 0'>"}
[2019-06-19T11:27:51,844][TRACE][filewatch.sincedbcollection] associate: finding {"inode"=>"2350254784-3395-1179648", "path"=>"C:/Ashok/logstash-7.1.1/files/inputFile1.txt"}
[2019-06-19T11:27:51,847][TRACE][filewatch.sincedbcollection] associate: unmatched
[2019-06-19T11:27:51,909][TRACE][filewatch.tailmode.processor] Delayed Delete processing
[2019-06-19T11:27:51,917][TRACE][filewatch.tailmode.processor] Watched + Active restat processing
[2019-06-19T11:27:51,943][TRACE][filewatch.tailmode.processor] Rotation In Progress processing
[2019-06-19T11:27:51,950][TRACE][filewatch.tailmode.processor] Watched processing
[2019-06-19T11:27:51,964][TRACE][filewatch.tailmode.handlers.createinitial] handling: inputFile1.txt
[2019-06-19T11:27:51,980][TRACE][filewatch.tailmode.handlers.createinitial] opening inputFile1.txt
[2019-06-19T11:27:51,998][TRACE][filewatch.tailmode.handlers.createinitial] handle_specifically opened file handle: 100470, path: inputFile1.txt
[2019-06-19T11:27:52,021][TRACE][filewatch.tailmode.handlers.createinitial] add_new_value_sincedb_collection {"position"=>0, "watched_file details"=>"<FileWatch::WatchedFile: @filename='inputFile1.txt', @state='active', @recent_states='[:watched, :watched]', @bytes_read='0', @bytes_unread='24', current_size='24', last_stat_size='24', file_open?='true', @initial=true, @sincedb_key='2350254784-3395-1179648 0 0'>"}
[2019-06-19T11:27:52,038][TRACE][filewatch.tailmode.processor] Active - file grew: inputFile1.txt: new size is 24, bytes read 0
[2019-06-19T11:27:52,041][TRACE][filewatch.tailmode.handlers.grow] handling: inputFile1.txt
[2019-06-19T11:27:52,077][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-06-19T11:27:52,095][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>24, "filename"=>"inputFile1.txt"}
[2019-06-19T11:27:52,097][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-06-19T11:27:52,115][TRACE][filewatch.tailmode.handlers.grow] buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the `file_chunk_size` setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>24, "last_known_file_size"=>24, "file_path"=>"C:/Ashok/logstash-7.1.1/files/inputFile1.txt"}
[2019-06-19T11:27:52,128][DEBUG][filewatch.sincedbcollection] writing sincedb (delta since last write = 1560968872)
[2019-06-19T11:27:52,132][TRACE][filewatch.sincedbcollection] sincedb_write: to: NUL

The buffer_extract line is the important one. Logstash finds the file, reads 24 bytes from it, then leaves them in the buffer, waiting for a line terminator to be added to the file. Your file has no line terminator at the end.
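
For example, a minimal sketch from PowerShell, assuming the file path from the trace log above (Add-Content appends its value plus a trailing newline by default, which terminates the buffered bytes):

# Append a newline so the 24 buffered bytes become a complete line
Add-Content -Path "C:/Ashok/logstash-7.1.1/files/inputFile1.txt" -Value ""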

@Badger you are right. I just added a carriage return (\r) to that file and Logstash started to push the data to ES and to the console output.

Thank you for your help.

Is it mandatory to have a line terminator or delimiter in a file? Is there a pre-defined list of delimiters?

Yes, it is mandatory. The documentation says the native line terminator is the default value.
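
In the file input this is the delimiter option. A sketch showing the default explicitly, reusing the config from earlier in this thread:

input {
  file {
    path => "C:/Demo/logstash-7.1.1/files/conf1.txt"
    start_position => "beginning"
    sincedb_path => "NUL"
    delimiter => "\n"    # the default delimiter
  }
}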
