Logstash not showing any output (again)

I'm trying to use Logstash to parse REST API logs into a file, but the only output is:
logstash -f ../conf/first-pipeline.conf
c:\ELK\logstash\bin>logstash -f ../conf/first-pipeline.conf
io/console not supported; tty will not be manipulated
Default settings used: Filter workers: 4
Logstash startup completed
logstash -f ../conf/first-pipeline.conf --debug
←[36mconfig LogStash::Codecs::Multiline/@max_bytes = 10485760 {:level=>:debug, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/config/mixin.rb", :line=>"128", :method=>"config_init"}←[0m ..... Logstash startup completed
and there is no output file.
My conf is:
input {
  file {
    path => "/ELK/dbo/api/ups1.log"
    start_position => beginning
    type => "appl"
  }
}
filter {
  if [type] == "appl" {
    grok {
      match => { "message" => "%{TIME:time} %{WORD:tracelevel} %{WORD:traceclass},http--%{NOTSPACE:port}-%{NOTSPACE:flow} - %{NUMBER:requestID} \* %{GREEDYDATA:logtype}" }
      named_captures_only => true
    }
  }
}
output {
  file { path => "/elk/dbo/result/text.txt" }
}
Input file example:
23:24:23,118 INFO CustomLoggingFilter,http--0.0.0.0-8080-33:233 - 4011735 * Server out-bound response
23:24:23,056 INFO CustomLoggingFilter,http--0.0.0.0-8080-53:233 - 4011734 * Server out-bound response
23:24:22,978 INFO CustomLoggingFilter,http--0.0.0.0-8080-57:233 - 4011732 * Server out-bound response
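For reference, the grok pattern above can be approximated with a plain named-group regex. This is only an illustrative sketch (the field boundaries, e.g. port vs. flow, are my reading of the pattern, not Logstash's actual grok compilation), showing what the first sample line should break into:

```python
import re

# Rough plain-regex equivalent of the grok pattern in the config above:
# %{TIME:time} %{WORD:tracelevel} %{WORD:traceclass},http--%{NOTSPACE:port}-%{NOTSPACE:flow}
#   - %{NUMBER:requestID} \* %{GREEDYDATA:logtype}
LINE_RE = re.compile(
    r"(?P<time>\d{1,2}:\d{2}:\d{2}(?:[:.,]\d+)?) "  # %{TIME:time}
    r"(?P<tracelevel>\w+) "                         # %{WORD:tracelevel}
    r"(?P<traceclass>\w+),"                         # %{WORD:traceclass}
    r"http--(?P<port>\S+)-(?P<flow>\S+) - "         # %{NOTSPACE:port}-%{NOTSPACE:flow}
    r"(?P<requestID>\d+) \* "                       # %{NUMBER:requestID}
    r"(?P<logtype>.*)"                              # %{GREEDYDATA:logtype}
)

sample = "23:24:23,118 INFO CustomLoggingFilter,http--0.0.0.0-8080-33:233 - 4011735 * Server out-bound response"
m = LINE_RE.match(sample)
print(m.groupdict())
```

Note that `\S+` backtracks, so port ends up as "0.0.0.0-8080" and flow as "33:233" — the same split the grok `NOTSPACE` patterns would produce.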
What's wrong?

Hello Lallartu,

It is likely that although you have defined start_position => beginning in your configuration, you have not defined a sincedb, so Logstash has probably already read these lines once and therefore will not send them again. To force it to start from the beginning again, remove the sincedb file. If you want to run this configuration repeatedly in testing, always starting from the beginning of the file, you can set sincedb_path to /dev/null.
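For repeated test runs, a file input along those lines would look like this (a sketch based on your input block; on Windows, NUL plays the role of /dev/null):

```conf
input {
  file {
    path => "/ELK/dbo/api/ups1.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # never persist the read position, so every run starts from the top
    type => "appl"
  }
}
```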

Cheers,
-Robin-

Thank you for your response, but it didn't help.
After deleting the .sincedb files I have the same problem.
I set the conf to:
input {
  file {
    path => "/ELK/dbo/api/ups1.log"
    start_position => "end"
    type => "appl"
  }
}
filter {
  if [type] == "appl" {
    grok {
      match => { "message" => "%{TIME:time} %{WORD:tracelevel} %{WORD:traceclass},http--%{NOTSPACE:port}-%{NOTSPACE:flow} - %{NUMBER:requestID} \* %{GREEDYDATA:logtype}" }
      named_captures_only => true
    }
  }
}
output {
  file { path => "/elk/dbo/result/text.txt" }
}
and got the same debug log:
...
←[32mWorker threads expected: 4, worker threads started: 4 {:level=>:info, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb", :line=>"161", :method=>"start_filters"}←[0m
←[32mPipeline started {:level=>:info, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb", :line=>"89", :method=>"run"}←[0m
Logstash startup completed
←[36m_discover_file_glob: /ELK/dbo/api/ups1.log: glob is: ["/ELK/dbo/api/ups1.log"] {:level=>:debug, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/filewatch-0.6.5/lib/filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}←[0m
←[36m_discover_file_glob: /ELK/dbo/api/ups1.log: glob is: ["/ELK/dbo/api/ups1.log"] {:level=>:debug, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/filewatch-0.6.5/lib/filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}←[0m

Hello Lallartu,

Umm... now it seems you have changed it to end: start_position => "end"
Could you try making that "beginning" again?

Cheers,
-Robin-

Yeah, sure.
Nothing changed:
input {
  file {
    path => "/ELK/dbo/api/ups1.log"
    start_position => "beginning"
    type => "appl"
  }
}
filter {
  if [type] == "appl" {
    grok {
      match => { "message" => "%{TIME:time} %{WORD:tracelevel} %{WORD:traceclass},http--%{NOTSPACE:port}-%{NOTSPACE:flow} - %{NUMBER:requestID} \* %{GREEDYDATA:logtype}" }
      named_captures_only => true
    }
  }
}
output {
  file { path => "/elk/dbo/result/text.txt" }
}
logstash -f ../conf/first-pipeline.conf
c:\ELK\logstash\bin>logstash -f ../conf/first-pipeline.conf
io/console not supported; tty will not be manipulated
Default settings used: Filter workers: 4
Logstash startup completed
logstash -f ../conf/first-pipeline.conf --debug
...
←[36mGrok compiled OK {:pattern=>"%{TIME:time} %{WORD:tracelevel} %{WORD:traceclass},http--%{NOTSPACE:port}-%{NOTSPACE:flow} - %{NUMBER:requestID} \* %{GREEDYDATA:logtype}", :expanded_pattern=>"(?<TIME:time>(?!<[0-9])(?:(?:2[0123]|[01]?[0-9])):(?:(?:[0-5][0-9]))(?::(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))(?![0-9])) (?<WORD:tracelevel>\b\w+\b) (?<WORD:traceclass>\b\w+\b),http--(?<NOTSPACE:port>\S+)-(?<NOTSPACE:flow>\S+) - (?<NUMBER:requestID>(?:(?:(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\.[0-9]+)?)|(?:\.[0-9]+)))))) \* (?<GREEDYDATA:logtype>.*)", :level=>:debug, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"128", :method=>"compile"}←[0m
←[32mWorker threads expected: 4, worker threads started: 4 {:level=>:info, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb", :line=>"161", :method=>"start_filters"}←[0m
←[32mPipeline started {:level=>:info, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb", :line=>"89", :method=>"run"}←[0m
Logstash startup completed
←[36m_discover_file_glob: /ELK/dbo/api/ups1.log: glob is: ["/ELK/dbo/api/ups1.log"] {:level=>:debug, :file=>"/ELK/logstash/vendor/bundle/jruby/1.9/gems/filewatch-0.6.5/lib/filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}←[0m
(the _discover_file_glob line above repeats over and over)

Looks like Logstash is trying to do something, but can't.

Hello Lallartu,

That should work.

Could you confirm that the file /ELK/dbo/api/ups1.log exists, and that you have removed the correct sincedb file before starting?

Cheers,
-Robin-

Hello,
you're right :slight_smile: After changing the ups1.log path to c:/ELK/dbo/api/ I've got result.txt

Thank you so much!

Hi... please give me a solution, as I have the same problem.
/opt/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf
Settings: Default pipeline workers: 1
Logstash startup completed

My config file is:

input {
  file {
    path => "/etc/table.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }
  mutate { convert => ["High", "float"] }
  mutate { convert => ["Open", "float"] }
  mutate { convert => ["Low", "float"] }
  mutate { convert => ["Close", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "sto"
  }
  stdout {}
}

sarbjeet,

Do you really have the same problem?

In the original post, Lallartu had the path to the file wrong. /ELK/dbo/api/ups1.log instead of c:/ELK/dbo/api/ups1.log

If you have the same symptoms, I think you should follow what was suggested before.

If doing that does not work for you, you should create a new post with as much detail as this one originally had - in particular the console output when you run Logstash with --debug. You should also mention what version of Logstash you are using.

The file path is correct. I also deleted .sincedb_*, but the problem is the same. I use Logstash version 2.2.
With --debug the output is repeated again and again; it does not stop:
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x2dd2adaa @operations_mutex=#Mutex:0x23154660, @max_size=500, @operations_lock=#Java::JavaUtilConcurrentLocks::ReentrantLock:0x272ac6da, @submit_proc=#Proc:0x5aaf0249@/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:57, @logger=#<Cabin::Channel:0x5300693a @metrics=#<Cabin::Metrics:0x226d7996 @metrics_lock=#Mutex:0x207b314, @metrics={}, @channel=#<Cabin::Channel:0x5300693a ...>>, @subscriber_lock=#Mutex:0x20985457,

flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x2dd2adaa @operations_mutex=#Mutex:0x23154660, @max_size=500, @operations_lock=#Java::JavaUtilConcurrentLocks::ReentrantLock:0x272ac6da, @submit_proc=#Proc:0x5aaf0249@/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:57, @logger=#<Cabin::Channel:0x5300693a @metrics=#<Cabin::Metrics:0x226d7996 @metrics_lock=#Mutex:0x207b314, @metrics={}, @channel=#<Cabin::Channel:0x5300693a ...>>, @subscriber_lock=#Mutex:0x20985457, @level=:debug, @subscribers={12320=>#<Cabin::Outputs::IO:0x3969d698 @io=#<IO:fd 1>, @lock=#<Mutex:0x6494d,

Actually I want to load a pcap file into ELK, but first I'm trying a .csv file. It ran completely the first time, but the next time it does not display any output.

/opt/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf
Settings: Default pipeline workers: 1
Logstash startup completed
^CSIGINT received. Shutting down the pipeline. {:level=>:warn}
Logstash shutdown completed
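(Editor's note: the "runs completely the first time, then nothing on later runs" symptom matches the sincedb behaviour described earlier in this thread — Logstash remembers it already read table.csv to the end. A hedged sketch of the file input for repeatable testing, reusing the path from the config above:)

```conf
input {
  file {
    path => "/etc/table.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # discard saved read positions so the file is re-read on every run
  }
}
```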