Logstash index creation and parsing a custom app log

I have installed Logstash 2.3.1, Elasticsearch 2.3.1, and Kibana 4.5.0 as services on Windows Server 2008.
I can launch Kibana, but I am unable to create an index in Kibana; refer to the log below.
I have verified the config below with the command "logstash -f logstash.conf --configtest" and get "Configuration OK". Please suggest why the index is not getting created and shown in Kibana.

input {
  file {
    path => "E:/logstash/New/Logsetup/log/system.day20160518.log"
    start_position => "beginning"
    type => "logs"
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:StartTime} %{WORD:code1}-%{WORD:code2} %{WORD:code3} %{WORD:code4} : %{GREEDYDATA:log_message}" }
    add_field => [ "StartTime", "%{@timestamp}" ]
  }
}

output {
  elasticsearch {
    type => "logs"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

This line should not be in your elasticsearch output block configuration, and should actually result in an error (at the very least, a deprecation message):

type => "logs"

And these lines are unnecessary, as they are the default values:

hosts => ["localhost:9200"]
index => "logstash-%{+YYYY.MM.dd}"
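
In other words, once those lines are removed, the output block can be as simple as this, relying entirely on the defaults:

output {
  elasticsearch { }
}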

Also, if you have run this configuration more than once, it will not re-read the file, even with start_position => "beginning". This is because a sincedb file is created that wants to tail and resume where it left off. For re-reading old log files that will not receive any more data, you can get them to be reread by adding sincedb_path => "NUL" (NUL is like /dev/null for Windows machines).
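
For example, a sketch of your file input with that change applied:

input {
  file {
    path => "E:/logstash/New/Logsetup/log/system.day20160518.log"
    start_position => "beginning"
    sincedb_path => "NUL"
    type => "logs"
  }
}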

Thanks for your reply. I have altered the config file as below, but no go.

input {
  file {
    path => "E:/logstash/New/Logsetup/log/system.day20160518.log"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "logs"
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:StartTime} %{WORD:code1}-%{WORD:code2} %{WORD:code3} %{WORD:code4} : %{GREEDYDATA:log_message}" }
    add_field => [ "StartTime", "%{@timestamp}" ]
  }
}

output {
  elasticsearch {
  }
}

As mentioned, /dev/null is for UNIX systems. You have a Windows path, so you should be using sincedb_path => "NUL".

You should also test the output by changing the output block to be:

output {
  stdout { codec => rubydebug } 
#  elasticsearch { }
}

With the elasticsearch output disabled by being commented out, you will be able to see output at the command line. If you do not get any output, that will also indicate why the index is not being created.
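
For reference, each event should be printed roughly in this form (the values below are illustrative only, and your grok filter may add further fields or a _grokparsefailure tag):

{
       "message" => "<one line from your log file>",
      "@version" => "1",
    "@timestamp" => "2016-05-19T00:00:00.000Z",
          "path" => "E:/logstash/New/Logsetup/log/system.day20160518.log",
          "host" => "yourhost",
          "type" => "logs"
}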

When I try debugging at the command line, I get the message below.

E:\logstash\New\logstash-2.3.1\bin>logstash -f E:\logstash\New\logstash-2.3.1\bin logstash.json --debug

T_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}
Adding pattern {"NAGIOS_HOST_EVENT_HANDLER"=>"%{NAGIOS_TYPE_HOST_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}
Adding pattern {"NAGIOS_TIMEPERIOD_TRANSITION"=>"%{NAGIOS_TYPE_TIMEPERIOD_TRANSITION:nagios_type}: %{DATA:nagios_service};%{DATA:nagios_unknown1};%{DATA:nagios_unknown2}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}
Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}
Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}
Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}
Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}", :level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb", :line=>"62", :method=>"add_pattern"}

This seems incomplete. The Adding pattern lines aren't helpful and do not indicate any errors. Can you filter those out?

You have %{DATESTAMP:StartTime} in your grok match, and in the same grok block:

add_field => [ "StartTime", "%{@timestamp}" ]

Is StartTime getting its data from grok, or from @timestamp?
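
If the intent is for StartTime to come from the log line itself, the usual approach is to drop the add_field and instead parse the captured value with a date filter. A sketch, where the date pattern is an assumption that must match your actual log format:

filter {
  grok {
    match => { "message" => "%{DATESTAMP:StartTime} %{WORD:code1}-%{WORD:code2} %{WORD:code3} %{WORD:code4} : %{GREEDYDATA:log_message}" }
  }
  date {
    # Assumed format; adjust to the timestamps actually present in the log.
    match => [ "StartTime", "yyyy/MM/dd HH:mm:ss.SSS" ]
  }
}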

The app log file will look like the line below, so I used a grok time field based on the log.

2016/08/09 00:43:38.169 S-300000 text text : ssytem error.

I updated the conf without grok and I am getting the error below:

input {
  file {
    path => "E:/logstash/New/Logsetup/log/system.day20160518.log"
    sincedb_path => "Nul"
    start_position => "beginning"
    type => "logs"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch { }
}

config LogStash::Inputs::File/@delimiter = "\n" {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
config LogStash::Inputs::File/@ignore_older = 86400 {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
config LogStash::Inputs::File/@close_older = 3600 {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
Plugin not defined in namespace, checking for plugin file {:type=>"output", :name=>"stdout", :path=>"logstash/outputs/stdout", :level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/plugin.rb", :line=>"76", :method=>"lookup"}
starting agent {:level=>:info, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/agent.rb", :line=>"207", :method=>"execute"}

If the modification time of system.day20160518.log is older than 24 hours, you need to adjust the file input's ignore_older option.
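
For example, to accept files up to a week old (the value is in seconds; 86400, i.e. 24 hours, is the default visible in your debug output):

input {
  file {
    path => "E:/logstash/New/Logsetup/log/system.day20160518.log"
    sincedb_path => "NUL"
    start_position => "beginning"
    ignore_older => 604800
  }
}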

I tried again after changing the date stamp in the file name to the current date, and also the date stamps inside the log file.

Even so, I am getting the same error as posted below:

E:\logstash\New\logstash-2.3.1\bin>logstash -f logstash.json --debug
io/console not supported; tty will not be manipulated
Reading config file {:config_file=>"E:/logstash/New/logstash-2.3.1/bin/logstash.json", :level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/loader.rb", :line=>"69", :method=>"local_config"}
Plugin not defined in namespace, checking for plugin file {:type=>"input", :name=>"file", :path=>"logstash/inputs/file", :level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/plugin.rb", :line=>"76", :method=>"lookup"}
Plugin not defined in namespace, checking for plugin file {:type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/plugin.rb", :line=>"76", :method=>"lookup"}
config LogStash::Codecs::Plain/@charset = "UTF-8" {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
config LogStash::Inputs::File/@path = ["E:/logstash/New/Logsetup/log/system.day20160810.log"] {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
config LogStash::Inputs::File/@sincedb_path = "Nul" {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
config LogStash::Inputs::File/@start_position = "beginning" {:level=>:debug, :file=>"/logstash/New/logstash-2.3.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.1-java/lib/logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}

I wouldn't even bother with grok before ensuring that data flows. You can comment out the entire filter block with # on each line. If you don't see data flowing, chasing grok is not going to help.

Try the ignore_older suggestion from @magnusbaeck. I'm also not sure whether Nul and NUL are the same to Windows, so I suggest changing Nul to NUL.
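
Putting both suggestions together, a minimal test configuration would look something like this (no filter block, stdout output only, so you can first confirm that data flows at all):

input {
  file {
    path => "E:/logstash/New/Logsetup/log/system.day20160810.log"
    sincedb_path => "NUL"
    start_position => "beginning"
    ignore_older => 604800
    type => "logs"
  }
}

output {
  stdout { codec => rubydebug }
}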

My log file name looks like 00331_ln2_systemday20160814, so when creating the index I want to add two new fields from the file name: a store number ("00331") and a lane number ("ln2"). Refer to my grok filter below.
Please help me add the add_field entries.

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:code1}-%{WORD:code2} %{WORD:code3} %{WORD:code4} : %{GREEDYDATA:log_message}" }
    break_on_match => false
  }
}

You'll find the path to the file in the path field, so just add a grok filter that extracts the desired fields from that field.

Thanks, Magnus, for the quick reply. Do you mean the line below is correct? The path field has the full file path, so is there a parameter for just the file name?

Please share an example.

add_field => {"Store_num","%{GREEDYDATA}/%{GREEDYDATA:path}"}

Do you mean the line below is correct?

No. For starters it's not a grok filter.

The path field has the full file path, so is there a parameter for just the file name?

grok {
  match => {
    "path" => "/(?<filename>[^/]+)$"
  }
}

I gave this exact example and explained how it worked in another thread just a few days ago.
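
Applied to your file names, something along these lines should extract the store and lane numbers in one go. This is just a sketch: it assumes the file name always starts with <store>_<lane>_, and Store_num and Lane_no are placeholder field names of your choosing:

grok {
  match => {
    # Capture the two underscore-separated tokens at the start of the file name.
    "path" => "/(?<Store_num>[^_/]+)_(?<Lane_no>[^_/]+)_[^/]*$"
  }
}

With path = "E:/logstash/New/Logsetup/log/00331_ln2_systemday20160814.log", for example, this yields Store_num = "00331" and Lane_no = "ln2".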