Logstash grok

Hi everyone,
my test.log contains lines like the one below: "AsynExecutor.contractSyncScheduler>>>> 2016-03-17 04:00:03,247 INFO [cn.cmri.pds.neusoft.ContractSyncService] - <【PageInquiryAccountBalanceContectSrv】,success>"

my conf file is as below:
input {
file{

    path =>["/opt/pds/log/test.log"]
    type => "pds"

}
}
filter {
if [type] == "pds" {
grok {
match => { "message" => "%{DATA:scheduler}>>>> %{DATA:date} %{DATA:loglevel} %{DATA:service}-<%{DATA:Account},%{DATA:status}>" }
}
}
}
output{
stdout { codec => rubydebug }
elasticsearch{
hosts =>["localhost:9200"]
index => "test_log"
}
}
I cannot find anything in the debug output. BTW, grok.conf runs without errors. Can anybody help? Thanks!

Best Regards,
Levi

Don't use more than one DATA pattern in the same expression. Use more exact patterns.
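For example, the timestamp and log level in the middle of the sample line can be matched with exact patterns instead of DATA (a sketch, untested against the real log):

```
%{TIMESTAMP_ISO8601:date} %{LOGLEVEL:loglevel}
```

Both are standard patterns from the grok-patterns file; TIMESTAMP_ISO8601 accepts the comma-separated milliseconds in "2016-03-17 04:00:03,247".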

Is Logstash at least processing new events from the file, just not grokking them correctly?

Hi Magnus,
I changed the input to stdin and the result is like this:
Logstash startup completed
{
       "message" => "AsynExecutor.contractSyncScheduler>>>> 2016-03-17 04:00:03,247 INFO [cn.cmri.pds.neusoft.ContractSyncService] - <【PageInquiryAccountBalanceContectSrv】,success>",
      "@version" => "1",
    "@timestamp" => "2016-03-18T07:32:26.239Z",
          "host" => "ubuntu",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}
This is not what I want.

I changed the grok pattern like below:
%{exact:scheduler}>>>> %{DATESTAMP:date} %{exact:LOGLEVEL} %{exact:service}-<%{exact:Account},%{exact:status}>

I ran it and the result is an error:
Settings: Default pipeline workers: 4
The error reported is:
pattern %{exact:scheduler} not defined

How can I use exact patterns correctly? Thanks!

Yes, it's not grokking them correctly!
[0] "_grokparsefailure"

You can test your filter at http://grokdebug.herokuapp.com/
Have fun!

I didn't mean that you should use a pattern named "exact". I meant that you should use patterns that are more exact. For example, use a timestamp pattern like TIMESTAMP_ISO8601 to match your timestamp and the LOGLEVEL pattern to match your log level. See https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns for the available patterns (the most commonly used ones are in the grok-patterns file).
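Applying that to the sample line, one sketch of a full pattern (field names are illustrative, the full-width 【】 brackets are matched literally, and this is untested against the real log):

```
%{NOTSPACE:scheduler}>>>> %{TIMESTAMP_ISO8601:logTime} %{LOGLEVEL:loglevel} \[%{JAVACLASS:service}\] - <【%{DATA:account}】,%{WORD:status}>
```

Note the space before the hyphen: the original attempt wrote `%{DATA:service}-<`, but the log line actually contains `] - <`, which is one reason the earlier pattern could never match.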

I tested the pattern %{DATESTAMP:logTime} %{LOGLEVEL:info} against "2016-03-17 04:00:03,247 INFO", which passed!

For the rest I have no idea: the "AsynExecutor.contractSyncScheduler>>>>" part and "[cn.cmri.pds.neusoft.ContractSyncService] - <【PageInquiryAccountBalanceContectSrv】,success>".

Thanks to tatdat and Magnus.

I used the pattern %{GREEDYDATA:message} for the whole line, which passed.

The conf file is like below:

input {
file{

    path =>["/opt/pds/log/test.log"]
    type => "pds"

}
}
filter {
if [type] == "pds" {
grok {
match => { "message" => "%{GREEDYDATA:message}" }
}
}

}
output{
stdout { codec => rubydebug }
elasticsearch{
hosts =>["localhost:9200"]
index => "test_log"
}
}
When I run ./logstash -f conf/grok.conf, I can't find the index under the directory /usr/pds/elasticsearch-2.2.0/data/elasticsearch/nodes. Any help?


Use the APIs to list indexes. Don't inspect the file system.
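For example, the cat indices API lists all indexes (adjust the host and port to your setup):

```
curl 'localhost:9200/_cat/indices?v'
```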

Is new data being added to test.log? Or do you expect Logstash to read the file from the beginning? If so, read about the file input's start_position option.
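A sketch of the file input with start_position set (same path as in the config above). Note that Logstash records how far it has read in a sincedb file, so a file it has already seen will not be re-read just because this option is set:

```
input {
  file {
    path => ["/opt/pds/log/test.log"]
    type => "pds"
    start_position => "beginning"
  }
}
```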

I forgot to insert the data into test.log! Thanks for your suggestion!

Hi Magnus,
I have just started exploring Logstash.
Currently I am trying to load a log file into Elasticsearch, but it's taking a long time to create the index.
Please help me understand how I can tune the Logstash config.

File content is :
INFO : LM_36435 [Thu Apr 21 23:27:16 2016] : (20731|-1449096896) Starting execution of workflow [wf_test] in folder [Folder1] last saved by user [admin].

Config file :

input {

file {
path => "/logstash-2.1.1/conf/Infa_log/wf_test.log"
start_position => "beginning"
type => "infa_logs"
}
}

filter {

grok {
  match => [ "message", "%{WORD:Severity} : %{WORD:Message_code} \[%{DAY:Day} %{MONTH:Month} %{MONTHDAY:Day_of_Month} %{HOUR:Hour}:%{MINUTE:Min}:%{SECOND:Sec} %{YEAR:Year}\] : \(%{NOTSPACE:Num}\) %{GREEDYDATA:Message}" ]
}

}

output {

elasticsearch{
hosts => ["localhost:9200"]
index => "infa_log"
}

}

Regards,
Asrar

@pandith_asrar, please start a new thread for your unrelated question.