Grok code to get formatted output

#My filebeat0.yml

filebeat:
  prospectors:
    -
      paths:
        - c:/Users/akashsoni/Downloads/k/logstash-tutorial/tutorial/*.log
        - c:/Users/akashsoni/Downloads/k/logstash-tutorial/tutorial/*.err
      input_type: log
  registry_file: "C:/ProgramData/filebeat/registry"

output:
  logstash:
    hosts: ["localhost:5044"]

shipper:

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB

#my logstash.conf

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    # What should I write here so that only events containing words like
    # 'Exception', 'warning', or 'Error' in the log are stored in
    # Elasticsearch, and everything else is not?
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}


My log files are of different formats: one is Java and the other is nginx, both stored in a single folder. One of the log files is 19.2 GB in size.
#My Java log :
javax.mail.internet.AddressException: Missing local name in string ``@gmail.com''
at javax.mail.internet.InternetAddress.checkAddress(InternetAddress.java:1209)
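
A side note on Java logs like this: stack traces span multiple lines, and without multiline handling Filebeat ships every line as a separate event, so the `at ...` lines get separated from the exception message. A minimal sketch of the Filebeat 1.x prospector-level multiline option (assuming continuation lines start with whitespace, as they do in the sample above):

```yaml
filebeat:
  prospectors:
    -
      paths:
        - c:/Users/akashsoni/Downloads/k/logstash-tutorial/tutorial/*.log
      input_type: log
      multiline:
        # any line starting with whitespace (e.g. "    at javax.mail...")
        # is appended to the line before it
        pattern: '^[[:space:]]'
        match: after
```

With this in place the whole stack trace arrives in Logstash as one event, so a single grok match against the exception line applies to the full trace.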

So you want to capture events with "exception", "warning", or "error" anywhere in the message and send them to Elasticsearch and drop all other events?

Yup. Is it possible through Logstash?

Yes. Please tell me the code and where to apply it to achieve the required goal.

Sir, will you help me with that?

Please have patience, we are all volunteers here and someone will respond to you when they can.

Okay sir.

The easiest way I see to send only messages containing 'Exception', 'warning', or 'Error' to Elasticsearch:

output {
  # regexp conditionals use /.../ delimiters
  if [message] =~ /Exception|warning|Error/ {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}

So there is no need for a grok filter.

Thank you sir.

But how could I do this with grok instead of the conditional in the output?

Here is the configuration you can use:

filter {
  grok {
    # capture the first occurrence of one of the keywords into message_type
    match => { "message" => "(?<message_type>Exception|warning|Error)" }
  }

  # drop every event in which none of the keywords matched
  if ![message_type] {
    drop {}
  }
}
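
One caveat on the pattern above (a variant sketch, not something given in the thread): grok matching is case-sensitive, so lines containing `ERROR` or `exception` would not match and would be dropped. Grok patterns are Oniguruma regexes, so an inline `(?i)` flag makes the match case-insensitive:

```
filter {
  grok {
    # (?i) = case-insensitive, so ERROR / exception / Warning all match
    match => { "message" => "(?i)(?<message_type>Exception|warning|Error)" }
  }
  if ![message_type] {
    drop {}
  }
}
```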

Thank you sir for giving your valuable time :slight_smile:

PS C:\Program Files\Filebeat> ./filebeat -e -c filebeat0.yml -d "publish"
2016/05/05 07:26:16.287762 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/05/05 07:26:16.287762 logstash.go:106: INFO Max Retries set to: 3
2016/05/05 07:26:16.292766 outputs.go:126: INFO Activated logstash as output plugin.
2016/05/05 07:26:16.292766 publish.go:232: DBG Create output worker
2016/05/05 07:26:16.292766 publish.go:274: DBG No output is defined to store the topology. The server fields might not
be filled.
2016/05/05 07:26:16.292766 publish.go:288: INFO Publisher name: akash
2016/05/05 07:26:16.297770 async.go:78: INFO Flush Interval set to: 1s
2016/05/05 07:26:16.297770 async.go:84: INFO Max Bulk Size set to: 2048
2016/05/05 07:26:16.297770 async.go:92: DBG create bulk processing worker (interval=1s, bulk size=2048)
2016/05/05 07:26:16.298770 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.1
2016/05/05 07:26:16.298770 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/05/05 07:26:16.299771 registrar.go:66: INFO Registry file set to: C:\Program Files\Filebeat\.filebeat
2016/05/05 07:26:16.299771 registrar.go:76: INFO Loading registrar data from C:\Program Files\Filebeat\.filebeat
2016/05/05 07:26:16.332015 prospector.go:132: INFO Set ignore_older duration to 0
2016/05/05 07:26:16.332015 prospector.go:132: INFO Set close_older duration to 1h0m0s
2016/05/05 07:26:16.333016 prospector.go:132: INFO Set scan_frequency duration to 10s
2016/05/05 07:26:16.333016 prospector.go:89: INFO Invalid input type set:
2016/05/05 07:26:16.333016 prospector.go:92: INFO Input type set to: log
2016/05/05 07:26:16.333016 prospector.go:132: INFO Set backoff duration to 1s
2016/05/05 07:26:16.333016 prospector.go:132: INFO Set max_backoff duration to 10s
2016/05/05 07:26:16.333016 prospector.go:112: INFO force_close_file is disabled
2016/05/05 07:26:16.333016 prospector.go:132: INFO Set ignore_older duration to 0
2016/05/05 07:26:16.333016 prospector.go:132: INFO Set close_older duration to 1h0m0s
2016/05/05 07:26:16.334016 prospector.go:132: INFO Set scan_frequency duration to 10s
2016/05/05 07:26:16.334016 prospector.go:89: INFO Invalid input type set: err
2016/05/05 07:26:16.334016 prospector.go:92: INFO Input type set to: log
2016/05/05 07:26:16.334016 prospector.go:132: INFO Set backoff duration to 1s
2016/05/05 07:26:16.334016 prospector.go:132: INFO Set max_backoff duration to 10s
2016/05/05 07:26:16.334016 prospector.go:112: INFO force_close_file is disabled
2016/05/05 07:26:16.334016 prospector.go:142: INFO Starting prospector of type: log
2016/05/05 07:26:16.333016 spooler.go:77: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016/05/05 07:26:16.335018 prospector.go:142: INFO Starting prospector of type: log
2016/05/05 07:26:16.356031 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000001.err
2016/05/05 07:26:16.357032 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000101.err
2016/05/05 07:26:16.358033 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000201.err
2016/05/05 07:26:16.359033 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000401.err
2016/05/05 07:26:16.359033 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000301.err
2016/05/05 07:26:16.360034 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000601.err
2016/05/05 07:26:16.360034 log.go:113: INFO Harvester started for file: c:\Users\akashsoni\Downloads\k\logstash-tutorial
\tutorial\20160402000502.err
2016/05/05 07:26:19.661256 crawler.go:78: INFO All prospectors initialised with 3064 states to persist
2016/05/05 07:26:19.663256 registrar.go:83: INFO Starting Registrar
2016/05/05 07:26:19.664257 publish.go:88: INFO Start sending events to output

This is my Filebeat output: the cursor is blinking continuously but it is not showing any output. Why?
When I use the same config as written above but with only the .log extension, it works fine.
Earlier this same config was running for both .err and .log files; why?

Hi,

It is a completely different problem, so I invite you to create a new topic for that.

Okay sir. I tried for two days to get that grok working and you solved it :slight_smile:
Where can I learn grok from?
Thank you again sir.

You're welcome.

You can learn more about grok here :
https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html