C:\Users\bhagupta\elk_poc\logstash-2.3.2\logstash-2.3.2\bin>logstash -f logstash.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 4
Note: after this, nothing gets displayed on the console.
Below are the contents of my logstash.conf file:
input {
  file {
    path => ["C:/Users/bhagupta/elk_poc/output.log"]
    type => "apache"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {}
}
I guess it's unable to read the contents of the output.log file. Can somebody help me resolve this problem?
You can ignore the "io/console not supported; tty will not be manipulated" message.
What's the modification time of the input file (output.log)? If it's older than 24 hours, make sure you adjust the file input's ignore_older setting.
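As a sketch, the input block from the original post could be adjusted along these lines. Two hedged assumptions here: on Windows the Unix path "/dev/null" doesn't exist, so the NUL device is commonly used for sincedb_path instead; and the large ignore_older value (in seconds) is only an illustrative cutoff, not a recommended setting:

```conf
input {
  file {
    path => ["C:/Users/bhagupta/elk_poc/output.log"]
    type => "apache"
    start_position => "beginning"
    # "/dev/null" is a Unix path; on Windows the null device is NUL
    sincedb_path => "NUL"
    # raise the age cutoff so files modified long ago aren't skipped;
    # 31536000 seconds = one year, chosen purely as an example
    ignore_older => 31536000
  }
}
```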
I'm trying to parse my application logs using Logstash filters. The log file contents look like this:
17 May 2016 11:45:53,391 [tomcat-http--10] INFO com.visa.vrm.aop.aspects.LoggingAspect - RTaBzeTuarf |macBook|com.visa.vrm.admin.controller.OrgController|getOrgs|1006
I'm trying to create a dashboard (line chart) using Logstash and want to show the activities on it. For example, a request comes in from some server with a correlation id, and I have to see which class it calls, with the corresponding method, and how long it took to execute.
The message is like
correlation id | server-name | class name | method name | time taken
Log file example:
RTaBzeTuarf |macBook|com.visa.vrm.admin.controller.OrgController|getOrgs|1006
Do I have to create grok filters, or do I have to define some pattern? Please advise.
Yes, you need a grok filter to extract discrete fields from your logs. You probably won't find a singular pre-cooked grok pattern that's usable out of the box, but there are certainly building blocks that you can put together.
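For the pipe-delimited message shown above, a grok expression assembled from standard building blocks might look like the sketch below. The field names (correlation_id, server_name, and so on) are my own choices, not anything predefined, and you should verify the pattern against your real log lines:

```conf
filter {
  grok {
    match => {
      # DATA is non-greedy, so it stops at each pipe; \s* absorbs the
      # stray space before the first "|" in the sample line
      "message" => "%{DATA:correlation_id}\s*\|%{DATA:server_name}\|%{JAVACLASS:class_name}\|%{WORD:method_name}\|%{NUMBER:time_taken:int}"
    }
  }
}
```

If you also want the leading timestamp, thread, and log level from the full line, you can prepend building blocks such as %{MONTHDAY} %{MONTH} %{YEAR} %{TIME}, \[%{DATA:thread}\], and %{LOGLEVEL:level}, and then parse the extracted timestamp with a date filter.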