Hi everyone!
I'm indexing .log data with Logstash using a grok filter, but partway through, indexing suddenly stops. I ran Logstash with the --debug flag to see what's going on, and it just keeps looping these messages:
[2021-06-18T17:55:39,974][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2021-06-18T17:55:40,131][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-06-18T17:55:40,131][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-06-18T17:55:41,099][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
Here's my simple config file:
input {
  file {
    path => ["E:/Belajar Ngoding/Elasticsearch/Hadoop.log"]
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  grok {
    match => { "message" => [' %{DATESTAMP:Time} %{LOGLEVEL:logLevel} %{LOGLEVEL:logMain} {%GREEDYDATA:logMessage}'] }
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "HadoopLog"
    user => "elastic"
    password => "sigh"
  }
  stdout {}
}
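In case it helps narrow things down, here's a minimal sketch of how I could test just the grok filter in isolation (stdin input, rubydebug output, no Elasticsearch; the pattern is copied unchanged from my config above):

input {
  stdin {}
}

filter {
  grok {
    match => { "message" => [' %{DATESTAMP:Time} %{LOGLEVEL:logLevel} %{LOGLEVEL:logMain} {%GREEDYDATA:logMessage}'] }
  }
}

output {
  stdout { codec => rubydebug }
}

With this I could paste sample lines from Hadoop.log directly into the console and see whether the pattern matches or produces a _grokparsefailure tag.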
Can someone tell me what I'm doing wrong? Thanks!