Running Windows 10, with Logstash 8.1.0 and Elasticsearch, Kibana, and Filebeat 8.0.0, all on the same machine.
Getting the data from Filebeat to Kibana works great, but the problem is that the whole log line ends up in the message field, and we want a separate field for every value in the log.
Here is the pipeline, which is in C:\Program Files\logstash-8.1.0\first-pipeline.conf:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{DATESTAMP:DateTime},%{PName:Program},%{POSINT:ProcessID},%{POSINT:UsageCPU}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "my password"
    ssl => true
    ssl_certificate_verification => false
    # The name of the index
    index => "<logstash_filelog>"
    ilm_enabled => false
  }
}
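I am also considering extending the filter block like this (just a sketch, not tested yet, and assuming the dates in the log are day.month.year): since grok captures everything as strings, a date filter should map DateTime onto @timestamp, and a mutate/convert should make ProcessID and UsageCPU numeric so they can be aggregated in Kibana:

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{DATESTAMP:DateTime},%{PName:Program},%{POSINT:ProcessID},%{POSINT:UsageCPU}" }
  }
  # assuming day.month.year, as in 08.03.2022 14:53:41
  date {
    match => ["DateTime", "dd.MM.yyyy HH:mm:ss"]
  }
  # grok captures are strings; convert to integers for numeric aggregations
  mutate {
    convert => {
      "ProcessID" => "integer"
      "UsageCPU"  => "integer"
    }
  }
}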
Following this guide, I have also created the patterns directory that the custom PName pattern comes from: Grok filter plugin | Logstash Reference [8.0] | Elastic
C:\Program Files\logstash-8.1.0\patterns\filebeatpattern.txt
# regex for the program name in the Filebeat log, e.g. kibana
PName [A-Za-z]\w+
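To check that the custom pattern resolves, I think a minimal test pipeline like this should work (a sketch only; it reads a pasted log line from stdin, prints the parsed event, and assumes patterns_dir is given as an absolute path rather than the relative ./patterns):

input {
  stdin { }
}
filter {
  grok {
    patterns_dir => ["C:/Program Files/logstash-8.1.0/patterns"]
    match => { "message" => "%{DATESTAMP:DateTime},%{PName:Program},%{POSINT:ProcessID},%{POSINT:UsageCPU}" }
  }
}
output {
  stdout { codec => rubydebug }
}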
Here is the structure of the log file:
DateTime, Program, PID, CPU Usage
08.03.2022 14:53:41,Chrome,1715,58
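If the grok pattern matches, each event should end up with fields roughly like this (a sketch of the relevant part of a rubydebug output for the sample line above; the values stay strings unless converted):

{
     "DateTime" => "08.03.2022 14:53:41",
      "Program" => "Chrome",
    "ProcessID" => "1715",
     "UsageCPU" => "58"
}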
Under Management -> Stack Management, I can now see the index.
The pipeline seems to be working, but how do I visualize the data from this pipeline in Kibana?
Any help would be appreciated.
Thanks in advance.