I am new to the whole BELK stack (or the Elastic Stack, as it is called from 5.0.0), so apologies if the question is not well described.
I have logs that look like the example below, which I ship with Filebeat to Logstash.
Log example:
```
2/24/2016 6:18:20 AM +00:00|FdsResponder|Information|18437|{"ActionName":"ResumeUpload","Exception":null,"Metadata": .. well defined JSON
2/24/2016 6:18:21 AM +00:00|FdsResponder|Information|18436|{"ActionName":"ResumeDownload","Exception":null,"Metadata": .. well defined JSON
2/24/2016 6:20:21 AM +00:00|SDS Responder|Information|0|Loading manifest Guid: .. clear text
2/24/2016 6:20:22 AM +00:00|SDS Responder|Warning|4107|No load priority found for plug-in .. clear text
2/24/2016 6:20:23 AM +00:00|FdsResponder|Information|18437|{"ActionName":"ResumeUpload","Exception":null,"Metadata": .. well defined JSON
```
Logstash conf:
```
input {
  beats {
    port => 5044
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
```
Filebeat.yml:
```
filebeat:
  prospectors:
    -
      paths:
        - D:\filebeat\input*.log
      input_type: log
  registry_file: "C:/ProgramData/filebeat/registry"
output:
  logstash:
    hosts: ["10.114.21.91:5044"]
shipper:
logging:
  files:
    path: D:\filebeat\logs
    rotateeverybytes: 10485760 # = 10MB
  level: debug
```
What I would love to accomplish is, first of all, to ship only the lines of the log file whose second column equals FdsResponder to Logstash. Then have Logstash output only the JSON part of each log line, but using the date/time from the log line (column 1) as the @timestamp. Is this possible?
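From the docs I have found so far, something like this might work, but it is untested and the field names (`logdate`, `source`, `json_payload`) are just ones I made up. On the Filebeat side, `include_lines` takes a list of regexes and ships only matching lines:

```
filebeat:
  prospectors:
    -
      paths:
        - D:\filebeat\input*.log
      input_type: log
      # ship only lines containing |FdsResponder| (column 2 of the log)
      include_lines: ["\\|FdsResponder\\|"]
```

Then on the Logstash side, a grok to split the pipe-separated columns, a json filter on the last column, and a date filter on the first column:

```
filter {
  # split the |-separated line into named fields
  grok {
    match => { "message" => "^%{DATA:logdate}\|%{DATA:source}\|%{DATA:level}\|%{NUMBER:eventid}\|%{GREEDYDATA:json_payload}$" }
  }
  # parse the JSON column into top-level fields
  json {
    source => "json_payload"
  }
  # use column 1 as @timestamp; the Joda pattern may need adjusting
  # for e.g. "2/24/2016 6:18:20 AM +00:00"
  date {
    match => [ "logdate", "M/d/yyyy h:mm:ss a ZZ" ]
  }
}
```

Is this roughly the right approach, or should the FdsResponder filtering be done in Logstash (e.g. with a drop filter) instead of in Filebeat?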
Thank you all for your help and input.