Hello all... I am very new to ELK, and I have what may be a stupid question, but I've searched and searched without any luck. Apologies if this is in some FAQ I have overlooked.
I have many exciting use-cases I am working on, but what I thought would be the simplest is giving me the most trouble.
Simply put, the use-case is to centralize a distributed application's logs and make them available.
I do not need any search capability whatsoever, beyond letting a user enter a start date and an end date and returning all the log entries for that time period (no need to even aggregate logs, or search across them, although that may come later).
What is challenging me is the log format, or lack thereof... the logs are a mess. They contain a mix of log4j-style entries with a nice timestamp, but with stack traces and XML messages dumped into the log periodically as well. What is tripping me up is that the stack traces and XML messages are multi-line... and they DO NOT contain timestamps.
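For what it's worth, this is a common enough problem that the shippers have built-in support for it: you can join continuation lines onto the preceding timestamped line before anything reaches Elasticsearch. A minimal sketch using Logstash's multiline codec (the path is hypothetical, and the pattern assumes your log4j entries start with an ISO8601-style timestamp; adjust it to match your actual layout):

```
input {
  file {
    path => "/var/log/myapp/app.log"   # hypothetical path
    codec => multiline {
      # Any line that does NOT start with a timestamp (stack-trace
      # frames, XML dumps) is glued onto the previous event.
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate  => true
      what    => "previous"
    }
  }
}
```

The net effect is that a stack trace or XML blob becomes part of the log4j entry that preceded it, so it inherits that entry's timestamp and you never have to prepend anything yourself. Filebeat has an equivalent `multiline` option if you are shipping with Beats instead.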
I could see pre-processing the logs so that I prepend a timestamp to each line whether it has one or not, but I have the following concerns:
- I'd rather not
- The timestamps only go down to the second, which means I'll likely have many duplicates... I am concerned that when the logs are searched, the records should come out in the same order they were originally written to the log.
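On the ordering concern above: one approach I've seen is to keep a monotonically increasing tiebreaker alongside the timestamp and sort on both. If you ship with Filebeat, it records the byte offset of each line in the source file as `log.offset`, which preserves write order within a file. A sketch of the query side, assuming that field is present (index name and dates are placeholders):

```
GET /logs-*/_search
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2024-01-15T00:00:00Z",
        "lte": "2024-01-16T00:00:00Z"
      }
    }
  },
  "sort": [
    { "@timestamp": "asc" },
    { "log.offset": "asc" }
  ]
}
```

Events sharing the same one-second timestamp then come back in file order, so same-second duplicates are no longer a problem for display.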
Before I got into the tech, I assumed this would be the lay-up use-case... and perhaps I am just being thick and it still is. I am being flip above when I say "I'd rather not"; what I mean is that I'd thought nothing more would be necessary to fulfill this use-case than shipping the logs and getting them pumped into Elasticsearch... but I am a bit flummoxed right now.
Apologies in advance if this is super-basic; flame away if that's the case...