I'm finalising a deployment for a tenant-based system where various systems push logs through Logstash.
Having Filebeat set document_type based on prospectors seems like a bad idea. What if people don't set this correctly, or make some other mistake? Is it overly taxing to have Logstash handle this instead, adding the document_type as necessary and relying on grok filters to handle the heavy lifting?
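Roughly what I have in mind on the Logstash side is something like this (just a sketch; the patterns and type names are placeholders, not our real config):

```
filter {
  # Only guess the type if the shipper didn't already declare one.
  if ![type] {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
      add_field => { "type" => "apache-access" }   # add_field only fires on a successful match
      tag_on_failure => []
    }
    if ![type] {
      grok {
        match => { "message" => "%{SYSLOGTIMESTAMP:syslog_ts} %{SYSLOGHOST:syslog_host} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:syslog_msg}" }
        add_field => { "type" => "syslog" }
        tag_on_failure => []
      }
    }
  }
}
```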
I feel like leaving the responsibility of setting document_type to Filebeat is a bad idea.
Would love to hear people's opinions.
Having Filebeat set document_type based on prospectors seems like a bad idea. What if people don't set this correctly, or make some other mistake?
Then they are going to have a bad time and will probably discover their mistake pretty soon.
Is it overly taxing to have Logstash handle this instead, adding the document_type as necessary and relying on grok filters to handle the heavy lifting?
Depending on how you write your filters, it may cost more to attempt to autodetect the kind of event.
The main reason I require log clients to pre-declare the type of events they're sending is that it makes it easier to write the filters, and I never have to worry about making sure logs can be unambiguously parsed.
What if you have two clients sending JSON logs, for example? It's trivial to detect that a log is most likely JSON and should be sent through a json filter, but then what? Are you going to look for the presence of certain sentinel fields to determine how to process the event? I'm not saying it's impossible, but I wouldn't want to do it.
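To make it concrete, here's roughly what the two approaches look like (the type names and sentinel fields below are made up for illustration):

```
filter {
  # With types pre-declared by the shipper, routing stays unambiguous:
  if [type] == "billing-json" {
    json { source => "message" }
  } else if [type] == "audit-json" {
    json { source => "message" }
  } else if [type] == "nginx-access" {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  }

  # Versus auto-detection: parse anything that looks like JSON, then guess
  # the type from sentinel fields and hope no two log formats ever collide.
  # if [message] =~ /^\s*\{/ {
  #   json { source => "message" }
  #   if [invoice_id] {
  #     mutate { add_field => { "type" => "billing-json" } }
  #   } else if [audit_action] {
  #     mutate { add_field => { "type" => "audit-json" } }
  #   }
  # }
}
```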
Cheers. That's a perfect answer.