As I was working through Ricardo's YouTube video I ended up with the questions below. I posted them in the video's comment section, but I'm thinking they might get more traction reposted here.
I'm still a total noob with regard to the Elastic Stack, and I'm piecing together what's possible and what's not, and what's a good idea and what isn't.
#1
A technical question: when the prospector looks for new files, is this based on the file name or the inode? With file rotation, today's file is compressed and renamed tonight and a new file is then created with the same name, which implies the registry entry needs to be reset to line 0.
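For context on what I've pieced together so far (please correct me if I'm wrong): the registry appears to track files by inode and device rather than by name, so a renamed file should keep its read offset while the freshly created file starts from 0. The rotation-related knobs I found look roughly like this (paths are made up, and I'm not sure these are the right settings):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/app.log      # hypothetical rotated log
    close_renamed: true           # release the handle once rotation renames the file
    clean_removed: true           # drop registry entries for deleted files
```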
#2
Hehe, apologies for all the questions. I noticed you're also on a Mac, and that you're not doing a sudo on each command (re Filebeat). Did you change the ownership of the files so Filebeat can operate, or did you do a `sudo su -` when we weren't looking? When you created new files you also never modified permissions.
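To frame the question, this is roughly what I ended up doing on my own Mac (the paths below are from my install and are almost certainly not the only way):

```shell
# assumption: filebeat config/data live under these paths on this machine
sudo chown -R "$(whoami)" /usr/local/etc/filebeat   # take ownership of the config dir
sudo chown -R "$(whoami)" /usr/local/var/lib/filebeat  # and the registry/data dir
filebeat -e -c filebeat.yml                         # then no sudo needed per run
```

Is that what you did, or is there a cleaner approach?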
#3
Adding to the structured event: in #1 you extract the main start and end as an event. What if the main "loop" includes sub-loops that you want to show themselves? Think of a large batch with a start and an end, but inside that large batch you have multiple looping processes that you want to show as they cycle, and not wait for the main batch start/end to complete.
... For structured events, if the start and end include an event ID, can they be associated with each other?
In its current form your example fits a batch process starting and ending, not many transactions that can end up being interlaced?
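To make the interlacing concrete, this is the kind of (made-up) log I have in mind, where the batch and its sub-loops each carry their own ID:

```
2019-06-01 20:00:00 BATCH-START id=B1
2019-06-01 20:00:05 LOOP-START  id=B1.L1
2019-06-01 20:00:07 LOOP-START  id=B1.L2
2019-06-01 21:00:00 LOOP-END    id=B1.L1
2019-06-01 22:00:00 LOOP-END    id=B1.L2
2019-06-01 23:59:00 BATCH-END   id=B1
```

i.e. can the start/end pairs be matched on `id` even when they interleave, and can the sub-loop events be surfaced as they complete?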
#4
Question: when shipping via Kafka, how can you execute the Kibana configuration? I'm thinking you might have a setup where the sources (the Beats) only have access to the Kafka brokers and not to the Elasticsearch or Kibana servers / network layers / subnets / firewalls, etc.
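Is the intended pattern something like the below, run once from a host that does have Kibana/Elasticsearch access, while the Beats themselves only ever talk to the brokers? (Host names are invented; I'm guessing at the workflow here.)

```shell
# one-off dashboard/template setup from a machine that can reach the stack
filebeat setup -E 'setup.kibana.host=kibana.internal:5601' \
               -E 'output.elasticsearch.hosts=["es.internal:9200"]'
```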
#5
... With one Filebeat process running, I see we can specify the topic based on a "when" clause, and I noticed you can also include a Kafka message key (helping make sure all messages for a key, maybe one key per file, stay in order on a topic, localised to a partition). Question: in a scenario where I don't want to use a Kafka key, can I then split the output to different topics (or even indexes) based on the originating input file?
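For reference, this is the shape of config I'm picturing; the topic names and the field used in the conditions are just my guesses (the file-path field name may differ by version):

```yaml
output.kafka:
  hosts: ["broker1:9092"]
  topic: "logs-default"            # fallback topic
  topics:
    - topic: "logs-app"
      when.contains:
        log.file.path: "app.log"   # route by originating file (field name assumed)
    - topic: "logs-db"
      when.contains:
        log.file.path: "db.log"
  # key: '%{[log.file.path]}'      # optional: pin each file's messages to one partition
```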
#6
... Hoping there are similar AuditBeat, PacketBeat, MetricBeat, and WinLogBeat videos. If yes, please update the video description with links to them.
#7
With the heroes 04 example ... you pulled the config into a separate filebeats.yml file. Does this imply you will run two processes, or can you pull this into the main file, with this file still going to its own index/pipeline, and the other to the /var/log/*.log's index...
Just thinking: you might have multiple files in the same directory, and you want each to go into its own index, some single-line, some multi-line, some structured, etc. ... Expanding on this, I might want to have a single filebeat.yml process running, but push each source log onto its own Kafka topic, to then be pushed via a Kafka connector into its own index.
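Roughly what I have in mind for the single-process version (all names and patterns are invented on my side, just to show the shape):

```yaml
filebeat.inputs:
  - type: log
    paths: ["/var/log/app/batch.log"]   # multi-line/structured source
    multiline.pattern: '^\d{4}-'        # assumed timestamp-prefixed lines
    multiline.negate: true
    multiline.match: after
    fields:
      log_topic: "batch-logs"
  - type: log
    paths: ["/var/log/app/access.log"]  # plain single-line source
    fields:
      log_topic: "access-logs"

output.kafka:
  hosts: ["broker1:9092"]
  topic: '%{[fields.log_topic]}'        # each input lands on its own topic
```

...with a Kafka Connect Elasticsearch sink then mapping each topic onto its own index. Is that a sane setup, or am I overcomplicating it?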
G