Is there any way to send a dynamic log file (one that is always changing: new lines added, old ones modified, etc.) into Elasticsearch to work on with Kibana (visualizations, dashboards, etc.)?
My problem is that I have a dynamic file with log entries, and they are constantly changing. For example:
a ticket with an ID (which never changes) and a number of events, but that number can grow every minute.
I'm not sure why a log would modify an already logged line; that sounds like tampering. I'm interested in hearing your scenario if you want to share the details.
Out of the box I don't think modifying already logged lines is supported: Filebeat uses a registry to track the offset of the last processed line per file, which does not fit modifications.
If you create a script that watches your log file and writes new lines plus modified lines to a new file, we would be able to ingest it.
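Such a watcher could be as simple as diffing line-by-line snapshots of the file and appending anything new or changed to a second file that Filebeat tails. A minimal sketch (the file paths and the polling scheme are assumptions, not part of any product):

```python
from pathlib import Path

def changed_lines(previous, current):
    """Return the lines of `current` that are new, or differ from the
    line at the same position in the `previous` snapshot."""
    return [line for i, line in enumerate(current)
            if i >= len(previous) or line != previous[i]]

def sync(source: Path, target: Path, previous: list) -> list:
    """Append new/modified lines of `source` to `target` (append-only,
    so Filebeat's offset tracking works) and return the new snapshot."""
    current = source.read_text().splitlines() if source.exists() else []
    delta = changed_lines(previous, current)
    if delta:
        with target.open("a") as f:
            f.write("".join(line + "\n" for line in delta))
    return current
```

You would call `sync` on a timer (cron, or a `while True` loop with `time.sleep`), keeping the returned snapshot for the next iteration, and point Filebeat at the append-only target file. Note this treats a modified line as a fresh event, so deduplication still has to happen downstream.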
I have logs which, for a given ID, contain some information.
This information can be updated at any time. For example, for ID 712, at time 0 there are 700 events, but after a few minutes there are 1000+ events.
The log file is updated by a script, but the problem is that the inode of the file changes, so it isn't possible to use Logstash: in that case I get duplicated lines in Kibana. I was wondering whether there is any way to ingest this information and have it updated automatically, so I can work on it in Kibana without duplicates.
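One way around the duplicates (not discussed above, just a sketch) is to let Elasticsearch upsert by the ticket's own ID instead of indexing every line as a new document. Assuming a field named `ticket_id` has already been extracted in a filter (the field name and index name here are hypothetical), a Logstash output along these lines would overwrite the existing document whenever an updated line for the same ticket is re-ingested:

```
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "tickets"
    # Use the ticket's ID as the document _id: re-ingesting a modified
    # line updates the existing document instead of creating a duplicate.
    document_id   => "%{ticket_id}"
    action        => "update"
    doc_as_upsert => true
  }
}
```

With this, the inode churn only costs you re-reads, not duplicates, since each ticket maps to exactly one document in the index.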