I made an app that uses NLog to log to a file. Then I added functionality within the app to download Filebeat and ship the logs to a central Elastic Stack. I considered NLog.Targets.ElasticSearch, but I like the fact that Filebeat is its own process.
Then I noticed that some element in the chain adds its own timestamp, and that is the only date-type field I can index. So I tried to convert the datetime from my log, and that is when the pain started.
As I understand, there are two ways to do this:
- redirect my Filebeat output to Logstash, which would use the date filter,
- add an ingest pipeline that uses the date processor to Elasticsearch, and point Filebeat at it.
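For the second option, here is a minimal sketch of what the ingest pipeline could look like, in Kibana Dev Tools format. The pipeline name, the field names, and the assumption that the log line starts with an ISO 8601 timestamp are all mine; adjust the grok pattern to your actual NLog layout:

```json
PUT _ingest/pipeline/nlog-timestamp
{
  "description": "Extract the NLog timestamp from the message and use it as @timestamp",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:log_time} %{GREEDYDATA:log_message}"]
      }
    },
    {
      "date": {
        "field": "log_time",
        "formats": ["ISO8601"],
        "target_field": "@timestamp"
      }
    }
  ]
}
```

Filebeat is then pointed at it with `pipeline: "nlog-timestamp"` under `output.elasticsearch` in filebeat.yml.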
I tried to go with the second option because it skips Logstash, which feels cleaner. But when I tried to find documentation about adding a pipeline, I only found this link:
and there it says that this feature is experimental and may be removed.
And now I ask myself: is my scenario really so strange that I have to resort to experimental features? Why is it so hard to get the timestamp from my log into the Elastic Stack? Isn't having the most accurate time in the index critical? Maybe I'm missing something?
I would expect Filebeat to have a date processor, so that this critical information is available right from the start, but I guess I'm wrong. Can I format my date in NLog so that Filebeat recognizes it as a date type?
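On the NLog side, you can at least emit an ISO 8601 timestamp so that whatever parses the line downstream (an ingest date processor or Logstash's date filter) can match it with the `ISO8601` format. A sketch of a file target in NLog.config (target name and layout are illustrative; note that colons inside `${date:format=...}` must be escaped with `\:`):

```xml
<target xsi:type="File" name="logfile" fileName="app.log"
        layout="${date:format=yyyy-MM-ddTHH\:mm\:ss.fffzzz} ${level:uppercase=true} ${message}" />
```

Filebeat itself ships the line as a plain string in the `message` field, though, so formatting alone won't produce a date-typed field; the actual type conversion still has to happen in an ingest pipeline or in Logstash.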
Should I resort to Logstash to fix the timestamp?
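If Logstash turns out to be the answer, the date filter side is short. A sketch of a pipeline, again assuming the log line starts with an ISO 8601 timestamp (ports, hosts, and field names are placeholders):

```
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_time} %{GREEDYDATA:log_message}" }
  }
  date {
    match => ["log_time", "ISO8601"]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```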
Thanks! My post is a bit critical, but I love everything else about the Elastic Stack.