Logstash vs building a custom data stream

Thought I would pose a question to the gurus and see what the general thought would be...

I have a stream of "transaction" data coming from some proprietary software that I want to put into ES. The stream passes through a Java plugin which I write and control, so in theory I can do almost anything with it. These are the two options I have come up with:

#1: Take the data, write it to a log file with Log4j, and then use Filebeat and Logstash (filters etc.) to get it loaded into the transactions index (a rough sketch of the logging side is below the pros/cons).

Pros: if Logstash stops/starts I don't lose any data, as Filebeat picks up from where it last processed.
Cons: extra puzzle pieces to break and keep working, plus more disk I/O and resources used to write and then read back the log file.
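
For what it's worth, here is a minimal sketch of what the logging side of #1 might look like, assuming Log4j 2 with a file appender wired to a logger named "transactions" (the class name, logger name, and field names are made up for illustration; Filebeat would then ship the resulting file to Logstash):

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Illustrative sketch only: assumes a Log4j 2 file appender is configured
// for the "transactions" logger (logger and field names are hypothetical).
public class TransactionAuditLogger {

    private static final Logger LOG = LogManager.getLogger("transactions");

    // Write one transaction as a single JSON line so the Logstash json
    // filter (or Filebeat's JSON decoding) can parse it without grok.
    // A real implementation would use a JSON library to handle escaping.
    public void log(String transactionId, String type, long amountCents) {
        String json = String.format(
            "{\"transactionId\":\"%s\",\"type\":\"%s\",\"amountCents\":%d,\"ts\":%d}",
            transactionId, type, amountCents, System.currentTimeMillis());
        LOG.info(json);
    }
}
```

Logging each event as one JSON line keeps the Logstash side down to a simple json filter rather than grok patterns.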

#2: In my plugin, use the Elasticsearch REST API and/or Bulk API to push the data straight into ES (a rough sketch follows below).
Pros: fewer pieces to break and easier on resources (I think).
Cons: if ES goes down I will lose data from the stream, but that's not critical to me; the data is used for monitoring, so I don't need 100% of it and some loss is acceptable.
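
And a minimal sketch of #2, assuming plain HTTP against the _bulk endpoint using the Java 11+ HttpClient (the index name, host, and document shape are assumptions; the official Elasticsearch Java client's bulk helpers would add batching, backpressure, and retries on top of something like this):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

// Illustrative sketch only: host, index name, and error handling are assumptions.
public class BulkIndexer {

    private static final String BULK_URL = "http://localhost:9200/transactions/_bulk";
    private final HttpClient http = HttpClient.newHttpClient();

    // Send a batch of JSON documents in one _bulk request. The body is NDJSON:
    // an action line ({"index":{}}) followed by the document source, per doc.
    public void indexBatch(List<String> jsonDocs) throws Exception {
        StringBuilder body = new StringBuilder();
        for (String doc : jsonDocs) {
            body.append("{\"index\":{}}\n").append(doc).append('\n');
        }
        HttpRequest request = HttpRequest.newBuilder(URI.create(BULK_URL))
            .header("Content-Type", "application/x-ndjson")
            .POST(HttpRequest.BodyPublishers.ofString(body.toString()))
            .build();
        HttpResponse<String> response =
            http.send(request, HttpResponse.BodyHandlers.ofString());
        // On failure just log and drop the batch, since some loss is acceptable here.
        if (response.statusCode() >= 300) {
            System.err.println("Bulk request failed: " + response.statusCode());
        }
    }
}
```

Batching records into one _bulk request per flush keeps the HTTP overhead low at this kind of volume.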

On the volume side of things it's quite low for ES, at around 1 million records a day (roughly 12 per second)...
Interested to see what anyone thinks of my little problem :slight_smile:

-Julian

I'd go with #1. It means you can also change where you send things in the future with no changes to the Java app.
