Process 200 events per second within 100ms

I have a use case in which I have to process 200 events/second and each event should not take more than 100 ms to process.

I have used NiFi for this and it is taking around 500 ms for an event to complete, so I am planning to use Logstash now.
Details:
Input: Kafka topic
Output: JSON
The data is fixed-length text (2000 characters) and I have to create 30 JSON fields out of it using filters.
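For context, a minimal sketch of the kind of pipeline being described; the broker address, topic names, and field names/widths below are placeholder assumptions (the real record has 30 fixed-width fields, only three are shown):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topics => ["transactions-raw"]          # placeholder input topic
    codec => plain                          # events arrive as fixed-length text
  }
}

filter {
  # Cut the 2000-character record into named fields using fixed-width
  # regex captures; names and widths here are illustrative only.
  grok {
    match => {
      "message" => "(?<account_id>.{10})(?<txn_type>.{4})(?<amount>.{12})"
    }
  }
  # Trim the padding that fixed-width formats usually carry.
  mutate {
    strip => ["account_id", "txn_type", "amount"]
  }
}

output {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topic_id => "transactions-json"         # placeholder output topic
    codec => json                           # serialize each event as JSON
  }
}
```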

Is Logstash the right choice for this?

What drives this requirement? How are you measuring latency? Where are you outputting the generated JSON to?
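If there is no measurement in place yet, one rough way to get a per-event number is a ruby filter that compares the wall clock against the event's @timestamp; this is only a sketch and assumes @timestamp is set when the event enters the pipeline:

```
filter {
  ruby {
    # Hypothetical latency probe: milliseconds between "now" and @timestamp.
    # Only meaningful if @timestamp reflects the time the event was ingested.
    code => "event.set('[monitoring][latency_ms]', ((Time.now.to_f - event.get('@timestamp').to_f) * 1000).round(1))"
  }
}
```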

Is the 100ms latency a "hard" requirement, i.e. each event must be completed in less than 100ms, or a "soft" requirement, i.e. each event should on average be processed in less than 100ms?

As Logstash is a Java application, there may be situations (during GC) where an event takes longer than 100ms to be processed. This can be minimized with careful JVM sizing, but I doubt it can be completely avoided.
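As an illustration only, these are the kinds of settings one might adjust in Logstash's config/jvm.options to keep GC pauses short; the values are assumptions and need to be validated against your own workload:

```
# config/jvm.options (excerpt) -- illustrative values, not a recommendation
-Xms4g                      # pin the heap: equal min/max avoids resize pauses
-Xmx4g
-XX:+UseG1GC                # use the G1 collector, which targets shorter pause times
-XX:MaxGCPauseMillis=50     # pause-time goal (a target, not a guarantee)
```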

From a performance perspective, Logstash can handle these requirements with ease (screenshot taken of 2 parallel LS instances, each with 6 vCPUs and 6 GB heap space, handling ~20 input pipelines, some of them quite complex).

The requirement is as follows:
A customer makes a transaction and I receive the corresponding data in real time. I have to parse the received data based on a grok pattern and push it to Kafka as JSON. Another program will apply some business rules and send a reply back to the customer.

I have to convert the customer data received from Kafka into JSON using a Logstash filter and push it back to Kafka within 100ms.

100ms is a soft requirement, and in the worst case I can take up to 200ms.
I have 3 machines, each with 32GB RAM and 8 cores, to accomplish this.

I do have some complex filtering logic happening in Logstash, which makes me doubtful about achieving this.
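For what it's worth, the pipeline settings in logstash.yml that usually matter most for the latency/throughput trade-off look like this; the values are assumptions for an 8-core machine and would need benchmarking against the actual filters:

```
# logstash.yml (excerpt) -- illustrative values for an 8-core machine
pipeline.workers: 8        # one filter/output worker thread per core
pipeline.batch.size: 125   # smaller batches lower per-event latency, larger ones raise throughput
pipeline.batch.delay: 5    # ms to wait for a partial batch before dispatching it
```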
