We use Flume at Gigya to ship logs and application events into
Elasticsearch (and Kibana).
We're using the flume-ng Elasticsearch sink, but we've encountered a few
issues with it. So we created an extended version of the logstash
serializer that the sink uses, which fixes some bugs in the
original one and adds some features that we needed (such as removing the
@fields node level, collecting objects, and generating a consistent document
ID per event).
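For context, a custom serializer like this is wired into the sink through the `serializer` property of the Flume agent configuration. A minimal sketch follows; the agent and sink names are arbitrary, and the serializer class name shown is a placeholder, not the actual class from our sources (the stock one it replaces is `ElasticSearchLogStashEventSerializer`):

```properties
# Hypothetical agent "agent" with an Elasticsearch sink "es".
agent.sinks.es.type = org.apache.flume.sink.elasticsearch.ElasticSearchSink
agent.sinks.es.hostNames = es-host1:9300,es-host2:9300
agent.sinks.es.indexName = flume
agent.sinks.es.indexType = logs
agent.sinks.es.clusterName = elasticsearch
agent.sinks.es.batchSize = 500
# Placeholder class name - point this at the extended serializer instead of
# the default ElasticSearchLogStashEventSerializer.
agent.sinks.es.serializer = com.example.flume.ExtendedLogStashEventSerializer
```

Swapping the class here is the only change needed on the agent side, as long as the serializer's jar is on Flume's classpath.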
We hope some of you find it useful; the sources for
the extended serializer are available here -
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/24e4792b-f5a4-4b30-ac66-64487064745c%40googlegroups.com.