Stream Identity in Multiline Codec?

Does the multiline codec have any way to supply your own definition of stream identity? I am feeding Logstash from a SINGLE Kafka topic using a multiline codec, but the messages on that topic come from different sources. Can I configure the multiline codec with a stream identity so that it stacks logs from different sources separately?
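
For context, this is roughly what my input looks like at the moment (broker address, topic name, and pattern are just placeholders, not my real config):

```
input {
  kafka {
    bootstrap_servers => "kafka:9092"   # placeholder broker address
    topics => ["app-logs"]              # the single topic, fed by several sources
    codec => multiline {
      # stack any line that does not start with a timestamp onto the previous line;
      # as far as I can tell there is no per-source stream identity setting here
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate  => true
      what    => "previous"
    }
  }
}
```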

I have been told not to use the multiline filter as it is deprecated.

What's posting the logs into Kafka? You'll always want to do multiline processing as close to the source as possible. While I agree that stream identity support in the multiline codec would be desirable, it's often the wrong way to solve the problem.

We are feeding our logs into Kafka using a tool called logspout, which essentially just listens to the stdout of Docker containers and ships that data line by line to wherever we tell it to. Unfortunately it is an extremely lightweight shipper and, as far as I can tell, has no multiline capabilities. Another option I've considered is sending stack traces inside JSON fields, but I haven't been able to make that work either. If I send Logstash this object:

{ "field1": "value1", "stackTraceField1": "stackTraceLine1\nstackTraceLine2\nstacTraceLine3" }

the newlines unfortunately seem to come through literally, so no luck with this approach so far either :confused:
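
In case it helps, this is roughly how I've been trying to consume that object; the gsub at the end is just an untested idea for turning literal "\n" sequences back into real line breaks, with the field name taken from the example above and the broker/topic values as placeholders:

```
input {
  kafka {
    bootstrap_servers => "kafka:9092"   # placeholder
    topics => ["app-logs"]              # placeholder
    codec => json                       # parse each record as a JSON object
  }
}

filter {
  # untested idea: if the stack trace field still contains the two literal
  # characters "\n" after JSON parsing, rewrite them into real line breaks
  mutate {
    gsub => ["stackTraceField1", "\\n", "
"]
  }
}
```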

I guess a good question would be: is there a way to send multiline messages TO Logstash, instead of concatenating them there? One option would be to use Beats as shippers, since I know they have multiline support, but unfortunately they do not fit into our current logging framework.
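
For reference, the Beats multiline support I mean is the kind of thing below in filebeat.yml (path and pattern are placeholders; newer Filebeat versions use `filebeat.inputs` instead of `filebeat.prospectors`):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log         # placeholder path
  # append any line that starts with whitespace to the previous line, which
  # covers the usual indented stack-trace continuation lines
  multiline.pattern: '^[[:space:]]'
  multiline.negate: false
  multiline.match: after
```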