Codec only supports the first pipeline

I am trying to use multiple pipelines to collect logs, parse them, and output to Kafka, but the logs use different encodings (e.g. UTF-8, GBK, etc.).

I used pipelines.yml to set up several log inputs listening on UDP ports 514 and 515. By default, 514 is for UTF-8 and 515 is for GBK. I got many parse failures in the Logstash log saying it expected the log encoding to be UTF-8.

  - pipeline.id: p514
    pipeline.workers: 1
    pipeline.batch.size: 1
    config.string: "input { syslog { port => 514 } } output { stdout { } }"
  - pipeline.id: p515
    queue.type: persisted
    pipeline.workers: 1
    pipeline.batch.size: 1
    config.string: "input { syslog { port => 515 codec => plain { charset => 'GBK' } } } output { stdout { } }"
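
For reference, the port 514 input above relies on the plain codec's default charset, which as far as I know is UTF-8; writing it out explicitly would look like this (just a sketch, I have not tested whether it changes the behaviour):

  - pipeline.id: p514
    pipeline.workers: 1
    pipeline.batch.size: 1
    config.string: "input { syslog { port => 514 codec => plain { charset => 'UTF-8' } } } output { stdout { } }"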

Then I changed the 514/515 order, as shown below, and it worked after running it again.

  - pipeline.id: p515
    queue.type: persisted
    pipeline.workers: 1
    pipeline.batch.size: 1
    config.string: "input { syslog { port => 515 codec => plain { charset => 'GBK' } } } output { stdout { } }"
  - pipeline.id: p514
    pipeline.workers: 1
    pipeline.batch.size: 1
    config.string: "input { syslog { port => 514 } } output { stdout { } }"
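
To double-check which pipeline decoded each event, I think tagging each input would make it visible in the stdout output; this is only a sketch using the common tags option and the rubydebug codec, not something I have verified with this setup:

  - pipeline.id: p515
    config.string: "input { syslog { port => 515 codec => plain { charset => 'GBK' } tags => ['gbk'] } } output { stdout { codec => rubydebug } }"
  - pipeline.id: p514
    config.string: "input { syslog { port => 514 tags => ['utf8'] } } output { stdout { codec => rubydebug } }"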

So I guess the codec only applies to the first pipeline, doesn't it?

Well that should not happen.

What version of Logstash are you using?
