Can I use both JSON and plain logs in a common Filebeat input?

Hi,

I'm shipping classic nginx logs (access/error) with Filebeat, which works great. My application also emits JSON-encoded logs, which also work great if I send only those.

Things get complicated when I try to combine both of these inputs into the same Logstash. To parse the application logs I simply add a codec => json to the input, but since the nginx logs are plain text, JSON parsing fails for them.

How can I solve this while still sending both kinds of logs to the same Logstash instance?
Can I filter in the input section?

Thanks!

No, but you can use a plain codec in the input and either a) selectively apply a json filter if the line looks like JSON (e.g. begins with a left curly brace) or b) unconditionally apply a json filter and ignore any parsing errors.
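For option a), something like this pipeline sketch should work (the beats port is assumed, adjust to your setup):

input {
  beats {
    port => 5044             # assumed port; the beats input's default codec is plain
  }
}

filter {
  # Only attempt JSON parsing when the line looks like a JSON object.
  if [message] =~ /^\{/ {
    json {
      source => "message"
    }
  }
}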

I thought the json codec falls back to plain text if it can't parse properly.

Yes, I think you're right. But does the codec add a tag if there's a failure? One might want to distinguish messages that were JSON from plain-text messages, which the _jsonparsefailure tag added by the json filter would do.
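For option b), that tag lets you route events downstream. A sketch; the hosts and index names here are made up:

filter {
  # Unconditionally try to parse; lines that aren't JSON keep their
  # original message and get tagged _jsonparsefailure.
  json {
    source => "message"
  }
}

output {
  if "_jsonparsefailure" in [tags] {
    # Plain-text events, e.g. the nginx logs.
    elasticsearch {
      hosts => ["localhost:9200"]        # assumed host
      index => "plain-%{+YYYY.MM.dd}"    # hypothetical index name
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]        # assumed host
      index => "json-%{+YYYY.MM.dd}"     # hypothetical index name
    }
  }
}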

One might also want to check whether the codec logs anything about the parsing failures; with a lot of plain-text messages, that could produce a lot of garbage in the Logstash log.