I have logs from different applications that I want to send to Elasticsearch.
Some of the log messages contain JSON, and I want to ship them to Elasticsearch as plain text. But in the Logstash log I see a "JSON parse failure":
[2021-07-22T07:08:51,808][ERROR][logstash.inputs.gelf ] JSON parse failure. Falling back to plain-text {:error=>#<LogStash::Json::ParserError: Unexpected character ('f' (code 102)): was expecting comma to separate Object entries
at [Source: (byte[])"{"facility":"fluentd","protocol ...
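For context, this looks like ordinary strict-JSON parsing rejecting a payload that is missing a comma between two object entries. A minimal Ruby reproduction of the same failure class (the payload here is made up to mimic the shape of the one in the error; Logstash itself parses with Jackson, so the exact wording differs):

```ruby
require 'json'

# Made-up payload with the comma between two object entries missing,
# the same malformation the Logstash error complains about
bad = '{"facility":"fluentd" "protocol":"tcp"}'

error = nil
begin
  JSON.parse(bad)
rescue JSON::ParserError => e
  error = e
end
puts error.message
```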
In the input section I use codec => "plain".
In the filter section I don't use any JSON filters.
How can I completely disable the JSON parser for this pipeline?
My Logstash config:
input {
  gelf {
    use_tcp => true
    port    => 5046
    remap   => false
    tags    => ["cicd-k8s-gelf"]
    codec   => "plain"
  }
}
filter {
  ruby { code => "event.set('@orig_timestamp', LogStash::Timestamp.new)" }
  if "cicd-k8s-gelf" in [tags] {
    ruby {
      code => '
        t  = Time.at(event.get("@orig_timestamp").to_i)
        t2 = Time.at(event.get("@timestamp").to_i)
        event.set("[@metadata][d_t]", t.strftime("%Y.%m.%d"))
        event.set("@diff_time", t - t2)
      '
    }
  }
}
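For reference, this is the timestamp arithmetic the ruby filter performs, runnable outside Logstash. The two epoch values are stand-ins (chosen to match the date in the log excerpt above), not real event data:

```ruby
require 'time'

# Stand-ins for the two event timestamps (values are made up)
orig_timestamp  = Time.at(1626937731).utc  # @orig_timestamp, set when the event enters the filter
event_timestamp = Time.at(1626937700).utc  # the event's own @timestamp

d_t  = orig_timestamp.strftime("%Y.%m.%d") # daily index suffix for [@metadata][d_t]
diff = orig_timestamp - event_timestamp    # @diff_time: processing lag in seconds

puts d_t   # "2021.07.22"
puts diff  # 31.0
```

The `[@metadata][d_t]` field only feeds the index name in the output section, so it never reaches Elasticsearch itself.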
output {
  if "cicd-k8s-gelf" in [tags] {
    elasticsearch {
      hosts           => ["ELK_SERVERS"]
      index           => "INDEX-%{[@metadata][d_t]}"
      user            => "${USER}"
      password        => "${PWD}"
      manage_template => false
      ilm_enabled     => false
    }
  }
}