Hi,
So I have been playing with forwarding events from QRadar to the Elastic Stack. I have configured QRadar to forward events to Logstash via TCP in JSON format.
On the Logstash side I have the following configuration.
Input:
input {
  tcp {
    port => 5141
    codec => "json_lines"
  }
}
Filter:
filter {
  if [type] in [ "Event", "Flow" ] {
    mutate {
      remove_field => [ "name", "version", "isoTimeFormat" ]
      rename => { "type" => "offense_type" }
    }
    mutate {
      replace => { "type" => "qradar_json" }
    }
    date {
      match => [ "start_time", "ISO8601" ]
      target => "@timestamp"
    }
  }
}
Output:
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "qradar-%{+YYYY.MM.dd}"
  }
}
When starting Logstash with this configuration I get the following error:
[nioEventLoopGroup-2-1] jsonlines - JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
When looking into the event itself I see:
"message" => "<01>- hostname {\"name\":\"Elastic Data\",\"version\":\"1.0\",\"isoTimeFormat\":\"yyyy-MM-dd'T'HH:mm:ss.SSSZ\",\"type\":\"Event\",\"category\":\"APISuccess\",\"protocolID\":\"255\",\"sev\":\"3\",\...
So it seems the problem is the "<01>- hostname " prefix at the beginning of the message (if I'm reading this correctly), which looks like a syslog-style header rather than part of the JSON. In Elasticsearch none of the key/value pairs are extracted into separate fields; everything ends up in the message field.
Could anyone advise on how to fix this? Should I try to strip the problematic characters in the filter, or could the codec on the input be the problem?
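In case it helps, this is roughly what I had in mind for stripping the prefix (untested sketch, assuming I switch the input codec to "line" and then use mutate gsub plus the json filter to parse the payload):

input {
  tcp {
    port => 5141
    codec => "line"   # treat each line as plain text instead of expecting pure JSON
  }
}

filter {
  # strip everything before the first "{", i.e. the "<01>- hostname " prefix
  mutate {
    gsub => [ "message", "^[^{]*", "" ]
  }
  # parse the remaining JSON payload into top-level fields
  json {
    source => "message"
  }
}

My existing mutate/date logic would then run after the json filter. I'm not sure the gsub pattern is robust for every event QRadar sends, so I'd appreciate any pointers on whether this is the right direction.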