Parsing Wazuh archives.json file with Logstash

Hello Team,

The json codec may have been discussed here before; I tried googling but did not find a relevant solution, so I am posting my query to get direct assistance from the experts.

I have a file alerts.json and it gets parsed smoothly with codec => json. [Note: I have an index template for wazuh-alerts-*.]

I then wanted to parse another JSON file, archives.json, which is generated from Wazuh events.
I am parsing archives.json with codec => json as well, and I use [type] to segregate alerts and archives so that they go to different indices.
But the configuration below gives me a JSON parsing error: "Json::ParserError: Unexpected character ('\' (code 92)): was expecting double-quote to start field name"

What am I missing? Is it related to the JSON file, the index template/document type, or something else? I am new to Logstash parsing.

# Wazuh - Logstash configuration file
## Local Wazuh Manager - JSON file input

input {
  file {
    type => "wazuh-alerts"
    path => "/var/ossec/logs/alerts/alerts.json"
    codec => "json"
  }
  file {
    type => "windows-events"
    path => "/var/ossec/logs/archives/archives.json"
    codec => "json"
  }
}

filter {
  # Copy the source IP (when present) into a common field for geoip
  if [data][srcip] {
    mutate {
      add_field => [ "@src_ip", "%{[data][srcip]}" ]
    }
  }
  if [data][aws][sourceIPAddress] {
    mutate {
      add_field => [ "@src_ip", "%{[data][aws][sourceIPAddress]}" ]
    }
  }
}

filter {
  geoip {
    source => "@src_ip"
    target => "GeoLocation"
    fields => ["city_name", "country_name", "region_name", "location"]
  }
  date {
    match => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "timestamp", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip", "host" ]
  }
}

output {
  # Route events to a different index depending on the input they came from
  if [type] == "wazuh-alerts" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "wazuh-alerts-3.x-%{+YYYY.MM.dd}"
      document_type => "wazuh"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "windows-events-%{+YYYY.MM.dd}"
      document_type => "wazuh"
    }
  }
}

Yes, it is saying the file does not contain valid JSON.
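
Character code 92 is a backslash, so the codec is hitting a stray \ where it expects a quoted field name. A quick way to locate the offending lines is something like the sketch below (an illustration only, assuming archives.json is newline-delimited JSON with one event per line; the path is taken from the config above):

import json

# Report every line of archives.json that does not parse as JSON,
# along with the parser error and a snippet around the failing column.
with open("/var/ossec/logs/archives/archives.json", "r", errors="replace") as f:
    for lineno, line in enumerate(f, start=1):
        line = line.strip()
        if not line:
            continue
        try:
            json.loads(line)
        except json.JSONDecodeError as e:
            print(f"line {lineno}: {e}")
            snippet = line[max(0, e.colno - 20):e.colno + 20]
            print(f"  text around column {e.colno}: {snippet!r}")

Whatever lines it prints are the ones the Logstash json codec is tripping over.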

Thanks Badger. I created a custom index template containing index patterns for both wazuh-alerts and windows-events, based on the Wazuh index template.json.

After that, Logstash was able to parse it.

Not sure how, but the logs are now parsed and visualised in Kibana under the indices I created.
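
For anyone else landing here: the relevant detail seems to be that a single index template can match several index patterns. A minimal sketch of such a template, pushed with plain Python, is below; the template name "wazuh-custom" and the settings are illustrative only, and in practice you would copy the full mappings from the Wazuh-provided template.json rather than omit them as done here:

import json
import urllib.request

# Illustrative legacy index template that matches both index name patterns
# used in the Logstash output above.
template = {
    "index_patterns": ["wazuh-alerts-3.x-*", "windows-events-*"],
    "settings": {"index.refresh_interval": "5s"},
    # mappings omitted in this sketch; take them from the Wazuh template.json
}

req = urllib.request.Request(
    url="http://localhost:9200/_template/wazuh-custom",  # example template name
    data=json.dumps(template).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())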
