Happy new year everyone!
Hoping someone can shed some light on this — I have a weird issue I can't sort out with parsing a JSON file.
This is the source JSON file:
{
  "SHA256": "766be5c99ba674f985ce844add4bc5ec423e90811fbceer5ec84efa3cf1624f4",
  "source": "localhost",
  "Msg": "404 OK",
  "YaraRule": [
    "no_match"
  ],
  "URL": "http://127.0.0.1",
  "@timestamp": "2020-01-07T08:59:04",
  "raw_msg": "404.19 – Denied by filtering rule",
  "filename": "log.txt",
  "syntax": "text",
  "log_url": "http://127.0.0.1/log.txt",
  "MD5": "2c5cddf13ab55a1d4eca955dfa32d245",
  "expire": "0",
  "user": "user",
  "key": "22op3dfe",
  "size": 107
}
When I run this Logstash conf against it, the data is ingested, but each line of the file ends up as a separate doc in ES instead of one doc for the whole file.
input {
  file {
    path => "/opt/data/*"
    start_position => "beginning"
    codec => "json_lines"
    sincedb_path => "/opt/logreader.sincedb"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.136.144:9200"]
    index => "log-test-%{+YYYY.MM.dd}"
  }
}
So I whipped this up and ran it, and now nothing is ingested at all! Yet the Logstash logs show no errors:
input {
  file {
    path => "/opt/data/*"
    start_position => "beginning"
    codec => "json_lines"
    sincedb_path => "/opt/logreader.sincedb"
  }
}
filter {
  json {
    source => "message"
    target => "doc"
    add_field => [ "Encryption", "%{[string]}" ]
    add_field => [ "source", "%{[string]}" ]
    add_field => [ "msg", "%{[WORD]}" ]
    add_field => [ "YaraRule", "%{[WORD]}" ]
    add_field => [ "status", "%{[WORD][WORD]}" ]
    add_field => [ "url", "%{[URL]}" ]
    add_field => [ "timestamp", "%{[TIMESTAMP]}" ]
    add_field => [ "rawmsg", "%{[raw_msg]}" ]
    add_field => [ "filename", "%{[filename]}" ]
    add_field => [ "syntax", "%{[word]}" ]
    add_field => [ "log_url", "%{[URL]}" ]
    add_field => [ "MD5", "%{[MD5]}" ]
    add_field => [ "expire", "%{[num]}" ]
    add_field => [ "user", "%{[USER]}" ]
    add_field => [ "key", "%{[key]}" ]
    add_field => [ "size", "%{[num]}" ]
  }
}
output {
  elasticsearch {
    hosts => ["192.168.136.144:9200"]
    index => "log-test-%{+YYYY.MM.dd}"
  }
}
So I guess my question is: is this the right way, or is there an easier way to get the JSON file ingested as a single document rather than one doc per JSON line?
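The one thing I was thinking of trying next, but haven't tested: since the file input reads line by line, maybe swapping the `json_lines` codec for a `multiline` codec would stitch the pretty-printed object back into a single event before the `json` filter runs. The `pattern`/`negate`/`what` values and the `auto_flush_interval` below are my guesses, not something I've verified:

```
input {
  file {
    path => "/opt/data/*"
    start_position => "beginning"
    sincedb_path => "/opt/logreader.sincedb"
    # Join every line that does NOT start with "{" onto the previous line,
    # so the whole pretty-printed object becomes one event.
    codec => multiline {
      pattern => "^\{"
      negate => true
      what => "previous"
      # Flush the last (still-open) event after 1s of no new lines,
      # since nothing follows the final "}" in the file.
      auto_flush_interval => 1
    }
  }
}
filter {
  json {
    source => "message"
  }
}
```

(Output block same as above.) Would that be the sane approach here, or is there a cleaner codec for whole-file JSON?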
Thanks