Hi,
I'm having some issues getting JSON into Elasticsearch.
I'm using Filebeat to ship the JSON data over TCP to Logstash. Below is a sample of the JSON. The JSON format is correct, it's just not feeding the data in: I can see what it's doing in the rubydebug output, but what I see in the debug output isn't what's going into ES.
{"classification.taxonomy": "abusive content", "raw": "MS4zMi4xMjguMC8xOCA7IFNCTDI4NjI3NQ==", "feed.accuracy": 100.0, "classification.type": "spam", "feed.provider": "Spamhaus", "feed.url": "https://www.spamhaus.org/", "feed.name": "Spamhaus Drop", "time.source": "2017-12-29T19:49:07+00:00", "time.observation": "2018-01-10T04:05:38+00:00", "extra": "{"blocklist": "SBL286275"}", "source.network": "1.32.128.0/18"}
{"classification.taxonomy": "abusive content", "raw": "NS44LjM3LjAvMjQgOyBTQkwyODQwNzg=", "feed.accuracy": 100.0, "classification.type": "spam", "feed.provider": "Spamhaus", "feed.url": "https://www.spamhaus.org/", "feed.name": "Spamhaus Drop", "time.source": "2017-12-29T19:49:07+00:00", "time.observation": "2018-01-10T04:05:38+00:00", "extra": "{"blocklist": "SBL284078"}", "source.network": "5.8.37.0/24"}
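One thing worth double-checking before it reaches Logstash: in the two sample lines above, the value of `extra` is wrapped in quotes but contains unescaped inner quotes (`"extra": "{"blocklist": ...}"`), which a strict JSON parser will reject. A quick stdlib sketch (the file path is hypothetical) to find any such lines in the feed file:

```python
import json

def find_bad_lines(path):
    """Return (line_number, error) for every line that is not valid JSON."""
    bad = []
    with open(path) as f:
        for lineno, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                json.loads(line)
            except json.JSONDecodeError as err:
                bad.append((lineno, str(err)))
    return bad
```

If this reports the sample lines as invalid, escaping the inner quotes (`"extra": "{\"blocklist\": ...}"`) or emitting `extra` as a nested object rather than a string would make them parse cleanly.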
input {
  beats {
    port => 9515
    codec => json
    type => "data"
  }
}
filter {
  if [type] == "data" {
    kv { }
  }
}
output {
  if [type] == "data" {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "idata-%{+YYYY.MM}"
    }
  }
}
Debug output
{
"feed.url" => "https://www.spamhaus.org/",
"feed.provider" => "Spamhaus",
"offset" => 7238348,
"time.observation" => "2018-01-27T01:24:43+00:00",
"input_type" => "log",
"raw" => "MjIzLjE3My4wLjAvMTYgOyBTQkwyMDQ5NTQ=",
"feed.name" => "Spamhaus Drop",
"source" => "/opt/file-output/spamhous.txt",
"source.network" => "223.173.0.0/16",
"type" => "data",
"tags" => [
[0] "beats_input_codec_json_applied"
],
"@timestamp" => 2018-01-31T04:40:40.344Z,
"time.source" => "2018-01-25T14:32:35+00:00",
"classification.type" => "spam",
"extra" => "{"blocklist": "SBL204954"}",
"@version" => "1",
"beat" => {
"name" => "blar",
"hostname" => "blar",
"version" => "5.4.1"
},
"host" => "blar",
"classification.taxonomy" => "abusive content",
"feed.accuracy" => 100.0
}
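Incidentally, the `raw` field in these events is just the original feed line, base64-encoded, so decoding it is a quick way to confirm an event matches the source file. A small stdlib sketch (not part of the original pipeline), using the `raw` value from the debug output above:

```python
import base64

def decode_raw(raw_value):
    """Decode the base64-encoded 'raw' field back into the original feed line."""
    return base64.b64decode(raw_value).decode("utf-8")

# 'raw' from the debug event above
print(decode_raw("MjIzLjE3My4wLjAvMTYgOyBTQkwyMDQ5NTQ="))
# → 223.173.0.0/16 ; SBL204954
```

The decoded line matches the event's `source.network` (223.173.0.0/16), so the data reaching Logstash looks consistent.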
Kibana
name     type     format  searchable  aggregatable  excluded  controls
_id      string
_index   string
_score   number
_source  _source
_type    string
So none of the fields have arrived, which means there is no time field either, so I can only create an index pattern with no time reliance.
In Cerebro the entire index is only 810 B, while the file it's importing from is over 100 MB.
Any help would be appreciated.
Thanks.