Dear Community,
I need your help again. Here is the situation:
I've downloaded the default Logstash template with:
curl -XGET '192.168.100.92:9200/_template/logstash?pretty' > logstash-template.json
and updated it to the following:
{
  "index_patterns" : [
    "logstash-*"
  ],
  "settings" : {
    "number_of_shards" : 1,
    "number_of_replicas" : 0,
    "index" : {
      "refresh_interval" : "5s"
    }
  },
  "mappings" : {
    "_default_" : {
      "dynamic_templates" : [
        {
          "message_field" : {
            "path_match" : "message",
            "match_mapping_type" : "string",
            "mapping" : {
              "type" : "text",
              "norms" : false
            }
          }
        },
        {
          "string_fields" : {
            "match" : "*",
            "match_mapping_type" : "string",
            "mapping" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        }
      ],
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "keyword"
        },
        "geoip" : {
          "dynamic" : true,
          "properties" : {
            "ip" : {
              "type" : "ip"
            },
            "location" : {
              "type" : "geo_point"
            },
            "latitude" : {
              "type" : "half_float"
            },
            "longitude" : {
              "type" : "half_float"
            }
          }
        }
      }
    }
  },
  "aliases" : { }
}
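As a side note, the edited file can be checked for JSON syntax errors before uploading, e.g. with jq (assuming jq is installed on the server):

jq . logstash-template.json

This either pretty-prints the parsed template or fails with a parse error.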
Specifically, the only settings I added are number_of_shards and number_of_replicas. I then stopped Logstash, uploaded the template with the following command (run directly on the server), and started Logstash again:
curl -XPUT -H 'Content-Type: application/json' '192.168.100.92:9200/_template/logstash' -d "@logstash-template.json"
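On success, the PUT should return {"acknowledged":true}, and the stored template can be fetched again to confirm that Elasticsearch accepted it (same endpoint as above):

curl -XGET '192.168.100.92:9200/_template/logstash?pretty'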
Since then, no new index has been created. A new index should have been created, because the source log files are delivered via syslog every few seconds, so new log lines definitely exist.
After this I started Logstash in debug mode and tried to figure out why no index is created. The last index is from 2019-01-19, while the new log entries are from today.
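For reference, the existing Logstash indices can be listed with the _cat API and an index pattern:

curl -XGET '192.168.100.92:9200/_cat/indices/logstash-*?v'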
In the debug log file I can see that Logstash reads those source log files:
[2019-01-21T15:31:18,871][DEBUG][logstash.inputs.file ] Received line {:path=>"/LOGS/ASA/10.0.99.254.log", :text=>"2019-01-21T15:26:00+01:00 ASA-IX : %ASA-6-302015: Built outbound UDP connection 64916574 for outside:xxx.xxx.xxx.xxx/xxx (xxx.xxx.xxx.xxx/xxx) to inside:xxx.xxx.xxx.xxx/xxx (xxx.xxx.xxx.xxx/xxxxx)"}
[2019-01-21T15:31:18,871][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"syslog_server_domain", "value"=>["number1.at"]}
[2019-01-21T15:31:18,861][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding tag {"tag"=>"number1"}
OR
[2019-01-21T15:31:42,778][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/LOGS/192.168.99.254.log", "message"=>"2019-01-21T15:25:31+01:00 ASA-VIE : %ASA-6-302014: Teardown TCP connection 935611524 for DMZ:xxx.xxx.xxx.xxx/xxx to inside:xxx.xxx.xxx.xxx/xxx duration 0:00:00 bytes 376 TCP FINs from DMZ", "host"=>"LOG10", "type"=>"number2", "@version"=>"1", "@timestamp"=>2019-01-21T14:31:42.017Z}}
[2019-01-21T15:31:42,778][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/LOGS/192.168.99.254.log", "message"=>"2019-01-21T15:25:31+01:00 ASA-VIE : %ASA-6-302013: Built inbound TCP connection 935611525 for outside:xxx.xxx.xxx.xxx/xxx (xxx.xxx.xxx.xxx/xxx) to DMZ:xxx.xxx.xxx.xxx/xxx/443 (xxx.xxx.xxx.xxx/xxx/443)", "host"=>"LOG10", "type"=>"number2", "@version"=>"1", "@timestamp"=>2019-01-21T14:31:42.017Z}}
[2019-01-21T15:31:42,778][DEBUG][logstash.filters.grok ] Event now: {:event=>#<LogStash::Event:0x602398de>}
[2019-01-21T15:31:42,779][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::GeoIP: adding value to field {"field"=>"[geoip][coordinates]", "value"=>["%{[geoip][longitude]}", "%{[geoip][latitude]}"]}
[2019-01-21T15:31:42,779][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::GeoIP: adding value to field {"field"=>"[geoip][coordinates]", "value"=>["%{[geoip][longitude]}", "%{[geoip][latitude]}"]}
[2019-01-21T15:31:42,779][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding tag {"tag"=>"cisco-number2"}
So it seems that my Logstash config still works (why wouldn't it?) and the lines are being processed.
But no index is created. Any ideas what else I can try or verify?
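One idea I had myself, but have not tried yet: temporarily add a stdout output next to the elasticsearch output, to see whether the events actually reach the output stage. A rough sketch; the elasticsearch settings here are only my assumption of what the output section looks like, based on the IP used above:

output {
  # assumption: elasticsearch output pointing at the same node as the curl calls above
  elasticsearch {
    hosts => ["192.168.100.92:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
  # temporary: print every event, to check whether it reaches the output stage at all
  stdout {
    codec => rubydebug
  }
}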
Regards
Wilhelm