Hi Luca,
Thanks for the quick reply.
I created an index template and a bootstrap index as follows:
PUT _template/ecs-file
{
  "order": 0,
  "index_patterns": [
    "ecs-file-*"
  ],
  "settings": {
    "index": {
      "lifecycle": {
        "name": "delete_after_7days_3primaries",
        "rollover_alias": "ecs-file-*"
      }
    }
  },
  "mappings": {},
  "aliases": {}
}
PUT ecs-file-000001
{
  "aliases": {
    "ecs-file": {
      "is_write_index": true
    }
  }
}
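If it helps, this is how the setup can be double-checked from the same console (the second request should show the "ecs-file" alias with "is_write_index": true):

```
GET _template/ecs-file

GET ecs-file-000001/_alias
```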
Cat indices:
GET _cat/indices?
Output:
green open ecs-file-000001
I have restarted Logstash and this is the logging I get:
[2020-04-12T03:28:01,100][INFO ][logstash.javapipeline ][testfile] Pipeline started {"pipeline.id"=>"testfile"}
[2020-04-12T03:28:01,281][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:alipress], :non_running_pipelines=>[]}
[2020-04-12T03:28:01,523][INFO ][filewatch.observingtail ][testfile] START, creating Discoverer, Watch with file and sincedb collections
[2020-04-12T03:28:02,318][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-04-12T03:30:01,069][WARN ][logstash.runner ] SIGTERM received. Shutting down.
[2020-04-12T03:30:01,250][INFO ][filewatch.observingtail ] QUIT - closing all files and shutting down.
[2020-04-12T03:30:01,667][INFO ][logstash.javapipeline ] Pipeline terminated {"pipeline.id"=>"testfile"}
[2020-04-12T03:30:02,602][INFO ][logstash.runner ] Logstash shut down.
[2020-04-12T03:30:42,664][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.2"}
[2020-04-12T03:30:46,651][INFO ][org.reflections.Reflections] Reflections took 149 ms to scan 1 urls, producing 20 keys and 40 values
[2020-04-12T03:30:49,188][INFO ][logstash.outputs.elasticsearch][testfile] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.23.135:9200/]}}
[2020-04-12T03:30:49,802][WARN ][logstash.outputs.elasticsearch][testfile] Restored connection to ES instance {:url=>"http://192.168.23.135:9200/"}
[2020-04-12T03:30:49,930][INFO ][logstash.outputs.elasticsearch][testfile] ES Output version determined {:es_version=>7}
[2020-04-12T03:30:49,948][WARN ][logstash.outputs.elasticsearch][testfile] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-04-12T03:30:50,017][INFO ][logstash.outputs.elasticsearch][testfile] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.23.135"]}
[2020-04-12T03:30:50,263][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][testfile] A gauge metric of an unknown type (org.jruby.RubyArray) has been created for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-04-12T03:30:50,319][INFO ][logstash.javapipeline ][testfile] Starting pipeline {:pipeline_id=>"testfile", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x31c7a556 run>"}
[2020-04-12T03:30:50,324][INFO ][logstash.outputs.elasticsearch][testfile] Using default mapping template
[2020-04-12T03:30:50,538][INFO ][logstash.outputs.elasticsearch][testfile] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-04-12T03:30:51,417][INFO ][logstash.inputs.file ][testfile] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_9ad389108014868079fae20d6a174d21", :path=>["/var/log/logstash/data/data*.json"]}
[2020-04-12T03:30:51,544][INFO ][logstash.javapipeline ][testfile] Pipeline started {"pipeline.id"=>"testfile"}
[2020-04-12T03:30:51,847][INFO ][filewatch.observingtail ][testfile] START, creating Discoverer, Watch with file and sincedb collections
[2020-04-12T03:30:51,876][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:alipress], :non_running_pipelines=>[]}
[2020-04-12T03:30:52,954][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
When I remove the index, the default logstash index is automatically created instead. It looks like Logstash is not sending the JSON file to the ecs-file index.
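Should the elasticsearch output be pointed at the write alias for this to work? Something like this sketch (the hosts value is the one from the log above; the index name "ecs-file" and the other options are my assumption):

```
output {
  elasticsearch {
    hosts => ["http://192.168.23.135:9200"]
    # write to the rollover alias, not a concrete index, so ILM can roll it over
    index => "ecs-file"
    # avoid installing the default logstash-* template
    manage_template => false
  }
}
```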