I parsed a simple XML file. The first time, Logstash parsed the file and sent the data to Elasticsearch. Then I deleted the index and tried to parse the file again, but Logstash has not sent any data since I deleted the first index.
Here is my config file:
input {
  file {
    path => "/home/safaa/Documents/nessus/validate.xml"
    start_position => beginning
    codec => multiline {
      pattern => "^<\?xmldata .*\>"
      negate => true
      what => "previous"
    }
  }
}
filter {
  xml {
    store_xml => false
    source => "message"
    xpath => [
      "/xmldata/head1/id/text()", "id",
      "/xmldata/head1/date/text()", "date",
      "/xmldata/head1/key1/text()", "key1"
    ]
  }
  date {
    match => [ "date", "dd-MM-yyyy HH:mm:ss" ]
    timezone => "Europe/Amsterdam"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "logstash-xml"
    hosts => ["127.0.0.1:9200"]
    document_id => "%{[id]}"
    document_type => "xmlfiles"
  }
}
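If the goal is to re-test against the same file repeatedly, one common sketch is to stop the file input from persisting its read position at all. The `sincedb_path` line below is my addition and is not in the original config:

```
input {
  file {
    path => "/home/safaa/Documents/nessus/validate.xml"
    start_position => beginning
    # Assumption, not in the original config: /dev/null means the read
    # position is never persisted, so every run re-reads the whole file.
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^<\?xmldata .*\>"
      negate => true
      what => "previous"
    }
  }
}
```

With position tracking disabled, deleting and recreating the index should no longer leave the pipeline silent on restart.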
The XML file:
<xmldata>
  <head1>
    <key1>Value1</key1>
    <key2>Value2</key2>
    <id>0001</id>
    <date>01-01-2016 09:00:00</date>
  </head1>
  <head2>
    <key3>Value3</key3>
  </head2>
</xmldata>
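For reference, the three `xpath` expressions in the filter should resolve against this document as follows. A quick stand-alone check in Python (a sketch using the standard library's ElementTree, which does not support `text()`, so `findtext` emulates the filter's `/…/text()` expressions):

```python
import xml.etree.ElementTree as ET

# The document from the question, verbatim.
doc = """<xmldata>
<head1>
<key1>Value1</key1>
<key2>Value2</key2>
<id>0001</id>
<date>01-01-2016 09:00:00</date>
</head1>
<head2>
<key3>Value3</key3>
</head2>
</xmldata>"""

root = ET.fromstring(doc)  # root element is <xmldata>
# Mirrors /xmldata/head1/id/text(), .../date/text(), .../key1/text()
fields = {
    "id": root.findtext("./head1/id"),
    "date": root.findtext("./head1/date"),
    "key1": root.findtext("./head1/key1"),
}
print(fields)  # {'id': '0001', 'date': '01-01-2016 09:00:00', 'key1': 'Value1'}
```

So the XPath side of the filter looks consistent with the document; the missing events point at the input stage rather than the filter.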
The Logstash logs:
[2018-10-19T10:18:56,885][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2018-10-19T10:18:56,946][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-19T10:18:56,993][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-19T10:18:58,202][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_99a3e33a61dc7e95f10a1def06b56338", :path=>["/home/safaa/Documents/nessus/validate.xml"]}
[2018-10-19T10:18:58,264][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6b1d09b3 run>"}
[2018-10-19T10:18:58,426][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-10-19T10:18:58,553][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-10-19T10:18:59,062][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
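The sincedb line in the log above is relevant: the file input records how far it has read each file there, and deleting the Elasticsearch index does not reset that position, so an already-read file is never resent. One way to force a full re-read on the next start (a sketch — the path is copied from my log output, and Logstash should be stopped first):

```shell
# Remove the generated sincedb so the file input forgets its read
# position; path taken from the "No sincedb_path set" log line above.
rm -f /var/lib/logstash/plugins/inputs/file/.sincedb_99a3e33a61dc7e95f10a1def06b56338
```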
Elasticsearch logs:
[2018-10-19T10:18:00,632][INFO ][o.e.p.PluginsService ] [BqyunZ6] loaded module [x-pack-upgrade]
[2018-10-19T10:18:00,632][INFO ][o.e.p.PluginsService ] [BqyunZ6] loaded module [x-pack-watcher]
[2018-10-19T10:18:00,633][INFO ][o.e.p.PluginsService ] [BqyunZ6] no plugins loaded
[2018-10-19T10:18:05,410][INFO ][o.e.x.s.a.s.FileRolesStore] [BqyunZ6] parsed [0] roles from file [/etc/elasticsearch/roles.yml]
[2018-10-19T10:18:06,288][INFO ][o.e.x.m.j.p.l.CppLogMessageHandler] [controller/15813] [Main.cc@109] controller (64 bit): Version 6.4.2 (Build 660eefe6f2ea55) Copyright (c) 2018 Elasticsearch BV
[2018-10-19T10:18:06,787][DEBUG][o.e.a.ActionModule ] Using REST wrapper from plugin org.elasticsearch.xpack.security.Security
[2018-10-19T10:18:07,161][INFO ][o.e.d.DiscoveryModule ] [BqyunZ6] using discovery type [zen]
[2018-10-19T10:18:08,305][INFO ][o.e.n.Node ] [BqyunZ6] initialized
[2018-10-19T10:18:08,307][INFO ][o.e.n.Node ] [BqyunZ6] starting ...
[2018-10-19T10:18:08,541][INFO ][o.e.t.TransportService ] [BqyunZ6] publish_address {127.0.0.1:9300}, bound_addresses {[::1]:9300}, {127.0.0.1:9300}
[2018-10-19T10:18:11,686][INFO ][o.e.c.s.MasterService ] [BqyunZ6] zen-disco-elected-as-master ([0] nodes joined)[, ], reason: new_master {BqyunZ6}{BqyunZ6SQ-Sl1KTH_JCs1Q}{6FeROFyGSEya5ot1AYbhbg}{127.0.0.1}{127.0.0.1:9300}{ml.machine_memory=4125638656, xpack.installed=true, ml.max_open_jobs=20, ml.enabled=true}
[2018-10-19T10:18:11,695][INFO ][o.e.c.s.ClusterApplierService] [BqyunZ6] new_master {BqyunZ6}{BqyunZ6SQ-Sl1KTH_JCs1Q}{6FeROFyGSEya5ot1AYbhbg}{127.0.0.1}{127.0.0.1:9300}{ml.machine_memory=4125638656, xpack.installed=true, ml.max_open_jobs=20, ml.enabled=true}, reason: apply cluster state (from master [master {BqyunZ6}{BqyunZ6SQ-Sl1KTH_JCs1Q}{6FeROFyGSEya5ot1AYbhbg}{127.0.0.1}{127.0.0.1:9300}{ml.machine_memory=4125638656, xpack.installed=true, ml.max_open_jobs=20, ml.enabled=true} committed version [1] source [zen-disco-elected-as-master ([0] nodes joined)[, ]]])
[2018-10-19T10:18:11,784][INFO ][o.e.x.s.t.n.SecurityNetty4HttpServerTransport] [BqyunZ6] publish_address {127.0.0.1:9200}, bound_addresses {[::1]:9200}, {127.0.0.1:9200}
[2018-10-19T10:18:11,785][INFO ][o.e.n.Node ] [BqyunZ6] started
[2018-10-19T10:18:12,386][WARN ][o.e.x.s.a.s.m.NativeRoleMappingStore] [BqyunZ6] Failed to clear cache for realms [[]]
[2018-10-19T10:18:12,465][INFO ][o.e.l.LicenseService ] [BqyunZ6] license [f2f7d3aa-fdfc-408f-aae8-15d95982a157] mode [basic] - valid
[2018-10-19T10:18:12,481][INFO ][o.e.g.GatewayService ] [BqyunZ6] recovered [1] indices into cluster_state
[2018-10-19T10:18:12,759][INFO ][o.e.c.r.a.AllocationService] [BqyunZ6] Cluster health status changed from [RED] to [GREEN] (reason: [shards started [[.kibana][0]] ...]).
My Elastic Stack version: 6.4.2