Hello all!
I'm running Windows 10 x64. First, I start bin/elasticsearch from cmd.
http://localhost:9200/_cat/indices returns:
green open .kibana_task_manager LSRA1gP5TCevDafKx51ecQ 1 0 2 0 53.6kb 53.6kb
green open .kibana_1 bfa6VzYmSby-BzJkFYcxoQ 1 0 4 1 23kb 23kb
while http://localhost:9200/_cat/health returns:
1560922509 05:35:09 elasticsearch green 1 1 2 2 0 0 0 0 - 100.0%
Then I run bin/logstash -f logstash.conf --verbose, which produces:
C:\l>.\bin\logstash -f logstash.conf --verbose
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/l/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/l/logs which is now configured via log4j2.properties
[2019-06-18T22:34:15,431][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-18T22:34:15,458][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-18T22:34:25,254][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>, :added=>[http://127.0.0.1:9200/]}}
[2019-06-18T22:34:25,671][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2019-06-18T22:34:26,285][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-18T22:34:26,299][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: thetype
event field won't be used to determine the document _type {:es_version=>7}
[2019-06-18T22:34:26,352][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//127.0.0.1"]}
[2019-06-18T22:34:26,375][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-18T22:34:26,925][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x5e3bbdb8 run>"}
[2019-06-18T22:34:26,970][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-18T22:34:29,218][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/l/data/plugins/inputs/file/.sincedb_0668632417a0cfd5001d478541fc9478", :path=>["C:\l\example.txt"]}
[2019-06-18T22:34:29,414][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-18T22:34:29,710][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
[2019-06-18T22:34:29,738][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-18T22:34:33,847][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
The content of example.txt is:
532452345
25234534
245234
logstash.conf contains:
input {
  file {
    path => "C:\l\example.txt"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{NUMBER:line_content}" }
  }
}

output {
  elasticsearch {
    index => "test"
  }
}
All other ELK settings are at their defaults.
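For what it's worth, here is a debug variant of the output section I could try (just a sketch, untested on my setup) — it keeps the elasticsearch output but also dumps every event to the console, which would show whether events are being read from the file at all:

output {
  # Print each event to the console in full, so it is visible
  # whether the file input picks up example.txt at all.
  stdout { codec => rubydebug }

  elasticsearch {
    index => "test"
  }
}

If nothing is printed either, the problem would be on the input side rather than in the elasticsearch output.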
After that, http://localhost:9200/_cat/indices still shows only:
green open .kibana_task_manager LSRA1gP5TCevDafKx51ecQ 1 0 2 0 53.6kb 53.6kb
green open .kibana_1 bfa6VzYmSby-BzJkFYcxoQ 1 0 4 1 23kb 23kb
What is going wrong? Why is the test index never created?