Logstash: simple config not producing output in Elasticsearch and Kibana

Hello all,

I am a beginner with ELK. I am trying to run Logstash through PowerShell. Whenever I execute the following command:

C:\siem> .\logstash-7.4.1\bin\logstash.bat -f .\myfirst.conf

the output in PowerShell is this:

Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/siem/logstash-7.4.1/logstash-core/lib/jars/jruby-complete-9.2.8.0.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to C:/siem/logstash-7.4.1/logs which is now configured via log4j2.properties
[2019-11-06T13:02:04,511][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-11-06T13:02:04,528][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.1"}
[2019-11-06T13:02:07,219][INFO ][org.reflections.Reflections] Reflections took 63 ms to scan 1 urls, producing 20 keys and 40 values
[2019-11-06T13:02:08,703][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-11-06T13:02:08,999][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-11-06T13:02:09,063][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-11-06T13:02:09,067][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-11-06T13:02:09,103][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-11-06T13:02:09,203][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-11-06T13:02:09,263][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-11-06T13:02:09,277][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0xbe34079 run>"}
[2019-11-06T13:02:09,325][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-11-06T13:02:10,318][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/siem/logstash-7.4.1/data/plugins/inputs/file/.sincedb_f5fa93b0623f6608d9a7bf96966c81e5", :path=>["C:\siem\logs_for_filebeat\logstash-tutorial-dataset.txt"]}
[2019-11-06T13:02:10,354][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-11-06T13:02:10,510][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-11-06T13:02:10,526][INFO ][filewatch.observingtail ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-11-06T13:02:11,540][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Now, if I go and check in Elasticsearch -> Index Management, I am unable to find the "myfirst" index, and it does not appear in Kibana either, even though I have specified it in the configuration file. The Logstash configuration file (myfirst.conf) is:

input {
  file {
    path => "C:\siem\logs_for_filebeat\logstash-tutorial-dataset.txt"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myfirst"
  }
  stdout { codec => rubydebug }
}

Versions of all three:

elasticsearch-7.4.1
kibana-7.4.1-windows-x86_64
logstash-7.4.1

I don't know where it is going wrong.

Could you please help me understand what is going wrong here?

Thanks in advance

Do not use backslash in the path option of a file input. Use forward slash.

Hello Badger,

As per your suggestion, I have changed the path to the following:

path => "C:/siem/logs_for_filebeat/logstash-tutorial-dataset.txt"

and I ran the command in PowerShell:

PS C:\siem> .\logstash-7.4.1\bin\logstash.bat -f .\myfirst.conf

The logs in PowerShell are as follows:

Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/siem/logstash-7.4.1/logstash-core/lib/jars/jruby-complete-9.2.8.0.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to C:/siem/logstash-7.4.1/logs which is now configured via log4j2.properties
[2019-11-06T15:58:29,213][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-11-06T15:58:29,245][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.1"}
[2019-11-06T15:58:32,252][INFO ][org.reflections.Reflections] Reflections took 51 ms to scan 1 urls, producing 20 keys and 40 values
[2019-11-06T15:58:34,197][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-11-06T15:58:34,500][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-11-06T15:58:34,563][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-11-06T15:58:34,572][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-11-06T15:58:34,618][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-11-06T15:58:34,793][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-11-06T15:58:34,822][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-11-06T15:58:34,847][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x154920b1 run>"}
[2019-11-06T15:58:34,906][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-11-06T15:58:36,239][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/siem/logstash-7.4.1/data/plugins/inputs/file/.sincedb_5be0d94968a6f971cbb160730d8a33e2", :path=>["C:/siem/logs_for_filebeat/logstash-tutorial-dataset.txt"]}
[2019-11-06T15:58:36,314][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-11-06T15:58:36,418][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-11-06T15:58:36,506][INFO ][filewatch.observingtail ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-11-06T15:58:38,036][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

When I checked in Elasticsearch -> Index Management, I still could not see the myfirst index.
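For reference: the file input keeps a record of how far it has read each file in a "sincedb" file, so lines that were already read are not re-read on restart. A minimal sketch of the same pipeline for testing (assuming the forward-slash path above; sincedb_path => "NUL" points the sincedb at Windows's null device, so the file is re-read from the beginning on every run):

```conf
input {
  file {
    # Forward slashes work on Windows and avoid backslash-escape problems
    path => "C:/siem/logs_for_filebeat/logstash-tutorial-dataset.txt"
    start_position => "beginning"
    # "NUL" (the Windows null device) disables sincedb persistence,
    # so already-read lines are processed again on each restart
    sincedb_path => "NUL"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myfirst"
  }
  stdout { codec => rubydebug }
}
```

With sincedb persistence disabled, every restart should print each line of the file to stdout via the rubydebug codec, which makes it easy to confirm whether events are being read at all before looking at the Elasticsearch side.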

Try enabling log.level trace, restarting, and then appending a line to your text file. The filewatch code should tell you when it outputs an event with trace logging enabled.
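As a concrete sketch, those steps might look like this on Windows (assuming this Logstash version's --log.level command-line flag and PowerShell's built-in Add-Content cmdlet; the paths are the ones used earlier in the thread):

```powershell
# Start Logstash with trace-level logging enabled
PS C:\siem> .\logstash-7.4.1\bin\logstash.bat -f .\myfirst.conf --log.level=trace

# In a second PowerShell window, append a line to the watched file
PS C:\siem> Add-Content C:\siem\logs_for_filebeat\logstash-tutorial-dataset.txt "test line"
```

With trace logging on, the filewatch messages in the console should show whether the file is being discovered and read.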

Hello Badger,

  1. log.level trace (I don't know how, but I will search the web and find out.)
  2. restarting (it can be done)
  3. appending a line to the text file (it can be done)
  4. filewatch code (I don't know how, but I will search the web and find out.)
  5. trace logging enabled (I don't know how, but I will search the web and find out.)

Sorry I am a beginner.

I will get back to you soon.

Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.