Logstash will not create an index pattern in Elasticsearch

I am a complete beginner with the Elastic Stack. I'm trying to create a very simple setup with Elasticsearch, Logstash, and Kibana on my local machine. All of them are installed and able to run; I'm using version 7.9.0 for the Elastic Stack components and JDK 14.0.2.

I have a simple .conf file for Logstash that just reads in some data from a log file. I'm trying to create an index pattern in Kibana, but no logstash index shows up no matter what I do.

Here are the contents of my config file:

    input {
        file {
            path => "C:\elk\logs\apache-daily-access.log"
            start_position => "beginning"
            sincedb_path => "NUL"
        }
    }

    filter {
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }

        date {
            match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
        }

        geoip {
            source => "clientip"
        }
    }

    output {
        elasticsearch {
            hosts => ["localhost:9200"]
        }
    }

Does anyone have any idea what's going on? Here are my logs when I start up Logstash:

    PS C:\elk\logstash-7.9.0> .\bin\logstash.bat -f C:\elk\logstash-7.9.0\config\logstash-simple.conf
    Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option UseConcMarkSweepGC; support was removed in 14.0
    Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option CMSInitiatingOccupancyFraction; support was removed in 14.0
    Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option UseCMSInitiatingOccupancyOnly; support was removed in 14.0
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/C:/Users/SHANEW~1/AppData/Local/Temp/jruby-13068/jruby7169571404682332351jopenssl.jar) to field java.security.MessageDigest.provider
    WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    Sending Logstash logs to C:/elk/logstash-7.9.0/logs which is now configured via log4j2.properties
    [2020-08-31T12:59:08,014][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.0", "jruby.version"=>"jruby 9.2.12.0 (2.5.7) 2020-07-01 db01a49ba6 Java HotSpot(TM) 64-Bit Server VM 14.0.2+12-46 on 14.0.2+12-46 +indy +jit [mswin32-x86_64]"}
    [2020-08-31T12:59:09,540][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"C:/elk/logstash-7.9.0/data/queue"}
    [2020-08-31T12:59:09,600][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"C:/elk/logstash-7.9.0/data/dead_letter_queue"}
    [2020-08-31T12:59:09,720][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2020-08-31T12:59:10,150][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"00b56576-4d1f-495d-89c7-3a85a7ecd82b", :path=>"C:/elk/logstash-7.9.0/data/uuid"}
    [2020-08-31T12:59:17,013][INFO ][org.reflections.Reflections] Reflections took 1475 ms to scan 1 urls, producing 22 keys and 45 values
    [2020-08-31T12:59:28,327][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2020-08-31T12:59:29,846][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2020-08-31T12:59:30,397][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
    [2020-08-31T12:59:30,400][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
    [2020-08-31T12:59:30,416][WARN ][logstash.outputs.elasticsearch][main] DEPRECATION WARNING: Connecting to an OSS distribution of Elasticsearch using the default distribution of Logstash will stop working in Logstash 8.0.0. Please upgrade to the default distribution of Elasticsearch, or use the OSS distribution of Logstash {:url=>"http://localhost:9200/"}
    [2020-08-31T12:59:30,427][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
    [2020-08-31T12:59:30,435][INFO ][logstash.filters.geoip   ][main] Using geoip database {:path=>"C:/elk/logstash-7.9.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-City.mmdb"}
    [2020-08-31T12:59:30,633][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
    [2020-08-31T12:59:31,152][INFO ][logstash.outputs.elasticsearch][main] Index Lifecycle Management is set to 'auto', but will be disabled - Index Lifecycle management is not installed on your Elasticsearch cluster
    [2020-08-31T12:59:31,155][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
    [2020-08-31T12:59:31,421][INFO ][logstash.outputs.elasticsearch][main] Installing elasticsearch template to _template/logstash
    [2020-08-31T12:59:32,157][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/elk/logstash-7.9.0/config/logstash-simple.conf"], :thread=>"#<Thread:0x40c715ea run>"}
    [2020-08-31T12:59:48,040][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>15.88}
    [2020-08-31T12:59:50,340][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
    [2020-08-31T12:59:50,503][INFO ][filewatch.observingtail  ][main][188ea7efcfeacc1fab3eb70d530d6234cbbfdfda67732e3c12a1dfcbc2066aa7] START, creating Discoverer, Watch with file and sincedb collections
    [2020-08-31T12:59:50,999][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
    [2020-08-31T12:59:52,369][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Try setting an explicit index name in your elasticsearch output and change your output to that; a sketch of what I mean is below.
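For example, something along these lines, using test_index as a placeholder index name (pick whatever name you like):

    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "test_index"
        }
    }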

In Kibana -> Index Management, do you see the test_index index there?

If so, go to Index Patterns and create a new index pattern using that index name. If you don't have a template set up, you will need to create your own index pattern.
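If you prefer checking from the console, the cat indices API in Kibana Dev Tools lists the indices that currently exist (a standard Elasticsearch API, not specific to this setup):

    GET _cat/indices?v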

Do not use backslashes in the path option of a file input; they are interpreted as escapes. Use forward slashes.
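For example, the same file input with the path rewritten using forward slashes (everything else unchanged):

    input {
        file {
            path => "C:/elk/logs/apache-daily-access.log"
            start_position => "beginning"
            sincedb_path => "NUL"
        }
    }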

Wow, that fixed it. Thank you so much, I was going crazy.
