Index does not appear in Kibana

Hi Community,
I'm a total newbie to ELK and so far I'm learning and loving it. I need some help, as my index is not appearing in Kibana's index management area. For your information, I'm using Elastic Cloud and I'm able to successfully start Logstash from the server I installed it on.

Another thing to note is that, using the same logstash.conf:

  1. if I run it locally [localhost], it works fine and I can discover the index
  2. if I run it against the cloud [elasticcloudurl:port], it is able to start but the index does not appear

Given this, I feel that my config should be fine; the only difference for me now is the host.
Am I missing something here?

Has anyone faced the same challenge? Any thoughts or suggestions, please.

My logstash.conf looks like this:
input {
  file {
    type => "testdemo"
    path => "C:/DIR/CSV/*csv"
    start_position => "beginning"
  }
}

filter {
  if [type] == "testdemo" {
    csv {
      separator => ";"
      columns => [ "FirstName", "LastName" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://someURLhere.elastic-cloud.com:port/"]
    index => "index001"
    user => "elastic"
    password => "somepasswordhere"
  }
}

BR,
E.G

  1. I checked the indices --> :9243/_cat/indices

  2. Kibana

  3. Here is the startup log in the console:
    PS C:\Program Files\Logstash\logstash-7.12.1\bin> .\logstash -f "C:\Program Files\Logstash\logstash-7.12.1\config\index001.conf"
    "Using bundled JDK: ""
    OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
    Sending Logstash logs to C:/Program Files/Logstash/logstash-7.12.1/logs which is now configured via log4j2.properties
    [2021-05-15T11:26:44,297][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\Logstash\logstash-7.12.1\config\log4j2.properties
    [2021-05-15T11:26:44,315][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.12.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [mswin32-x86_64]"}
    [2021-05-15T11:26:44,578][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2021-05-15T11:26:45,619][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
    [2021-05-15T11:26:46,953][INFO ][org.reflections.Reflections] Reflections took 47 ms to scan 1 urls, producing 23 keys and 47 values
    [2021-05-15T11:26:48,186][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>, :added=>[https://elastic:xxxxxx@:9243/]}}
    [2021-05-15T11:26:48,665][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@:9243/"}
    [2021-05-15T11:26:48,835][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
    [2021-05-15T11:26:48,835][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
    [2021-05-15T11:26:48,882][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://:9243"]}
    [2021-05-15T11:26:48,966][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
    [2021-05-15T11:26:49,004][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>750, "pipeline.sources"=>["C:/Program Files/Logstash/logstash-7.12.1/config/im_whv_index.conf"], :thread=>"#<Thread:0x41c7561 run>"}
    [2021-05-15T11:26:49,035][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
    [2021-05-15T11:26:49,089][INFO ][logstash.outputs.elasticsearch][main] Installing elasticsearch template to _template/logstash
    [2021-05-15T11:26:50,007][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.0}
    [2021-05-15T11:26:50,470][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Program Files/Logstash/logstash-7.12.1/data/plugins/inputs/file/.sincedb_c9dad311c7124d88ca07788f71bc310f", :path=>["C:demofile.csv"]}
    [2021-05-15T11:26:50,493][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
    [2021-05-15T11:26:50,572][INFO ][filewatch.observingtail ][main][f829a8d001328e8d5c8b09ba4b478e2803116b87d772a30bc3a82b8c883ddbd3] START, creating Discoverer, Watch with file and sincedb collections
    [2021-05-15T11:26:50,572][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
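As a side note, the index check in step 1 can also be run from the command line against the cluster's cat API (a sketch only, reusing the placeholder endpoint and credentials from the config above; substitute your real Elastic Cloud URL and password):

    # Lists all indices on the cluster; index001 should appear once documents are ingested
    curl -u elastic:somepasswordhere "https://someURLhere.elastic-cloud.com:9243/_cat/indices?v"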

Hi @Bugsbee

Most likely, since you already loaded the file with the localhost instance, Logstash will not read the file again, because Logstash keeps track of which files have been read.

You will need to clean out the Logstash data registry or set the sincedb_path to null. It looks like you are on Windows; put it in the file input block.

Note the exact syntax for Windows:

sincedb_path => "NUL"

See here to read about it.
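Putting that together, the file input block from the config above would look like this (a sketch; paths and values are taken from the original config):

    input {
      file {
        type => "testdemo"
        path => "C:/DIR/CSV/*csv"
        start_position => "beginning"
        # On Windows, "NUL" discards the sincedb, so the file is re-read on every run
        sincedb_path => "NUL"
      }
    }

On Linux or macOS the equivalent value would be "/dev/null".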

That seems to do the magic! Now I can see the index in _cat/indices and in the index pattern. Thanks, mate.

However, when I run it I'm still getting the error below.
I tried adding max_open_files => 102400 to see if it would help, but the issue remained.
Any thoughts on this?

Logs
[2021-05-15T20:53:01,781][ERROR][logstash.codecs.plain ][main][7089847dc0f1b38a0850c69febd0ffedf49a972c775015edccd36a7e3b71d3e3] IdentityMapCodec has reached 100% capacity {:current_size=>20000, :upper_limit=>20000}
warning: thread "[main]<file" terminated with exception (report_on_exception is true):
LogStash::Codecs::IdentityMapCodec::IdentityMapUpperLimitException: LogStash::Codecs::IdentityMapCodec::IdentityMapUpperLimitException
visit at C:/Program Files/Logstash/logstash-7.12.1/vendor/bundle/jruby/2.5.0/gems/logstash-codec-multiline-3.0.10/lib/logstash/codecs/identity_map_codec.rb:37

I would open a separate thread with that message in the subject.

The more precise the subject, the better the response you will get; mention Logstash as well.

I'm not familiar with that error. Did you change any other settings in Logstash? That's really unusual if you're just reading one file.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.