Logstash not creating indexes

Hi, please tell me why my new indexes are not displayed in Kibana.
logstash.json:

input {
 file {
    type => "pikautotesttc4"
    path => "C:/Users/bezzbtsev/Desktop/pik2/RTS-PIK/dms-selenium-tests/TestSelenium/bin/Debug/Logs/**/*.log*"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "runstatus4"
    path => "C:/Users/bezzbtsev/Desktop/pik2/RTS-PIK/dms-selenium-tests/TestSelenium/bin/Debug/runStatus.log"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "pikautotesttc12and5"
    path => "C:/Users/bezzbtsev/Desktop/pik/RTS-PIK/dms-selenium-tests/TestSelenium/bin/Debug/Logs/**/*.log*"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "runstatus12and5"
    path => "C:/Users/bezzbtsev/Desktop/pik/RTS-PIK/dms-selenium-tests/TestSelenium/bin/Debug/runStatus.log"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "pikautotesttc"
    path => "C:/BuildAgent/work/853f30880bfa0ff8/dms-selenium-tests/TestSelenium/bin/Debug/Logs/**/*.log*"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "runstatus"
    path => "C:/BuildAgent/work/853f30880bfa0ff8/dms-selenium-tests/TestSelenium/bin/Debug/runStatus.log"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "pikautotesttc12"
    path => "C:/BuildAgent2/work/853f30880bfa0ff8/dms-selenium-tests/TestSelenium/bin/Debug/Logs/**/*.log*"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
  file {
    type => "runstatus12"
    path => "C:/BuildAgent2/work/853f30880bfa0ff8/dms-selenium-tests/TestSelenium/bin/Debug/runStatus.log"
    mode => "tail"
    start_position => "beginning"
    codec => plain { charset => "Windows-1251" }
    sincedb_path => "nul"
  }
}

filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "MD5"
    key => "pik"
  }
  if [type] == "runstatus" or [type] == "runstatus12"
  {
    grok {
        match => {
        "message" => "%{DATESTAMP:date}\s+%{WORD:loglevel}\s+(\[\d+\])?\s+:\s*Сценарий - (?<scenario>.*?)(?=\;)\;\sссылка на контракт - (?<positionUrl>.*)(?=\;)\; попытка \((?<attempt>\d)\/5\) - (?<status>.*)(?=\;)\;\s?(?<screenshot>(.*)?)"
        }
      }
      date {
      match => ["date", "yy-MM-dd HH:mm:ss,SSS"]
      target => "@timestamp"
    }
  }
  if [type] == "pikautotesttc" or [type] == "pikautotesttc12"
  {
      if "URL:" in [message]
    {
      grok {
        match => {
        "message" => "%{DATESTAMP:logdate}\s+%{WORD:loglevel}\s+(\[\d+\])?\s+:\s*%{GREEDYDATA:msgbody}(?= URL: )?( URL: )%{GREEDYDATA:url}(?=\.)\.( User: )?%{GREEDYDATA:user}"
        }
      }
    }
    else
    {
     grok {
        match => {
          "message" => "%{DATESTAMP:logdate}\s+%{WORD:loglevel}\s+(\[\d+\])?\s+:\s*%{GREEDYDATA:msgbody}"
        }
      }
    }
    grok {
      match => { 
        "msgbody" => [
          "Test (?<status>[^&]*)",
          "Время выполнения (контракта|теста) \(первая попытка\): (?<duration>\d+.\d+)"
        ]
      }
      match => {
        "path" => "log\.?(?<attempt>\d)"
      }
      break_on_match => false
    }
    date {
      match => ["logdate", "yy-MM-dd HH:mm:ss,SSS"]
      target => "@timestamp"
    }
    mutate {
      convert => {
        "duration" => "float"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => true
    index => "logstash-%{type}"
    document_id => "%{[@metadata][fingerprint]}"
  }
}

For some reason only some of the indexes are created.

I do not see: "pikautotesttc4", "pikautotesttc12and5", "runstatus12and5", "runstatus4"
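One way to confirm which indices Elasticsearch has actually created is the `_cat/indices` API (the port and the `logstash_internal` user match what appears in the logs below; the password is whatever you configured):

```
curl -u logstash_internal:PASSWORD "http://localhost:9200/_cat/indices/logstash-*?v"
```

If an index is missing here, the documents never reached Elasticsearch, so the problem is on the Logstash side rather than in Kibana.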

Logstash logs:

[2021-06-17T18:14:34,412][INFO ][logstash.runner          ] Log4j configuration path used is: C:\ELK\logstash\config\log4j2.properties
[2021-06-17T18:14:34,443][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.2", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [mswin32-x86_64]"}
[2021-06-17T18:14:34,677][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-06-17T18:14:39,169][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2021-06-17T18:14:39,169][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2021-06-17T18:14:40,234][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-17T18:14:41,110][INFO ][org.reflections.Reflections] Reflections took 110 ms to scan 1 urls, producing 24 keys and 48 values 
[2021-06-17T18:14:42,626][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://localhost:9200"]}
[2021-06-17T18:14:42,719][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@localhost:9200/]}}
[2021-06-17T18:14:42,782][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@localhost:9200/"}
[2021-06-17T18:14:42,813][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (7.13.2) {:es_version=>7}
[2021-06-17T18:14:42,813][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-06-17T18:14:43,036][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)
[2021-06-17T18:14:43,036][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)
[2021-06-17T18:14:43,082][WARN ][logstash.javapipeline    ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2021-06-17T18:14:43,254][INFO ][logstash.javapipeline    ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x3ef26b9c run>"}
[2021-06-17T18:14:45,958][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>2.7}
[2021-06-17T18:14:46,037][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2021-06-17T18:14:47,519][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2021-06-17T18:14:47,551][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@localhost:9200/]}}
[2021-06-17T18:14:47,566][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@localhost:9200/"}
[2021-06-17T18:14:47,566][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.13.2) {:es_version=>7}
[2021-06-17T18:14:47,566][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-06-17T18:14:47,660][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-06-17T18:14:47,738][INFO ][logstash.filters.elasticsearch][main] New ElasticSearch filter client {:hosts=>["//localhost:9200"]}
[2021-06-17T18:14:48,100][ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<Manticore::ClientProtocolException: URI does not specify a valid host name: http:////localhost:9200/>, :backtrace=>["C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.0-java/lib/manticore/response.rb:37:in `block in initialize'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.0-java/lib/manticore/response.rb:79:in `call'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.0-java/lib/manticore/response.rb:274:in `call_once'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.0-java/lib/manticore/response.rb:158:in `code'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/http/manticore.rb:84:in `block in perform_request'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:262:in `perform_request'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/client.rb:131:in `perform_request'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-api-5.0.5/lib/elasticsearch/api/actions/ping.rb:20:in `ping'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.9.3/lib/logstash/filters/elasticsearch.rb:310:in `test_connection!'", "C:/ELK/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.9.3/lib/logstash/filters/elasticsearch.rb:117:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "C:/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", 
"C:/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in `register_plugins'", "C:/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:586:in `maybe_setup_out_plugins'", "C:/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:240:in `start_workers'", "C:/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in `run'", "C:/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in `block in start'"], "pipeline.sources"=>["C:/ELK/logstash/bin/logstash.json"], :thread=>"#<Thread:0x28ac354d@C:/ELK/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:54 run>"}
[2021-06-17T18:14:48,100][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2021-06-17T18:14:48,115][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2021-06-17T18:14:50,110][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[2021-06-17T18:14:50,648][INFO ][logstash.runner          ] Logstash shut down.

The directory paths to the logs are correct. Do I still need to grant a special role to the user, as was required previously, or not? Help, please.

Hi,

The host seems to be incorrect.
Replace hosts => "localhost:9200"
With hosts => "http://localhost:9200"
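In context, a minimal corrected output block (same index naming and document_id as in the config above; only the `hosts` value changes to include the scheme):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => true
    index => "logstash-%{type}"
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

Note that the pipeline error in the logs actually comes from `logstash-filter-elasticsearch`, so check every `hosts` setting in the running config, not only the output block.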

Cad.

It was indeed the output block. I also created a user (login and password) and granted it the necessary rights.
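For reference, the writer role described in Elastic's "Configuring security in Logstash" docs looks roughly like this (index pattern adjusted to the `logstash-%{type}` naming above; a sketch, not verified against this cluster):

```
POST /_security/role/logstash_writer
{
  "cluster": ["manage_index_templates", "monitor", "manage_ilm"],
  "indices": [
    {
      "names": ["logstash-*"],
      "privileges": ["write", "create", "create_index", "manage", "manage_ilm"]
    }
  ]
}
```

The user Logstash connects as (here `logstash_internal`) is then assigned this role.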
