Logstash exits after starting with no error messages


#1

I'm seeing behavior very similar to what was described in this post. However, I'm not using Docker; I'm using the basic .tar download and starting Logstash from the command line.

After a mixed experience with the apt package on Ubuntu 16.04, I uninstalled it, downloaded the tar.gz version, and am now running Logstash from my home directory.

For some reason, it exits shortly after connecting successfully instead of staying up...

$ ./bin/logstash
Sending Logstash's logs to /home/cac/logstash/logs which is now configured via log4j2.properties
[2018-06-21T21:04:33,614][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-06-21T21:04:34,785][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.3.0"}
[2018-06-21T21:04:42,036][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-06-21T21:04:42,948][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.31.16.135:9200/]}}
[2018-06-21T21:04:42,966][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.31.16.135:9200/, :path=>"/"}
[2018-06-21T21:04:43,323][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.31.16.135:9200/"}
[2018-06-21T21:04:43,433][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-06-21T21:04:43,443][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-06-21T21:04:43,489][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//172.31.16.135:9200"]}
[2018-06-21T21:04:44,317][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/home/cac/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-06-21T21:04:44,528][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x73608a56 run>"}
[2018-06-21T21:04:44,653][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-06-21T21:04:45,268][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9601}
[2018-06-21T21:04:45,512][INFO ][logstash.pipeline        ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x73608a56 run>"}
$

$LOGSTASH_HOME/config/logstash.yml

path.config: ~/logstash/config/conf.d
http.host: "172.31.25.60"
http.port: 9600-9700

$LOGSTASH_HOME/config/conf.d

~/logstash/config/conf.d$ ls -la
total 12
drwxr-xr-x 2 root root 4096 Jun 21 20:52 .
drwxrwxr-x 3 cac  cac  4096 Jun 21 21:21 ..
-rw-r--r-- 1 root root 1799 Jun 21 20:38 nginx.conf

$LOGSTASH_HOME/config/pipelines.yml

# empty

$LOGSTASH_HOME/config/conf.d/nginx.conf

filter {
  if [fileset][module] == "nginx" {
     if [fileset][name] == "access" {
       grok {
         match => { "message" => ["%{IPORHOST:[nginx][access][remote_ip]} - %{DATA:[nginx][access][user_name]} \[%{HTTPDATE:[nginx][access][time]}\] \"%{WORD:[nginx][access][method]} %{DATA:[nginx][access][url]} HTTP/%{NUMBER:[nginx][access][http_version]}\" %{NUMBER:[nginx][access][response_code]} %{NUMBER:[nginx][access][body_sent][bytes]} \"%{DATA:[nginx][access][referrer]}\" \"%{DATA:[nginx][access][agent]}\""] }
         remove_field => "message"
       }
       mutate {
         add_field => { "read_timestamp" => "%{@timestamp}" }
       }
       date {
         match => [ "[nginx][access][time]", "dd/MMM/YYYY:H:m:s Z" ]
         remove_field => "[nginx][access][time]"
       }
       useragent {
         source => "[nginx][access][agent]"
         target => "[nginx][access][user_agent]"
         remove_field => "[nginx][access][agent]"
       }
       geoip {
         source => "[nginx][access][remote_ip]"
         target => "[nginx][access][geoip]"
       }
    }
    else if [fileset][name] == "error" {
      grok {
        match => { "message" => ["%{DATA:[nginx][error][time]} \[%{DATA:[nginx][error][level]}\] %{NUMBER:[nginx][error][pid]}#%{NUMBER:[nginx][error][tid]}: (\*%{NUMBER:[nginx][error][connection_id]} )?%{GREEDYDATA:[nginx][error][message]}"] }
        remove_field => "message"
      }
      mutate {
        rename => { "@timestamp" => "read_timestamp" }
      }
      date {
        match => [ "[nginx][error][time]", "YYYY/MM/dd H:m:s" ]
        remove_field => "[nginx][error][time]"
      }
    }
  }
}
output {
  elasticsearch {
    hosts => ["172.31.16.135:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

Any help is appreciated, thank you.


#2

I do not see an input defined in your configuration.


#3

The apt version just ran with the default configs in the past. Are you saying that I need to define an input plugin for it to run?


(Magnus Bäck) #4

are you saying that I need to define an input plugin for it to run?

Yes. Unless you have at least one input plugin, Logstash won't have anything to do, so the pipeline terminates immediately.
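
For reference, the index pattern in the original output section (`%{[@metadata][beat]}-...`) and the `[fileset]` fields in the filter suggest the events are expected to come from Filebeat, so a minimal input block might look like the sketch below. The port is an assumption (5044 is the conventional Beats port), not something taken from the original poster's setup:

```
# Hypothetical input block for conf.d/nginx.conf; assumes Filebeat
# ships to this Logstash instance on the default Beats port.
input {
  beats {
    port => 5044
  }
}
```

With at least one input defined, the main pipeline has a source to read from and should stay running instead of terminating right after startup.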


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.