Docker Logstash is not processing input

I have been running Elasticsearch, Logstash, and Kibana from a single all-in-one container, and I am now trying the official images from elastic.co.

So I am trying this Dockerfile:

```dockerfile
FROM docker.elastic.co/logstash/logstash:6.0.0
COPY ./config/logstash/conf.d /usr/share/logstash/pipelines/
COPY ./config/logstash/patterns /usr/share/logstash/patterns_extra

#RUN cd /usr/share/logstash/ && bin/logstash-plugin install logstash-input-elasticsearch
```

The input config I am loading there is called `09-appname.conf`, with this content:

```
input {
  file {
    path => "/tmp/logs/application*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    #codec => plain {
    #  charset => "ISO-8859-1"
    #}
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{YEAR}"
      negate => true
      what => "previous"
      charset => "ISO-8859-1"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} \[%{LOGLEVEL:log-level}\] \[(?<app>%{WORD}(\.%{WORD})?)\] %{GREEDYDATA:message}" }
    patterns_dir => ["/usr/share/logstash/patterns", "/usr/share/logstash/patterns_extra"]
  }
  date {
    match => [ "timestamp", "yyyy/MM/dd HH:mm:ss" ]
    #match => [ "timestamp", "yyyy/MM/dd HH:mm:ss", "ISO8601" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    #protocol => http
    index => "myappindex"
  }
  stdout { codec => rubydebug }
}
```
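For reference, the `^%{YEAR}` anchor in the multiline codec boils down to "the line starts with a (roughly four-digit) year", and with `negate => true` every non-matching line is folded into the previous event. A quick shell sketch of that grouping logic, with made-up sample lines:

```shell
# Made-up sample lines: only lines matching ^%{YEAR} (roughly
# ^[0-9]{4}) start a new event; the indented stack-trace line is
# folded into the previous event by the multiline codec.
printf '%s\n' \
  '2017/12/03 10:00:00 [INFO] [app] started' \
  '  at some.Stack.trace(line 1)' \
  '2017/12/03 10:00:01 [ERROR] [app.db] connection failed' \
  | grep -cE '^[0-9]{4}'
# prints 2 -- two event-starting lines
```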

The above works fine with my previous stack. When I `docker cp` the `application.log` files into `/tmp/logs` in the Logstash container, I would expect the file input to pick them up and turn them into requests to Elasticsearch. I tried both `/usr/share/logstash/pipelines/` and `/usr/share/logstash/config`, and I have confirmed the files are present in those folders (from the documentation it looks like version 6 treats pipelines as a concept separate from the rest of the config). Still, Logstash does next to nothing. After it starts:

> logstash_1       | [2017-12-03T23:56:59,203][INFO ][logstash.agent           ] Pipelines running {:count=>2, :pipelines=>[".monitoring-logstash", "main"]}
> logstash_1       | [2017-12-03T23:56:59,205][INFO ][logstash.inputs.metrics  ] Monitoring License OK

I don't see anything else in the logs, and the index is never created in ES. I have tested my Kibana and ES setup with the Shakespeare sample data from the tutorial and it works. Any suggestions on what I am missing?


Also note that I am trying to install the logstash-input-elasticsearch plugin, but when I do, my Docker image fails to build:

```
Step 5/5 : RUN cd /usr/share/logstash/ && bin/logstash-plugin install logstash-input-elasticsearch
 ---> Running in 25c0fcc24880
Validating logstash-input-elasticsearch
Unable to download data from https://rubygems.org - SocketError: Failed to open TCP connection to rubygems.org:443 (initialize: name or service not known) (https://rubygems.org/latest_specs.4.8.gz)
ERROR: Installation aborted, verification failed for logstash-input-elasticsearch
ERROR: Service 'logstash' failed to build: The command '/bin/sh -c cd /usr/share/logstash/ && bin/logstash-plugin install logstash-input-elasticsearch' returned a non-zero code: 1
```
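For anyone hitting the same build error: this is a DNS/network failure during `docker build`, not a problem with the plugin itself. Two possible workarounds, assuming the Docker host itself can reach rubygems.org (the path and version in the offline-pack example are illustrative):

```shell
# Option 1: let the build use the host's network stack (and its DNS):
docker build --network=host .

# Option 2: on a machine with access, bundle the plugin into an
# offline pack, then install it in the image with no network needed:
bin/logstash-plugin prepare-offline-pack logstash-input-elasticsearch
bin/logstash-plugin install file:///path/to/logstash-offline-plugins-6.0.0.zip
```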

Have you double-checked that the files really are accessible within the container? Stepping into the container with `docker exec` and running `ls /tmp/logs/application*.log` would tell you that.
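For example (the container name is illustrative):

```shell
# Verify the log files landed where the file input is watching:
docker exec -it logstash_1 ls -l /tmp/logs/
# And verify the pipeline files are where Logstash actually reads them:
docker exec -it logstash_1 ls -l /usr/share/logstash/pipeline/
```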

Increasing Logstash's logging verbosity should both cause the current configuration to be logged (letting you verify that you're running with the expected configuration) and give you more clues about what the file input is doing.
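A couple of ways to do that; both flags are standard Logstash options, and the container name is illustrative:

```shell
# Parse the pipeline config and exit without starting the pipeline,
# so syntax errors are reported instead of failing silently:
docker exec -it logstash_1 bin/logstash --config.test_and_exit \
  -f /usr/share/logstash/pipeline/09-appname.conf

# Or restart Logstash with debug logging to see which paths the
# file input is watching and why files might be skipped:
bin/logstash --log.level debug
```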

I was loading the pipelines in the wrong place. It is not `/usr/share/logstash/pipelines/` but `/usr/share/logstash/pipeline/` :confused:
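So the fix is just the COPY destination in the Dockerfile:

```dockerfile
# Note the singular "pipeline" -- the stock image reads pipeline
# definitions from /usr/share/logstash/pipeline/
COPY ./config/logstash/conf.d /usr/share/logstash/pipeline/
```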

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.