Logstash does not send Apache logs to Elasticsearch

Oh, I am sorry, I can't help you with that, but here on a test VM I can confirm that if I start Logstash as a service it doesn't create any indices. However, if I run it manually it works.

No worries, I will keep in mind not to start it as a service. Thanks a lot for your time.

I do not think that's a solution. We need to find out why it's happening. I am working on my test VM. Once I find something I will update here.

OK, but I think you pointed me in the right direction. I will discard my Logstash Docker image and download an official one from here: https://www.elastic.co/guide/en/logstash/current/_pulling_the_image.html
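
For reference, pulling the official image is a one-liner (the 5.4.3 tag here is only an example; pick the version you need):

docker pull docker.elastic.co/logstash/logstash:5.4.3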

I hope your Logstash user can read the logs you are pointing to?

sudo setfacl -m u:logstash:r /whateveryourloglocation

Yes.

-rw-r--r-- 1 test6 test6 159K Jun 23 18:25 admin_access.log

and I am running Logstash as user 'test6'.

From what I understand from here:

https://www.elastic.co/guide/en/logstash/current/_pulling_the_image.html

I think the problem is that my Docker image is not properly configured to read the logstash.conf, and it acts as if no logstash.conf exists.
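
If that is the case, a minimal sketch of how the official image expects the pipeline config to be mounted (the host-side path and the image tag are assumptions; adjust to your setup):

docker run --rm \
  -v /path/to/logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro \
  docker.elastic.co/logstash/logstash:5.4.3

The image only runs what it finds under /usr/share/logstash/pipeline/, so a config that isn't mounted there won't be picked up.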

Yes.

-rw-r--r-- 1 test6 test6 159K Jun 23 18:25 admin_access.log

and I am running Logstash as user 'test6'.

That's necessary but not sufficient. The user must also have execute permissions on all directories leading up to the file.
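
For example, continuing the setfacl approach from earlier (the directory names are hypothetical; use the actual parents of your log file):

sudo setfacl -m u:logstash:x /home /home/test6
sudo setfacl -m u:logstash:r /home/test6/admin_access.log

The x entries let the logstash user traverse each directory on the way down; the r entry lets it read the file itself.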

There seems to be something wrong. I have just done a fresh install in a VM and no indices are being created, though the permissions are fine and the configuration is fine. Also, in logstash.yml I have changed log.level to debug, and still no logs are coming into /var/log/logstash/logstash-plain.log.

Is something wrong with the binary?

Hi again,
I used the Logstash Docker image from here: Docker (that Dockerfile with the plugin).

Also, I added execute permission for my user:
-rwxr--r-- 1 test6 test6 159K Jun 23 18:25 admin_access.log

The logstash.conf remained the same, and I am still getting nothing in ES.
Here are the logs from Logstash:

Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
12:50:34.323 [main] INFO logstash.setting.writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/var/lib/logstash/queue"}
12:50:34.337 [LogStash::Runner] INFO logstash.agent - No persistent UUID file found. Generating new UUID {:uuid=>"4dffb665-e644-4756-a472-8c847c751531", :path=>"/var/lib/logstash/uuid"}
12:50:34.793 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://es:9200/]}}
12:50:34.794 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://es:9200/, :path=>"/"}
12:50:34.847 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x539912fe URL:http://es:9200/>}
12:50:34.848 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
12:50:34.994 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"*", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"*", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"*", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"*", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"*", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"*", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"*", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"*", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, "latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", "doc_values"=>true}}}}}}}}
12:50:34.997 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>[#<URI::Generic:0x40b7c4e URL://es:9200>]}
12:50:35.049 [[main]-pipeline-manager] INFO logstash.filters.geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.1.1-java/vendor/GeoLite2-City.mmdb"}
12:50:35.165 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
12:50:35.332 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
12:50:35.368 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}

Note that no Filebeat configuration/image is used.
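
For context, a minimal sketch of the kind of pipeline being discussed here (the file path and the es:9200 host come from this thread; the filter section is a generic Apache example, not necessarily the exact config in use):

input {
  file {
    path => "/home/test6/admin_access.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => ["http://es:9200"]
  }
}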

Increase the log level to debug and look for "glob" and "discover" in the resulting log. I'm interested in whether the input files are seen at all.

When you say increase the log level, do you want us to do it in logstash.yml or while running it from the command line? Because if you see my earlier reply, I did change log.level in the config file but it wasn't showing anything.

When you say increase the log level, do you want us to do it in logstash.yml or while running it from the command line?

Either way.
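
For reference, both variants look like this (a sketch; paths assume a standard package install):

# in /etc/logstash/logstash.yml
log.level: debug

# or as a command-line flag
/usr/share/logstash/bin/logstash --log.level=debug -f /etc/logstash/conf.d/webserver.conf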

Because if you see my earlier reply, I did change log.level in the config file but it wasn't showing anything.

It didn't say anything about either "discover" or "glob"?

This is amazing. I have just done a fresh install, enabled log.level debug in logstash.yml, restarted Logstash, and bingo!! No logs are being generated in /var/log/logstash even though the path is set. I have also put my conf file in /etc/logstash/conf.d/webserver.conf, and yes, I do have Apache installed and it's generating data. The config file is correct. This is amazing... any pointers?

Whoa... by the way, I don't have any logstash.yml, only logstash.conf.
Of course, in my situation I still don't get any data into ES.
Here is the debug log:
(http://www.mediafire.com/file/yuyqzefpcedtvzu/logstash27_6_2017.log)
Sharma1, can you post your logstash.yml?

Alright, it seems it takes some time to start showing the logs. In my case, @magnusbaeck, I am seeing "glob" now. Have a look below:

[2017-06-27T16:15:02,986][INFO ][logstash.pipeline ] Pipeline main started
[2017-06-27T16:15:03,015][DEBUG][logstash.agent ] Starting puma
[2017-06-27T16:15:03,016][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:15:03,017][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2017-06-27T16:15:03,018][DEBUG][logstash.api.service ] [api-service] start
[2017-06-27T16:15:03,059][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-06-27T16:15:07,993][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:12,994][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:17,023][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:15:17,994][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:22,993][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:27,995][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:32,036][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:15:32,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:37,995][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:42,995][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:47,043][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:15:47,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:52,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:15:57,995][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:02,055][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:16:02,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:07,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:12,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:17,140][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:16:17,996][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:22,998][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:27,998][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:32,146][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:16:32,997][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:37,999][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:42,999][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:47,152][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:16:47,999][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:53,001][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:16:58,001][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:02,158][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:17:03,002][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:08,004][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:13,005][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:17,164][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []
[2017-06-27T16:17:18,006][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:23,006][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:28,006][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-06-27T16:17:32,171][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []

[2017-06-27T16:15:03,016][DEBUG][logstash.inputs.file ] _globbed_files: /etc/httpd/logs/access_log: glob is: []

The filename pattern doesn't match any files. This could e.g. be caused by permission problems or a typo in the pattern.
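
A quick way to rule out the permission side is to read the file as the user Logstash runs as (assuming that user is logstash; substitute yours):

sudo -u logstash ls -l /etc/httpd/logs/access_log
sudo -u logstash head -n 1 /etc/httpd/logs/access_log

If the first command fails, the glob cannot even see the file; if the second fails, Logstash can see it but not read it.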

I am also seeing this glob:

14:47:13.739 [[main]<file] DEBUG logstash.inputs.file - _globbed_files: /home/test6/admin_access.log: glob is: []

The answer is the same in that case. How are you starting your container? Are you mounting the host directories in question into the container so that the directories are made available to Logstash?
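
For example, a sketch of such a run (the image tag and config path are assumptions; the log path matches the file input above):

docker run --rm \
  -v /home/test6:/home/test6:ro \
  -v /path/to/logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro \
  docker.elastic.co/logstash/logstash:5.4.3

Without the first mount, /home/test6/admin_access.log simply doesn't exist inside the container, and an empty glob is exactly what that looks like.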
