No logstash output to kibana/elasticsearch


(Patrick Grzechca) #1

Good day to you!

I'm pretty new to the ELK stack and have been working on an issue for three days.

I have an apache-access.log, but I can't get the log file to show up in Kibana.
I'm using Kibana 5.1.1, Elasticsearch 5.1.1, and Logstash 5.1.1.

I tried to keep my config file very simple:
    input {
      file {
        path => "/home/grzechca/logfolder/jobvector_de_access.*"
        type => "apache-access"
        start_position => "beginning"
        ignore_older => 0
        sincedb_path => "/dev/null"
      }
    }

    output {
      elasticsearch { hosts => ["localhost:9200"] }
      stdout { }
    }

I start Logstash with: sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/apache_to_elastic_Test01.conf

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs to console
13:35:29.103 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
13:35:29.107 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x4084382a URL:http://localhost:9200>, :healthcheck_path=>"/"}
13:35:29.294 [[main]-pipeline-manager] WARN  logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x4084382a URL:http://localhost:9200>}
13:35:29.302 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
13:35:29.413 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
13:35:29.425 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
13:35:29.433 [[main]-pipeline-manager] INFO  logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
13:35:29.434 [[main]-pipeline-manager] INFO  logstash.pipeline - Pipeline main started
13:35:29.574 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
^C13:35:29.777 [SIGINT handler] WARN  logstash.runner - SIGINT received. Shutting down the agent.
2017-01-05T12:35:29.060Z ELK 198.100.145.140 - - [03/Jan/2017:06:25:10 +0100] "GET /en/index.html?__ajaxMethod=ajax_vacancy_search&keywords=&locations=&locations_lat=&locations_lng=&locations_string=&country%5B%5D=44&country%5B%5D=s2&country%5B%5D=s10&country%5B%5D=s1&country%5B%5D=s7&country%5B%5D=s11&country%5B%5D=s9&country%5B%5D=s3&country%5B%5D=s6&country%5B%5D=s14&country%5B%5D=s15&country%5B%5D=s16&country%5B%5D=s13&country%5B%5D=s4&country%5B%5D=s8&country%5B%5D=s12&country%5B%5D=s5&country%5B%5D=8&country%5B%5D=s22&country%5B%5D=s24&country%5B%5D=s25&country%5B%5D=s19&country%5B%5D=s20&country%5B%5D=s23&country%5B%5D=s17&country%5B%5D=s18&country%5B%5D=s21&sort=_score&_pn=30 HTTP/1.1" 200 47545 "http://www.jobvector.de/en/search-jobs.html?keywords=&locations=&locations_lat=&locations_lng=&locations_string=&country%5B%5D=44&country%5B%5D=s2&country%5B%5D=s10&country%5B%5D=s1&country%5B%5D=s7&country%5B%5D=s11&country%5B%5D=s9&country%5B%5D=s3&country%5B%5D=s6&country%5B%5D=s14&country%5B%5D=s15&country%5B%5D=s16&country%5B%5D=s13&country%5B%5D=s4&country%5B%5D=s8&country%5B%5D=s12&country%5B%5D=s5&country%5B%5D=8&country%5B%5D=s22&country%5B%5D=s24&country%5B%5D=s25&country%5B%5D=s19&country%5B%5D=s20&country%5B%5D=s23&country%5B%5D=s17&country%5B%5D=s18&country%5B%5D=s21&sort=_score&_pn=30" "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36"
13:35:29.795 [LogStash::Runner] WARN  logstash.agent - stopping pipeline {:id=>"main"}
2017-01-05T12:35:29.445Z ELK 136.243.152.18 - - [03/Jan/2017:06:25:10 +0100] "GET /?pdf_print=1 HTTP/1.1" 200 19975 "-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.7; http://mj12bot.com/)"
2017-01-05T12:35:29.464Z ELK 207.46.13.119 - - [03/Jan/2017:06:25:13 +0100] "GET /en/jobs/biology-life-sciences/editorial-staff-public-relations/weiterbildung-fuer-hochschulabsolventen-seminar-online-redakteur-schwerpunkt-new-media-m-w-75374.html HTTP/1.1" 200 19593 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"

(...terminal is scrolling the whole logfile...)

In Kibana I use the filebeat-* index pattern.

Also, I didn't get this warning message before, which puzzles me, because the mentioned logstash.yml file is in that folder. I also can't use the --verbose and --debug modes.
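From what I understand, the warning itself suggests a workaround: the settings directory can be passed explicitly with --path.settings (same paths as in my start command above, so this is just my own guess at the invocation):

```shell
# point Logstash at the directory containing logstash.yml and log4j2.properties
sudo /usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash \
  -f /etc/logstash/conf.d/apache_to_elastic_Test01.conf
```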

The system is a VirtualBox virtual machine running Ubuntu 16.04 Server.

What did I do wrong?


(Mark Walkom) #2

Is there anything in stdout?


(Patrick Grzechca) #3

I'm sorry, but how can I read stdout? I'm working through the videos and webinars, but it is all still very new to me.

I have now managed to send the log file via Logstash to Kibana. I used only a simple log file and included the username and password. I still don't know whether they are necessary. In this tutorial,
they created an account for the Kibana GUI, but they didn't mention it in the Logstash config.
My config looks like this now:

    input {
      file {
        path => "/home/grzechca/logfolder/jobvector_de_access.*"
        type => "apache-access"
        start_position => "beginning"
        ignore_older => 0
        sincedb_path => "/dev/null"
      }
    }

    output {
      elasticsearch {
        hosts => "localhost"
        user => "jblabla"
        password => "blablalo"
      }
      stdout { }
    }

I know I am still missing the filters, but first I have to understand how they work, and for that I need to read a bit more.
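From the docs I've skimmed so far, a minimal filter for Apache access logs seems to use the grok filter's built-in COMBINEDAPACHELOG pattern plus a date filter. This is only a sketch I haven't tested against my own log format yet:

```
filter {
  if [type] == "apache-access" {
    grok {
      # COMBINEDAPACHELOG is a pattern shipped with the grok filter plugin
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      # parse the request timestamp into @timestamp
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}
```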

Another point where I went wrong:
in Kibana, I didn't use the logstash-* pattern, only filebeat-*. Something I still don't understand:
if I use the Filebeat input in Logstash to receive from Filebeat clients, why does Kibana need a filebeat-* pattern instead of logstash-*, like for the log file before?
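From what I've read so far, the index name comes from the elasticsearch output (which defaults to logstash-%{+YYYY.MM.dd}), and Filebeat setups often override it. Something like this, with the port and index naming just copied from examples I found, so treat it as a guess:

```
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # write Filebeat events into filebeat-* indices instead of logstash-*
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}
```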

I am still getting this kind of message before Logstash starts, but after reading a few forum posts I guess I can ignore it:

    WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
    Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs to console

(Mark Walkom) #4

You've already got it :slight_smile: If you start LS via the console (ie bin/logstash -f configfile.json) does it output the events?
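Your stdout { } output already prints each event to the console; adding the rubydebug codec makes them easier to read, e.g.:

```
output {
  stdout { codec => rubydebug }  # pretty-print each event on the console
}
```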


(Patrick Grzechca) #5

If I search with >ls< through the .../bin/logstash or /etc/logstash/conf.d/ folders, I can't see any .json files related to my apache_to_elastic01.conf. I do see a "filebeat-index-template.json" file, but I can't open it with vim.
Did I understand you correctly? I used this command:
ls /usr/share/bin/logstash/ -f

It gave the following output:
.bash_history .profile
.bash_logout .rnd
.bashrc sampleconfs/
beats-dashboards-1.2.2/ .sincedb_b372c8f6d56173bef7066aa19b392fcd
beats-dashboards-1.2.2.zip .sincedb_b81a61a47155db8e959e7f99421cd68f
beats-dashboards-5.1.1/ .sincedb_bfd507fdb47cfc64b00aa95eed0082a0
.cache/ .sincedb_e3e1c57c918001264e50f1f7099d7f09
filebeat-5.0.0-amd64.deb .ssh/
filebeat-index-template.json .sudo_as_admin_successful
import_dashboards.go .vim/
logfolder/ .viminfo
.oracle_jre_usage/ .vimrc


(Mark Walkom) #6

Sorry, that was a typo, you need your apache_to_elastic01.conf file.


(Patrick Grzechca) #7

Hello Warkolm!
After reconfiguring the config file, Logstash is showing all the events in the terminal.

In addition, Kibana is showing the logs and I can start to work with it.

Thanks for your help!


(Troy C) #8

I'm experiencing a rather similar issue, and maybe I should repost. I'm doing this with a local nginx log file and the file input plugin. The crazy thing is: if I test the file by running /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/local-nginx.conf, the terminal outputs what I expect, and I'll see it in Kibana (so I'm certain it got into Elasticsearch). However, if I run Logstash as a service, it won't work. You would think the Logstash service is broken, but it seems to run fine, and it will correctly run other config files if they're in my /etc/logstash/conf
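For reference, this is how I've been checking the service side of things (unit name as installed by the .deb package, log path assumed from the defaults):

```shell
# check that the service is up and look at its recent output
sudo systemctl status logstash
sudo journalctl -u logstash --since "10 min ago"

# follow the Logstash log file itself
tail -f /var/log/logstash/logstash-plain.log
```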

Any ideas?


(Troy C) #9

This is what I end up getting after enabling debug logging for Logstash:

[2017-01-12T13:48:25,160][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-12T13:48:25,169][DEBUG][logstash.agent ] Starting puma
[2017-01-12T13:48:25,171][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2017-01-12T13:48:25,172][DEBUG][logstash.api.service ] [api-service] start
[2017-01-12T13:48:25,219][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-01-12T13:48:25,770][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:25 +0000}
[2017-01-12T13:48:26,776][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:26 +0000}
[2017-01-12T13:48:27,780][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:27 +0000}
[2017-01-12T13:48:28,784][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:28 +0000}
[2017-01-12T13:48:29,789][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:29 +0000}
[2017-01-12T13:48:30,162][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-01-12T13:48:30,795][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:30 +0000}
[2017-01-12T13:48:31,799][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:31 +0000}
[2017-01-12T13:48:32,809][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:32 +0000}
[2017-01-12T13:48:33,812][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:33 +0000}
[2017-01-12T13:48:34,816][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:34 +0000}
[2017-01-12T13:48:35,164][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-01-12T13:48:35,820][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:35 +0000}
[2017-01-12T13:48:36,823][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:36 +0000}
[2017-01-12T13:48:36,970][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/nginx/access.log: glob is: []
[2017-01-12T13:48:37,827][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:37 +0000}
[2017-01-12T13:48:38,833][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:38 +0000}
[2017-01-12T13:48:39,837][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2017-01-12 13:48:39 +0000}


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.