Configuring Logstash to parse Apache log files and send them to Elasticsearch

I'm trying to parse XAMPP log files and send them to Elasticsearch. I found a configuration example:

input {
	file {
		path => "/opt/lampp/logs/*.log"
		start_position => "beginning"
	}
}
filter {  
  if [path] =~ "access" {
    mutate { replace => { "type" => "apache_access" } }
    grok {
      # Depends on the format of your log file
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [path] =~ "error" {
    mutate { replace => { "type" => "apache_error" } }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    locale => "en"
  }
}

output {
	elasticsearch {
		hosts => ["localhost"]
		index => "apache_logs"
		document_type => "apache_log"
	}
	file { path => "/home/logstash.out" }
}

But it doesn't seem to work :disappointed:

  • logstash.err content:

    mai 09, 2016 11:01:50 PM org.elasticsearch.node.internal.InternalNode
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] version[1.7.0], pid[20729], build[929b973/2015-07-16T14:31:07Z]
    mai 09, 2016 11:01:50 PM org.elasticsearch.node.internal.InternalNode
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] initializing ...
    mai 09, 2016 11:01:50 PM org.elasticsearch.plugins.PluginsService
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] loaded [], sites []
    mai 09, 2016 11:01:53 PM org.elasticsearch.bootstrap.Natives
    AVERTISSEMENT: JNA not found. native methods will be disabled.
    mai 09, 2016 11:01:54 PM org.elasticsearch.node.internal.InternalNode
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] initialized
    mai 09, 2016 11:01:54 PM org.elasticsearch.node.internal.InternalNode start
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] starting ...
    mai 09, 2016 11:01:54 PM org.elasticsearch.transport.TransportService doStart
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] bound_address {inet[/0:0:0:0:0:0:0:0:9301]}, publish_address {inet[/192.168.8.100:9301]}
    mai 09, 2016 11:01:55 PM org.elasticsearch.discovery.DiscoveryService doStart
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] elasticsearch/xWe7YflqTvikfOQukRad9w
    mai 09, 2016 11:01:58 PM org.elasticsearch.cluster.service.InternalClusterService$UpdateTask run
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] detected_master [Uncle Ben Parker][uBONhZmdQt2EaG5MXHnJqw][fathi-HP-Pavilion-g6-Notebook-PC][inet[/127.0.0.1:9300]], added {[Uncle Ben Parker][uBONhZmdQt2EaG5MXHnJqw][fathi-HP-Pavilion-g6-Notebook-PC][inet[/127.0.0.1:9300]],}, reason: zen-disco-receive(from master [[Uncle Ben Parker][uBONhZmdQt2EaG5MXHnJqw][fathi-HP-Pavilion-g6-Notebook-PC][inet[/127.0.0.1:9300]]])
    mai 09, 2016 11:01:58 PM org.elasticsearch.node.internal.InternalNode start
    INFOS: [logstash-fathi-HP-Pavilion-g6-Notebook-PC-20729-13966] started

  • My access_log file

    127.0.0.1 - - [27/Feb/2016:23:18:53 +0100] "GET /phpmyadmin/themes/pmahomme/img/logo_left.png HTTP/1.1" 200 2327
    127.0.0.1 - - [27/Feb/2016:23:18:53 +0100] "GET /phpmyadmin/themes/dot.gif HTTP/1.1" 200 43
    127.0.0.1 - - [27/Feb/2016:23:18:53 +0100] "GET /phpmyadmin/js/get_scripts.js.php?scripts%5B%5D=jquery/jquery-1.11.1.min.js&scripts%5B%5D=sprintf.js&scripts%5B%5D=ajax.js&scripts%5B%5D=keyhandler.js&scripts%5B%5D=jquery/jquery-ui-1.11.2.min.js&scripts%5B%5D=jquery/jquery.cookie.js&scripts%5B%5D=jquery/jquery.mousewheel.js&scripts%5B%5D=jquery/jquery.event.drag-2.2.js&scripts%5B%5D=jquery/jquery-ui-timepicker-addon.js&scripts%5B%5D=jquery/jquery.ba-hashchange-1.3.js&scripts%5B%5D=jquery/jquery.debounce-1.0.5.js&scripts%5B%5D=menu-resizer.js&scripts%5B%5D=cross_framing_protection.js&scripts%5B%5D=rte.js&scripts%5B%5D=tracekit/tracekit.js&scripts%5B%5D=error_report.js&scripts%5B%5D=doclinks.js&scripts%5B%5D=functions.js&scripts%5B%5D=navigation.js&scripts%5B%5D=indexes.js&v=4.5.2 HTTP/1.1" 200 216077
    127.0.0.1 - - [27/Feb/2016:23:18:53 +0100] "GET /phpmyadmin/js/get_image.js.php?theme=pmahomme&v=4.5.2 HTTP/1.1" 200 1822
    127.0.0.1 - - [27/Feb/2016:23:18:53 +0100] "GET /phpmyadmin/js/get_scripts.js.php?scripts%5B%5D=common.js&scripts%5B%5D=config.js&scripts%5B%5D=page_settings.js&scripts%5B%5D=codemirror/lib/codemirror.js&scripts%5B%5D=codemirror/mode/sql/sql.js&scripts%5B%5D=codemirror/addon/runmode/runmode.js&scripts%5B%5D=codemirror/addon/hint/show-hint.js&scripts%5B%5D=codemirror/addon/hint/sql-hint.js&scripts%5B%5D=codemirror/addon/lint/lint.js&scripts%5B%5D=codemirror/addon/lint/sql-lint.js&scripts%5B%5D=console.js&v=4.5.2 HTTP/1.1" 200 129429

What isn't working? Which version of Logstash is this? Elasticsearch 1.7.0 is quite old by now. You should upgrade.

Comment out the elasticsearch output and use a simple stdout { codec => rubydebug } output until you've verified that the logs are parsed as desired.
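A minimal pipeline for that kind of testing might look something like this (a sketch only, reusing the path from the original config):

input {
	file {
		path => "/opt/lampp/logs/*.log"
		start_position => "beginning"
	}
}
output {
	# Print each parsed event to the console in a readable form
	stdout { codec => rubydebug }
}

Once the events printed here look right, you can re-enable the elasticsearch output.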

I'm using Logstash 2.2.2 with ES 1.7.1. Unfortunately, I can't upgrade my ES, because I have some things that don't work with recent ES versions. Which Logstash version can do this job and still work with ES 1.7.1?

But it's not working even with the ES output commented out. When using rubydebug, it only prints

Logstash startup completed

and then stays blocked like that for several minutes.

Logstash is probably waiting for more data to be added to the log file. Delete the sincedb file to make sure the file is scanned from the beginning, and check whether the log file is older than 24 hours. If it is, either touch the file or adjust the file input's `ignore_older` option.
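For experimenting, you can also point the file input's sincedb at /dev/null so the file is re-read from the beginning on every run; roughly like this (the ignore_older value is just an example, given in seconds):

input {
	file {
		path => "/opt/lampp/logs/*.log"
		start_position => "beginning"
		# /dev/null disables position tracking, so each restart re-reads the file
		sincedb_path => "/dev/null"
		# Example: still pick up files up to 30 days old (value is in seconds)
		ignore_older => 2592000
	}
}

This is only suitable for testing; in production you normally want the sincedb so restarts don't re-ingest everything.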

I tried this with a file I had just created, and I'm still getting the same behaviour.