sud0 (Luke)
October 9, 2019, 1:01am  #1

I have two Apache web servers (1x UAT and 1x PROD), and I am sending the logs from both of them to my ELK setup.
For the UAT one, it is working fine.
For the PROD one, the logs are not shown in Kibana.
I need some help debugging this and finding out where the issue is. I'll post some config files below so you have an idea of how this is set up.
The LogFormat config in httpd.conf is the same for both environments.
PROD Apache version: Apache/2.2.15 (Unix)
UAT Apache version: Apache/2.2.9 (Unix)
 
/etc/logstash/conf.d/logstash.conf:
input {
## PROD
file {
        type => "apache_access_log"
        start_position => "beginning"
        path => "/mnt/logs/web/access_log"
    }
## UAT
file {
        type => "uat_apache_access_log"
        start_position => "beginning"
        path => "/mnt/logs/uatweb/access_log"
    }
}
filter {
    # Remove unwanted carriage returns, global to all filter types
    mutate {
            gsub => [ 'message', "\r", '' ]
    }
    ######################################################
# PROD
# Apache access filter
    if [type] == "apache_access_log" {
        mutate {
            replace => { 'host' => 'webserver.datacentre.example.com' }
            add_field => { 'environment' => 'production'
                           'service' => 'apache_access'
            }
        }
        grok {
            match => {
                "message" => "%{IPORHOST:clientip}%{SPACE}\[%{HTTPDATE:timestamp}\]%{SPACE}%{NUMBER:port}%{SPACE}%{WORD:method}%{SPACE}%{URIPATHPARAM:request_uri}%{SPACE}%{NOTSPACE}%{SPACE}%{NUMBER:status_code}%{SPACE}%{NOTSPACE:bytes_delivered}%{SPACE}%{NUMBER:duration%}%{SPACE}(?:%{URI:referrer}|.*)%{SPACE}%{QS:agent}%{SPACE}%{GREEDYDATA:general_data}"
            }
        }
        date {
            match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
            target => "@timestamp"
        }
    }
# UAT
# Apache access filter
    if [type] == "uat_apache_access_log" {
        mutate {
            replace => { 'host' => 'uatweb.datacentre.example.com' }
            add_field => { 'environment' => 'uat'
                           'service' => 'apache_access'
            }
        }
        grok {
            match => {
                "message" => "%{IPORHOST:clientip}%{SPACE}\[%{HTTPDATE:timestamp}\]%{SPACE}%{NUMBER:port}%{SPACE}%{WORD:method}%{SPACE}%{URIPATHPARAM:request_uri}%{SPACE}%{NOTSPACE}%{SPACE}%{NUMBER:status_code}%{SPACE}%{NOTSPACE:bytes_delivered}%{SPACE}%{NUMBER:duration%}%{SPACE}(?:%{URI:referrer}|.*)%{SPACE}%{QS:agent}%{SPACE}%{GREEDYDATA:general_data}"
            }
        }
        date {
            match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
            target => "@timestamp"
        }
    }
}

output {
elasticsearch {
    hosts => ["localhost:9200"]
    # Weekly index (for pruning)
    index => "mw-log-index-%{+YYYY.'w'ww}"
}
stdout { codec => rubydebug }
}
 
In Kibana, there are NO logs for PROD; however, for UAT there are.
/var/log/logstash/logstash-plain.log:
[2019-10-09T13:45:04,253][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled filter
 P[filter-mutate{"replace"=>{"host"=>"webserver.datacentre.example.com"}, "add_field"=>{"environment"=>"production", "service"=>"apache_access"}}|[str]pipeline:209:9:```
mutate {
            replace => { 'host' => 'webserver.datacentre.example.com' }
            add_field => { 'environment' => 'production'
                           'service' => 'apache_access'
            }
        }
```]
 
How can I troubleshoot this? Where should I start looking?

Badger
October 9, 2019, 2:22pm  #2

              In the UAT section you are testing
if [type] == "apache_access_log"
 
which looks wrong to me. Should that be looking for "uat_apache_access_log"?

sud0 (Luke)
October 9, 2019, 8:32pm  #3

Sorry, my bad... I copied it wrong and have fixed the post. (It is already "uat_apache_access_log".)

Badger
October 9, 2019, 8:38pm  #4

              OK, try filtering for NOT environment: uat and see if you can find the missing data. Also, try looking across a much longer period of time.

sud0 (Luke)
October 9, 2019, 8:41pm  #5

OK, thanks for your suggestion... but nothing new!

Badger
October 9, 2019, 8:49pm  #6

It sounds as though the file input is not reading the file. Enable tracing as described in this post and see what filewatch has to say.
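For reference, one way to raise filewatch to TRACE is a logger stanza in config/log4j2.properties followed by a restart. This is a sketch: it assumes the stock log4j2.properties that ships with Logstash, and that the logger is named `filewatch` (inferred from log prefixes like `[filewatch.tailmode.processor]`):

```properties
# Assumed logger name "filewatch"; raise it to TRACE
logger.filewatch.name = filewatch
logger.filewatch.level = trace
```

If the monitoring API is enabled (default port 9600), the same change can be made at runtime, without a restart, by PUTting `{"logger.filewatch" : "TRACE"}` to `localhost:9600/_node/logging`.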

sud0 (Luke)
October 9, 2019, 9:05pm  #7

Done! Let me know if you need more logs:
[2019-10-10T09:54:27,331][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@start_position = "beginning"
[2019-10-10T09:54:27,331][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@path = ["/mnt/logs/web/access_log"]
[2019-10-10T09:54:27,331][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@id = "c3b85ae40ef876422eb8f30486cf9828a2903d75d01bb103edbe6da301cc4f38"
[2019-10-10T09:54:27,331][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@type = "apache_access_log"
[2019-10-10T09:54:27,331][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@enable_metric = true
[2019-10-10T09:54:27,336][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain id=>"plain_1f0e86ee-70e6-4bc1-b9ca-ad67c2c7c570", enable_metric=>true, charset=>"UTF-8">
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@add_field = {}
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@stat_interval = 1.0
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@discover_interval = 15
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_write_interval = 15.0
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@delimiter = "\n"
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@close_older = 3600.0
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@mode = "tail"
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_completed_action = "delete"
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_clean_after = 1209600.0
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_chunk_size = 32768
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_chunk_count = 140737488355327
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_sort_by = "last_modified"
[2019-10-10T09:54:27,337][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_sort_direction = "asc"
[2019-10-10T09:54:27,353][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_7e9cf891-01b9-400c-8e93-087db79f785e"
[2019-10-10T09:54:27,354][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2019-10-10T09:54:27,354][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2019-10-10T09:54:35,199][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled conditional
 [if (event.getField('[type]')=='apache_access_log')]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@56d811ee
[2019-10-10T09:54:35,207][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled conditional
 [if (event.getField('[type]')=='apache_access_log')]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@56d811ee
[2019-10-10T09:54:35,217][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled conditional
 [if (event.getField('[type]')=='apache_access_log')]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@56d811ee
[2019-10-10T09:54:35,232][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled conditional
 [if (event.getField('[type]')=='apache_access_log')]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@56d811ee
[2019-10-10T09:54:35,249][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled conditional
 [if (event.getField('[type]')=='apache_access_log')]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@56d811ee
[2019-10-10T09:54:35,278][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled filter
 P[filter-mutate{"replace"=>{"host"=>"webserver.datacentre.example.com"}, "add_field"=>{"environment"=>"production", "service"=>"apache_access"}}|[str]pipeline:209:9:```
mutate {
            replace => { 'host' => 'webserver.datacentre.example.com' }
            add_field => { 'environment' => 'production'
                           'service' => 'apache_access'
            }
        }
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@6ccb8168 

Badger
October 9, 2019, 9:45pm  #8

We need the TRACE messages from filewatch, which look like this:
[2019-07-30T13:18:09,252][TRACE][filewatch.tailmode.processor] Delayed Delete processing
[2019-07-30T13:18:09,267][TRACE][filewatch.tailmode.processor] Watched + Active restat processing
[2019-07-30T13:18:09,297][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-07-30T13:18:09,358][TRACE][filewatch.tailmode.processor] Rotation In Progress processing
 
There should be a lot of them.

sud0 (Luke)
October 9, 2019, 10:16pm  #9

              
> Badger: filewatch

You can grab the log file here.
Cheers!

Badger
October 9, 2019, 11:08pm  #10

              The string /web/access_log never occurs in that file, which tells me that logstash never sees the file. Are you sure that the name is right and that logstash has execute access to that directory?
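Both things are easy to check from the shell. A sketch (`/mnt/logs/web` is the path from the config above; the throwaway directory and sample log line are made up for illustration):

```shell
# On the real host, inspect every component of the path, then try to read
# the file as the logstash user:
#   namei -l /mnt/logs/web/access_log
#   sudo -u logstash head -1 /mnt/logs/web/access_log
# Path resolution needs the execute (x) bit on EVERY directory component,
# so a mount owned by nobody:nobody can block logstash even when the file
# itself is world-readable.

# Self-contained demonstration with a throwaway directory:
dir="$(mktemp -d)/web"
mkdir -p "$dir"
printf '10.0.0.1 [09/Oct/2019:13:45:04 +1100] 80 GET / HTTP/1.1 200 512\n' \
  > "$dir/access_log"
ls -ld "$dir"                                        # owner and mode
test -r "$dir/access_log" && echo "readable" || echo "NOT readable"
```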

sud0 (Luke)
October 9, 2019, 11:21pm  #11

You're right.
The path /mnt/logs/web/ was mounted as nobody:nobody, so the logstash user did not have permission.
Adding Domain = localdomain to /etc/idmapd.conf and re-mounting the NFS volume fixed my problem.
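For anyone landing here with the same symptom: the fragment below shows what that setting looks like (a sketch; the `[General]` section follows the standard idmapd.conf layout, and `localdomain` should be whatever NFSv4 domain your server uses):

```ini
# /etc/idmapd.conf -- NFSv4 UID/GID <-> name mapping
[General]
Domain = localdomain
```

When the client and server disagree on this domain, NFSv4 maps all owners to nobody:nobody; after editing, re-mount the share so ownership shows the real users again.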

system (system)
November 6, 2019, 11:21pm  #12

              This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.