aabababba (aabababba) | October 30, 2018, 1:11am | #1
1. /var/log/message
   Oct 30 09:04:35 ci04 dbus-daemon: dbus[616]: [system] Successfully activated service 'org.freedesktop.problems'
2. /var/log/tomcat/catalina.out
   [INFO ] [2018-10-29 16:44:30,945] [DubboServerHandler-172.1.0.53:20880-thread-38] [RedisConnManager:109] [181029000400931071] [181029000400931071]- [reids:1]
Any idea? Thanks...
jsoriano (Jaime Soriano) | October 30, 2018, 2:42pm | #2
Hi @aabababba,
I'm not sure I understand your question. For syslog messages you can use the system module. For Tomcat there is no module yet; for it you could use a plain log input, or create your own pipeline to parse those lines. If you do, you could consider creating a new Filebeat module and contributing it.
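For reference, enabling the system module for syslog files could look like this; a minimal sketch, assuming a standard Filebeat install (the var.paths override is optional and only needed if your logs are not in the default location):

```
# modules.d/system.yml (enable with: filebeat modules enable system)
- module: system
  syslog:
    enabled: true
    # var.paths: ["/var/log/messages*"]  # optional: override the default paths
```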
aabababba (aabababba) | October 31, 2018, 2:10am | #3
My filebeat.yml:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
- type: log
  enabled: true
  paths:
    - /opt/apache-tomcat/logs/*
  tags: ["catalina"]
  document_type: catalina
  fields:
    type: catalina
  fields_under_root: true
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
output.logstash:
  hosts: ["172.16.0.2:5044"]
 
My Logstash pipeline config:
input {
  beats {
    add_field => { "beatType" => "metricbeat" }
    port => "5043"
  }
  beats {
    add_field => { "beatType" => "filebeat" }
    port => "5044"
  }
}
filter {
  mutate {
    add_field => { "remote_ip" => "%{[@metadata][ip_address]}" }
  }
  if [type] == "catalina" {
    grok {
      match => {
        "message" => "\[%{LOGLEVEL:loglevel}\]  \[%{TIMESTAMP_ISO8601:access_time}\] \[%{DATA:exception_info}\] - \[%{GREENYDATA:msg}\]"
      }
      pattern_definitions => {
        "MESSAGE" => "[\s\S]*"
      }
    }
    date {
      match => [ "access_time", "yyyy-MM-dd HH:mm:ss,SSS" ]
    }
    mutate {
      remove_field => ["access_time", "[message][0]"]
    }
  }
}
output {
  if [beatType] == "metricbeat" {
    elasticsearch {
      hosts => ["172.16.0.2:9200"]
      index => "metricbeat-%{+YYYY.MM.dd}"
    }
  }
  if [beatType] == "filebeat" {
    elasticsearch {
      hosts => ["172.16.0.2:9200"]
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  }
}
 
This does not work: the loglevel values (info, error, warn, debug) are not being indexed...
jsoriano (Jaime Soriano) | October 31, 2018, 12:49pm | #4
What exactly is not working?
In your config I see that the type field is set to catalina, but the grok pattern is applied only if [Type]=="tomcat".
I also see that you are using two different ports for the beats input. This is fine, but if you are doing it only to add a field so you can separate events into different indexes, take into account that you can also use @metadata in the index name:
output {
        elasticsearch {
                hosts => ["172.16.0.2:9200"]
                index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
        }
}
 
Notice that it is recommended to include the version in the index name, so that different Beats versions can use different index templates.
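With @metadata carrying the Beat name, the two separate ports are no longer needed for routing; a sketch, assuming all Beats can ship to the same Logstash endpoint:

```
input {
  beats {
    # one port for every Beat; [@metadata][beat] identifies the sender
    port => 5044
  }
}
```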
aabababba (aabababba) | November 1, 2018, 1:21am | #5
Thank you. I changed it to if [Type]=="catalina", but it still does not work. The issue is that I set the loglevel keyword, and '[INFO ]' does not show up in loglevel.
Sorry, my English is only so-so...
jsoriano (Jaime Soriano) | November 1, 2018, 10:38am | #6
You are setting the type all lowercase, so use lowercase in the condition too: if [type]=="catalina".
Regarding the grok pattern, what kind of log lines do you want to match?
The pattern you are writing starts with the timestamp:
%{TIMESTAMP_ISO8601:access_time} %{LOGLEVEL:loglevel} [%{DATA:exception_info}] - <%{MESSAGE:message}>

But the catalina log line you pasted at the beginning starts with the log level:
[INFO ] [2018-10-29 16:44:30,945] [DubboServerHandler-172.1.0.53:20880-thread-38] [RedisConnManager:109] [181029000400931071] [181029000400931071]- [reids:1]

Tools such as the Kibana Grok Debugger can help you build and test grok patterns.
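For a line in that format, a pattern along these lines could work; a sketch, where the field names thread, class, trace_id, span_id and msg are illustrative placeholders, and %{SPACE} absorbs the padding after INFO:

```
grok {
  match => {
    "message" => "\[%{LOGLEVEL:loglevel}%{SPACE}\] \[%{TIMESTAMP_ISO8601:access_time}\] \[%{DATA:thread}\] \[%{DATA:class}\] \[%{DATA:trace_id}\] \[%{DATA:span_id}\]- \[%{GREEDYDATA:msg}\]"
  }
}
```

Note that every literal bracket in the log line must be escaped in the pattern, and the trailing `]- [` (no space before the dash) must be reproduced exactly.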
              
                aabababba  
                (aabababba)
               
                 
              
                  
                    November 2, 2018,  2:46am
                   
                   
              7 
               
             
            
Sorry, my mistake. But I changed it like this, and it still does not work; loglevel and msg have no data.
"message" => " \[%{LOGLEVEL:loglevel} \] \[%{TIMESTAMP_ISO8601:logtime}\] \[%{DATA:exception_info}\] \[%{DATA:exception_info1}\] \[%{DATA:exception_info2}\] \[%{DATA:exception_info3}\]- \[%{DATA:msg}\]"
jsoriano (Jaime Soriano) | November 2, 2018, 11:30am | #8
It seems that your grok pattern is not able to parse this log line. If you look at the tags, _grokparsefailure has been added; this tag is set whenever a grok pattern fails to parse a line.
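While debugging, events tagged with _grokparsefailure can be routed separately so the unparsed lines are easy to inspect; a sketch reusing the hosts and index from the config above:

```
output {
  if "_grokparsefailure" in [tags] {
    # print unparsed events to the console while debugging
    stdout { codec => rubydebug }
  } else {
    elasticsearch {
      hosts => ["172.16.0.2:9200"]
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  }
}
```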
aabababba (aabababba) | November 5, 2018, 5:34am | #9
Thank you very much, I get it now. It was a grok mistake!
system (system) | Closed | December 3, 2018, 7:35am | #10
              This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.