seeva (KARUPPANNAN)
October 25, 2017, 9:29am
#1
              Hi,
I am using Filebeat, Logstash, Elasticsearch, and Kibana 5.6.3. I want to be able to see the name of the source log file in Kibana, but I am not able to. I have all the other information, but not the name of the file the logs are coming from. Can anyone help me?
In Kibana I want a line like "file" => log-12.01.2000.log for each log. Here is my Logstash conf file:
input {
  beats {
    port => 5044
    host => "localhost"
  }
}
filter {
  #(?<queue_id>[0-9A-F]{10,11})
  #%{TIMESTAMP_ISO8601:FixedFormatISOInternationalDate}
  grok {
    match => { "message" => ["(?<timestamp>(\d{4})-(\d{2})-(\d{2}).(\d{2}):(\d{2}):(\d{2}).(\d{3}))	%{INT:EventId}	%{UUID:ActivityId}	%{DATA:UserName}	%{NOTSPACE:TransactionIsolationLevel}	%{NOTSPACE:TransactionLocalIdentifier}	%{NOTSPACE:TransactionDistributedIdentifier}	%{NOTSPACE:TransactionStatus}	%{NOTSPACE:severity}	%{GREEDYDATA:data}"] }
  }
  mutate {
    add_field => {
      "[@metadata][Source]" => "%{source}"
      "[@metadata][Input_type]" => "%{input_type}"
      "[@metadata][Timestamp]" => "%{@timestamp}"
      "[@metadata][Tags]" => "%{tags}"
    }
  }
  mutate {
    remove_field => ["message", "source"]
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["10.184.161.66", "10.184.161.67"]
      user => "elastic"
      password => "Cs_24Z*-;u3WXMzwk]66"
      codec => json
      index => "opera_index"
    }
  }
}
What I have in Kibana: see the attachment.
Thank you.
magnusbaeck
#2
              The filename is in the source field that you're removing.
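A minimal sketch of that fix, assuming the rest of the filter stays unchanged: drop only "message" in remove_field so that the source field Filebeat adds (the path of the originating file) survives into Elasticsearch.

```
mutate {
  # Keep "source" (the originating file path set by Filebeat);
  # only drop the raw message line.
  remove_field => ["message"]
}
```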
seeva (KARUPPANNAN)
October 25, 2017, 10:12am
#3
Thank you magnusbaeck, I am going to try this. Must I delete the index, or will it just get updated with this field automatically? Thank you.
magnusbaeck
#4
              All new events will get the source field.
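For illustration, a new event printed with the rubydebug codec might then look roughly like this (the file path and field values here are hypothetical, just showing where the source field appears):

```
{
    "@timestamp" => 2017-09-05T01:00:08.092Z,
        "source" => "/var/log/myapp/log-12.01.2000.log",
       "EventId" => "131",
      "severity" => "Information",
          "data" => "this is a message"
}
```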
seeva (KARUPPANNAN)
October 25, 2017, 11:35am
#5
magnusbaeck, I still cannot see the new field named "source", even after restarting the Logstash and Filebeat services.
magnusbaeck
#6
              Remove all your current filters and use a stdout { codec => rubydebug } output to dump all incoming events to the Logstash log. What does an example event look like?
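A minimal debugging pipeline along those lines might look like this (no filters at all; the Beats port is taken from the config earlier in the thread):

```
input {
  beats {
    port => 5044
    host => "localhost"
  }
}
output {
  # Print every incoming event with all of its fields
  stdout { codec => rubydebug }
}
```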
seeva (KARUPPANNAN)
October 25, 2017, 1:11pm
#7
What do you think about this:
input { stdin { } }
filter {
  grok {
    match => { "message" => ["(?<timestamp>(\d{4})-(\d{2})-(\d{2}).(\d{2}):(\d{2}):(\d{2}).(\d{3}))	%{INT:EventId}	%{UUID:ActivityId}	%{DATA:UserName}	%{NOTSPACE:TransactionIsolationLevel}	%{NOTSPACE:TransactionLocalIdentifier}	%{NOTSPACE:TransactionDistributedIdentifier}	%{NOTSPACE:TransactionStatus}	%{NOTSPACE:severity}	%{GREEDYDATA:data}"] }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["10.184.161.66", "10.184.161.67"]
    user => "elastic"
    password => "Cs_24Z*-;u3WXMzwk]66"
    index => "opera_index"
  }
  stdout { codec => rubydebug }
}
Here is an event from a log file:
2017-09-05 01:00:08.092	131	00000000-0000-0000-0000-000000000000	user	(null)	(null)	(null)	(null)	Information	this is a message
magnusbaeck
#8
              Keep your beats input. It's the Filebeat messages we want to look at.
2017-09-05 01:00:08.092	131	00000000-0000-0000-0000-000000000000	user	(null)	(null)	(null)	(null) Information	this is a message
 
I want to see the messages processed by Logstash, written to the Logstash log. Not the input messages.
seeva (KARUPPANNAN)
October 30, 2017, 9:47am
#9
Hi magnusbaeck, your recommendation of removing "source" from the remove_field line works. I just had to wait some time to see the source field. Thank you, the problem is resolved.
Best.
system (system) Closed
November 27, 2017, 9:48am
#10
              This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.