ahaw021 (Andrei)
October 16, 2017, 7:21am  #1
Hi all,
I hope I am just having a 1d10t error, but I thought I would ask anyway.
I have downloaded a bunch of CSV files into a folder (in my case: E:\GITHUB\SPLUNK-ELK-CERTIFICATE-TRANSPARENCY\playground\NEWBIE).
I have configured Logstash with the parameters below (the CSV plugin is installed):
> input {
>   file {
>     path => "E:\GITHUB\SPLUNK-ELK-CERTIFICATE-TRANSPARENCY\playground\NEWBIE\*.csv"
>     ignore_older => 0
>   }
> }
> filter {
>   csv {
>     separator => ","
>     columns => ["ct_logname","cert_index","chain_hash","cert_der","all_domains","not_before","not_after"]
>   }
> }
> output {
>   elasticsearch {
>     hosts => "http://localhost:9200"
>     index => "certificates"
>   }
>   stdout {}
> }
 
If I run this without debug, nothing happens. If I run it with debug, I get error messages and no index is created.
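One thing worth ruling out here (an assumption on my part, not something confirmed by the error output): the file input's start_position option defaults to "end", so files that already exist when Logstash starts are tailed rather than read from the top. A minimal variant of the input that reads existing files from the beginning would look like:

input {
  file {
    path => "E:\GITHUB\SPLUNK-ELK-CERTIFICATE-TRANSPARENCY\playground\NEWBIE\*.csv"
    start_position => "beginning"   # default is "end": only lines appended after startup are read
  }
}

The filter and output sections would stay unchanged.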
ahaw021 (Andrei)
October 16, 2017, 7:25am  #2
If I then change the config file to take its input from stdin, I have no issues:
> input {
>   stdin {}
> }
> filter {
>   csv {
>     separator => ","
>     columns => ["ct_logname","cert_index","chain_hash","cert_der","all_domains","not_before","not_after"]
>   }
> }
> output {
>   elasticsearch {
>     hosts => "http://localhost:9200"
>     index => "certificates"
>   }
>   stdout {}
> }
 
I think it's something wrong with the file/folder path syntax, but I can't figure out what. I suspect it's not the CSV parser, since that works as expected with the stdin input.
ahaw021 (Andrei)
October 17, 2017, 7:37am  #3
Solved: it was a sincedb-related issue.
How I fixed it: I set sincedb_path to a static file and deleted its contents before each run.
input {
  file {
    path => ["E:\GITHUB\SPLUNK-ELK-CERTIFICATE-TRANSPARENCY\playground\NEWBIE\*.csv"]
    ignore_older => 0
    start_position => "beginning"
    sincedb_path => "path/sincedb.txt"
  }
}
filter {
  csv {
    separator => ","
    columns => ["ct_logname","cert_index","chain_hash","cert_der","all_domains","not_before","not_after"]
  }
}
output {
  elasticsearch {
    index => "certificates_better"
  }
  stdout {}
}
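For context on why this works (a sketch; exact semantics depend on the file input plugin version): the file input records the read offset of every file in a sincedb file and resumes from those offsets on restart, so files it has already read to the end produce no events even with start_position => "beginning". Deleting the sincedb contents resets the offsets. If you never want offsets persisted, pointing sincedb_path at the null device is a common approach:

input {
  file {
    path => ["E:\GITHUB\SPLUNK-ELK-CERTIFICATE-TRANSPARENCY\playground\NEWBIE\*.csv"]
    start_position => "beginning"
    sincedb_path => "NUL"   # "NUL" on Windows, "/dev/null" elsewhere: offsets are discarded on restart
  }
}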
system (system)
November 14, 2017, 7:38am  #4
              This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.