Index fields concatenated together?

I recently added 2 new .CONF files to my /etc/logstash/conf.d/ directory to ingest data into 2 new indices, alongside 1 existing index. When I view the 2 new indices in Kibana, they appear to contain the columns from all of the indices. This creates obvious problems when I import data: data intended for index 2 does not map to index 3, much less to indices 1 and 3 simultaneously.

I've checked the CONF files for obvious mistakes like missing braces, parentheses, and commas. I have also searched these forums and haven't found any similar topics.

Any advice on how to troubleshoot this issue?

Thanks,
Cyrus

> I recently added 2 new .CONF files to my /etc/logstash/conf.d/ directory to ingest data into 2 new indices, in addition to 1 existing index.

What do the elasticsearch outputs in these two files look like?

> When I view the 2 new indices in Kibana, it appears that the 2 new indices contain all columns from all indices.

Not sure exactly what you mean by this.

The CONF files are pasted below.

Let me try to explain the issue better. I have 3 indices specified in separate CONF files: A.CONF, B.CONF, and C.CONF. Each index has a unique set of columns, and each ingests a different source of data from a different directory.

The problem is that, when viewed in Kibana, indices B and C contain all of the columns from indices A, B, and C, rather than just the columns I specified in each CONF file.

I hope this explanation is clearer.

**A.CONF**

    input {
        file {
            path => "/var/elk/csv/sep/*.csv"
            start_position => "beginning"
        }
    }
    filter {
        csv {
            separator => ","
            columns => ["Computer_Name","Measure_Date","Pattern_Date",
                        "Operating_System","Client_Version","Policy_Serial",
                        "HI_Status","Status","Auto_Protect_On","Worst_Detection",
                        "Last_Scan_Time","Antivirus_engine_On","Download_Insight_On",
                        "SONAR_On","Tamper_Protection_On","Intrusion_Prevention_On",
                        "IE_Browser_Protection_On","Firefox_Browser_Protection_On",
                        "Early_Launch_Antimalware_On","Server_Name","MAC_Address1",
                        "cmdb_name","cmdb_friendly","cmdb_model_id","cmdb_serial","found"]
        }
    }
    output {
        elasticsearch {
            hosts => "http://localhost:9200"
            index => "sep-index"
        }
    }
**B.CONF**

    input {
        file {
            path => "/var/elk/csv/tasks/*.csv"
            start_position => "beginning"
        }
    }
    filter {
        csv {
            separator => ","
            columns => ["number","priority","state","assignment_group",
                        "short_description","sys_class_name",
                        "sys_created_on","closed_at","duration"]
        }
    }
    output {
        elasticsearch {
            hosts => "http://localhost:9200"
            index => "tasks-index"
        }
    }
**C.CONF**

    input {
        file {
            path => "/var/elk/csv/incidents/*.csv"
            start_position => "beginning"
        }
    }
    filter {
        csv {
            separator => ","
            columns => ["number","opened_at","short_description",
                        "priority","state","subcategory","closed_at",
                        "close_notes","current_impact",
                        "future_impact","calendar_duration"]
        }
    }
    output {
        elasticsearch {
            hosts => "http://localhost:9200"
            index => "incidents-index"
        }
    }

Unless you explicitly define multiple pipelines (see https://www.elastic.co/guide/en/logstash/master/multiple-pipelines.html), Logstash effectively concatenates all configuration files, so all events from all inputs pass through all filters and reach all outputs. If you don't want this, you need to wrap your filters and outputs in conditionals.
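
For example, here is a minimal sketch of the conditional approach applied to B.CONF above: set a `type` on the input, then guard the filter and output with it. The value "tasks" is just an illustrative label; A.CONF and C.CONF would each get their own (e.g. "sep" and "incidents"):

    input {
        file {
            path => "/var/elk/csv/tasks/*.csv"
            start_position => "beginning"
            # Label events from this input so the blocks below can match on it
            type => "tasks"
        }
    }
    filter {
        # Only parse events that came from this file's input
        if [type] == "tasks" {
            csv {
                separator => ","
                columns => ["number","priority","state","assignment_group",
                            "short_description","sys_class_name",
                            "sys_created_on","closed_at","duration"]
            }
        }
    }
    output {
        # Only route this input's events to tasks-index
        if [type] == "tasks" {
            elasticsearch {
                hosts => "http://localhost:9200"
                index => "tasks-index"
            }
        }
    }

With a type set on every input, each filter and output only touches its own events, even though Logstash still loads all three files as a single pipeline.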
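
Alternatively, a sketch of the multiple-pipelines approach from the link above, assuming a package install that reads /etc/logstash/pipelines.yml (the pipeline ids are illustrative):

    # pipelines.yml: one isolated pipeline per configuration file,
    # so events never cross between them
    - pipeline.id: sep
      path.config: "/etc/logstash/conf.d/A.CONF"
    - pipeline.id: tasks
      path.config: "/etc/logstash/conf.d/B.CONF"
    - pipeline.id: incidents
      path.config: "/etc/logstash/conf.d/C.CONF"

Note that Logstash only consults pipelines.yml when it is started without -f or --path.config arguments on the command line.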

Thank you, Magnus, for your help.
