Elasticsearch index not getting created when add_field is used

Hi,

I'm facing a problem when creating an Elasticsearch index. I'm new to this, so please let me know what I can do to fix it.
I'm pasting the two config files I've used for reference. The first one works; I used it just to check that a simple load goes through.

    input {
      file {
        path => "C:/Users/pranay/data.csv"
        start_position => "beginning"
      }
    }

    filter {
      csv {
        columns => [ "Desc","time","util","data" ]
      }
      mutate { convert => ["time", "float"] }
      mutate { convert => ["util", "float"] }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "network"
      }
    }

This seems to be working fine. However, I need to build a Kibana visualization based on certain values from the fields, so I'm using the block below instead:

    input {
      file {
        path => "C:\Users\pranay\data.csv"
        start_position => "beginning"
      }
    }

    filter {
      csv {
        columns => [ "Desc","time","util","data" ]
      }

      mutate { convert => ["time", "float"] }
      mutate { convert => ["util", "float"] }

      if [data] == "data_one" {
        mutate {
          add_field => [ "Desc", "Snap" ]
        }
      }
      else if [data] == "data_two" {
        mutate {
          add_field => [ "Desc", "Snaptwo" ]
        }
      }
      else if [data] == "data_three" {
        mutate {
          add_field => [ "Description", "Snapthree" ]
        }
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "test"
      }
    }

I've also checked the Logstash console; it doesn't show any errors or stack traces.
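
To double-check whether any events are flowing at all, one thing I could try is printing them to the console by adding a stdout output next to elasticsearch (just a debugging sketch; rubydebug is only a readable codec choice):

    output {
      # print every event so I can see whether anything is being
      # read from the file and passed through the filters
      stdout { codec => rubydebug }

      elasticsearch {
        hosts => ["localhost:9200"]
        index => "test"
      }
    }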

Any help on this will be appreciated.

Thanks,
Pranay.

It looks to me like the error is around the else if conditions and the way the add_field is set; instead, you should use nested else conditions. Try this below:

if "data_one" in [data] {
    		mutate {
				add_field => { "Desc" => ["Snap" ] }
    		}
    	}  # end if "data_one"
    	else {
			if "data_two" in [data]  {
				mutate {
					add_field => { "Desc" => ["Snaptwo" ] }
				}
			} # end if "data_two"
			else {
				if "data_three" in [data]  {
					mutate {
						add_field => { "Description" => ["Snapthree" ] }
					}
				} # end if "data_three"		
			} # end 2nd else
		} # end 1st else

Also, there is another similar post about this.

Do not use backslash in the path option of a file input; it is treated as an escape. Use forward slash.

@Badger No, that is not the problem here. As I stated above, I've used a similar path for the input file in both configs, and I get the index generated for the first one but not for the second. So clearly something is wrong in the latter part of it.

Thanks,
Pranay.

@christiancj I've tried this and unfortunately it also does not create an index :frowning_face:
Could you suggest anything else I could give a shot?

No, you have not used a similar path. In the first one you used

path => "C:/Users/pranay/data.csv"

with forward slash, and it worked. In the second one you used

path => "C:\Users\pranay\data.csv"

with a backslash, and it did not work. Change the backslashes to forward slashes.
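
Concretely, the only change needed in that input block is the slashes (everything else kept exactly as you had it):

    input {
      file {
        # forward slashes avoid the backslash being read as an escape
        path => "C:/Users/pranay/data.csv"
        start_position => "beginning"
      }
    }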

@Badger I did replace the '\' with '/' and it is still stuck there.

@pranayv66 - here is a conf that is working on my end; a sample index gets created, with the if conditions applied and the column conversions executed.

    input {
      file {
        path => "C:/Users/pranay/data.csv"
        start_position => "beginning"
        sincedb_path => "C:/Users/pranay/.since.sample.log"
      }
    }

    filter {
      csv {
        columns => [ "Desc","time","util","data" ]
        convert => {
          "time" => "float"
          "util" => "float"
        }
      }

      if "data_one" in [data] {
        mutate {
          add_field => { "Description" => ["Snap"] }
        }
      } # end if "data_one"
      else {
        if "data_two" in [data] {
          mutate {
            add_field => { "Description" => ["Snaptwo"] }
          }
        } # end if "data_two"
        else {
          if "data_three" in [data] {
            mutate {
              add_field => { "Description" => ["Snapthree"] }
            }
          } # end if "data_three"
        } # end 2nd else
      } # end 1st else

      mutate {
        remove_field => [ "column1","column2","column3","column4" ]
      }
    }

    output {
      stdout { codec => json }

      elasticsearch {
        hosts => ["localhost:9200"]
        index => "test"
      }
    }

Sample CSV - don't include column/header names in the file, just the data to ingest:

    abc,20.8,19.75,data_three
    xya,34.5,19.5,data_two
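
If your real file does have a header row, one option (if I remember the csv filter right - please double-check the logstash-filter-csv docs for your version) is skip_header, which should drop a row whose values match the configured column names, so you don't have to strip the header by hand:

    filter {
      csv {
        columns => [ "Desc","time","util","data" ]
        # with skip_header, a line that reads exactly "Desc,time,util,data"
        # should be dropped instead of being indexed as an event
        skip_header => true
      }
    }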

As a suggestion, don't forget to delete your sincedb file when you're re-running tests (to reload the data); this file tracks the last object/row processed.
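
If you don't want to delete that file by hand every time, a common alternative is to point sincedb_path at the null device so no position is ever persisted (sketch below; "NUL" is the Windows null device):

    input {
      file {
        path => "C:/Users/pranay/data.csv"
        start_position => "beginning"
        # no sincedb is kept, so the file is re-read from the beginning on
        # every run; on Linux/macOS the equivalent would be "/dev/null"
        sincedb_path => "NUL"
      }
    }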

@christiancj Thanks for the solution. This worked perfectly well!
