Logstash not loading CSV file into Elasticsearch

input {
  file {
    path => "/home/sigma/Desktop/docker-elk/logstash/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

## Add your filters / logstash plugins configuration here
filter {
  csv {
    separator => ","
    columns => [ "record_id", "duration", "src_bytes", "dest_bytes" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    action => "index"
    index => "network"
  }
}

Hello @Gowtham_Gandham,
You haven't specified the CSV file name. I can see two issues in your conf file:

  1. Provide the actual name of the CSV file in the path setting.
  2. Add a stdout output with a codec for debugging, so you can see whether events are being read and parsed (see the sketch below). Also, you don't need to set action in the elasticsearch output.
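
For reference, here is a minimal sketch of a revised pipeline along those lines. The file name network.csv is hypothetical (substitute whatever your CSV is actually called), and the stdout output uses the rubydebug codec rather than the csv codec mentioned above, simply because it prints every parsed field of each event:

input {
  file {
    # hypothetical file name -- replace with your real CSV file
    path => "/home/sigma/Desktop/docker-elk/logstash/network.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "record_id", "duration", "src_bytes", "dest_bytes" ]
  }
}

output {
  # print each parsed event to the Logstash log to confirm the file is being read
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "network"
  }
}

If nothing shows up in the Logstash logs with this in place, the problem is on the input side (wrong path, or the file isn't mounted into the container); if events are printed but no index appears, the problem is on the Elasticsearch output side.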
I have given the CSV file name and run docker compose. It does not give any error, but the data is not being indexed into Elasticsearch and no index is created.
