Header issue in Logstash CSV output

Hi all, I am using the following Logstash configuration:

input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "disp_2019-04-08"
  }
}

filter {
  mutate {
    add_field => {
      "P_date" => "%{PDate}"
    }
  }
  mutate { gsub => [ "PDate", "-", "/" ] }
  mutate {
    add_field => {
      "Disp_log" => "%{PDate}|%{Date}|%{Time}|%{PacketSerial}"
    }
  }
}

output {
  csv {
    fields => ["Disp_log"]
    path => "E:/logstashlog/disp_logs/disp_%{P_date}.csv"
    csv_options => {
      "write_headers" => true
      "headers" => ["PDate|Date|Time|PacketSerial"]
    }
  }
  stdout { codec => rubydebug }
}

The problem is that when I start Logstash, the generated CSV output contains a header line before every record, while the file should have just one header at the top. How can I handle this issue? Could you please advise me? Many thanks.

I believe that is working as expected. There is an issue open to change that.

Is there any way to have just one header for all records?
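Not within the plugin itself, as far as I know, while that issue is open. One workaround (a sketch, not part of the thread) is to post-process the finished file and drop every repeat of the header line, keeping only the first occurrence. The file name and header string below are just illustrations matching the config above:

```python
# Workaround sketch: remove repeated CSV header lines, keeping the first one.
# The header string must match what the csv output's "headers" option writes.

def dedupe_headers(lines, header):
    """Return `lines` with every repeat of `header` removed, keeping the first."""
    out = []
    seen = False
    for line in lines:
        if line.strip() == header:
            if seen:
                continue  # skip duplicate header lines
            seen = True
        out.append(line)
    return out

if __name__ == "__main__":
    header = "PDate|Date|Time|PacketSerial"
    # In practice you would read the Logstash-generated file, e.g.
    # E:/logstashlog/disp_logs/disp_2019-04-08.csv, instead of this sample.
    raw = [
        "PDate|Date|Time|PacketSerial",
        "2019/04/08|2019-04-08|10:00:01|123",
        "PDate|Date|Time|PacketSerial",
        "2019/04/08|2019-04-08|10:00:02|124",
    ]
    for line in dedupe_headers(raw, header):
        print(line)
```

Running this over the sample leaves one header followed by the two data rows. Alternatively, you could set "write_headers" => false and prepend the header line once yourself after the file is complete.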

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.