Restrict Logstash to map values to their specific columns using the CSV filter

Hi everyone,
I am trying to store data from CSV files into Elasticsearch using Logstash.
I have a Logstash configuration using the csv filter, with the columns defined like this:

csv {
  columns => ["A", "B", "C", "D", "E"]
  separator => ","
  skip_empty_columns => true
}

Now say my CSV file only has the columns [B, C, D, E] and values for them, i.e. there is no column named "A".
Example file:

B,C,D,E,
1,2,3,4,

When I run Logstash with this configuration, it stores the data in Elasticsearch like this:

A=1
B=2
C=3
D=4

Column E does not get stored at all: each line only has four values, so the fifth field is empty, and because I have set
skip_empty_columns => true
column "E" is omitted.

But I want my data to be stored in Elasticsearch this way:

B=1
C=2
D=3
E=4

That is, I want each value to map correctly to its particular column even though other columns are configured.
I am looking for a solution where Logstash maps values to their respective columns by name rather than by position.
Can anyone suggest a solution for this?

I'd say your CSV files are broken. If the A column doesn't have a value, I'd expect the line to begin with a comma.
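For example, with the configured columns ["A", "B", "C", "D", "E"], a row that has no value for A would need to look like this for the positional mapping to work:

,1,2,3,4

The leading comma marks the empty A field; skip_empty_columns => true would then drop A, and B through E would get the right values.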

I am looking for a solution where Logstash maps values to their respective columns by name rather than by position.

How would Logstash know which column values map to which column names, if the data doesn't follow the column configuration?
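That said, since your files do include a header row, one option worth trying is the csv filter's autodetect_column_names option, which takes the column names from the first line of each file instead of a fixed list. A minimal sketch (this assumes every file starts with a header row; also note that the docs say pipeline workers must be set to 1 for this option to work reliably, since the detected names are kept as per-worker state):

csv {
  autodetect_column_names => true   # read column names from the first row instead of a fixed list
  separator => ","
  skip_empty_columns => true        # drops the empty trailing field caused by the trailing comma
}

With that in place, the B,C,D,E header defines the columns and 1,2,3,4 is stored as B=1, C=2, D=3, E=4. Depending on the plugin version, you may also need a conditional to drop the header event itself, e.g. if [B] == "B" { drop { } }.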