Headers in CSV File

Hello.

How can I get ELK to ignore the first line of my CSV file (the headers)? I keep getting this error:

[2018-12-18T16:43:18,640][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"delete5", :_type=>"logs", :_routing=>nil}, 2018-12-18T21:43:18.298Z lswpffusapn1u.nam.nsroot.net Date,Year,Month,Client,Execution Account,Source,SharedBook,I/E/H,FirmId,TraderLogin,Trader,Region,City,Exchange,ProductName,AssetClass,CTM,P&L,CFOXType,TradeType,HandleInst,Manual Fill,Strategy,HFT,Gratis,Sales Region,Legal Entity Region,Lots,OrderCount,FillCount,PerOfLots,PerOfOrders,PerOfFills,exchTraderId,execBrokerCode,tradingPlatform,responsiblePerson], :response=>{"index"=>{"_index"=>"delete5", "_type"=>"logs", "_id"=>"AWfDResxZze2tBVL2TrF", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [Date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "Date""}}}}}

This is what my CSV file looks like:

Date,Year,Month,Client,Execution Account,Source,SharedBook,I/E/H,FirmId,TraderLogin,Trader,Region,City,Exchange,ProductName,AssetClass,CTM,P&L,CFOXType,TradeType,HandleInst,Manual Fill,Strategy,HFT,Gratis,Sales Region,Legal Entity Region,Lots,OrderCount,FillCount,PerOfLots,PerOfOrders,PerOfFills,exchTraderId,execBrokerCode,tradingPlatform,responsiblePerson
"2017-09-20 00:00:00.0","2017","9","XXXX","TTTTT","ABC","FGH","E","XYZ","XYZ","XYZ","XYZ","XYZ","XYZ","XYZ-","ABs","GHI-00000","GHI","H","GHI","GHI","false","GHI","false","false","GHI","GHI","10","1","1","0","0","0","GH5555I","SSB","QQQ","PO0090"

I'm trying to use a .config file to upload the data:

input {
  file {
    path => "/tmp/path/fakeMIScsv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Year","Month","Client","Execution Account","Source","SharedBook","I_E_H","FirmId","TraderLogin","Trader","Region","City","Exchange","ProductName","AssetClass","CTM","P_L","CFOXType","TradeType","HandleInst","Manual Fill","Strategy","HFT","Gratis","Sales Region","Legal Entity Region","Lots","OrderCount","FillCount","PerOfLots","PerOfOrders","PerOfFills","exchTraderId","execBrokerCode","tradingPlatform","responsiblePerson"]
  }
  mutate {
    convert => {
      "Lots" => "integer"
      "OrderCount" => "integer"
      "FillCount" => "integer"
      "PerOfLots" => "integer"
      "PerOfOrders" => "integer"
      "PerOfFills" => "integer"
    }
  }
  date {
    target => "Date"
    match => [ "Date", "yyyy-MM-dd HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    hosts => ["abc","xyz","abcd"]
    index => "delete4"
    user => *****
    password => *****
    ssl => true
    ssl_certificate_verification => false
    cacert => '/path/to/cert/ca.crt'
  }
  stdout {}
}

Have a look at https://www.elastic.co/guide/en/logstash/current/plugins-filters-csv.html#plugins-filters-csv-skip_header

Otherwise you need a drop filter inside a conditional.
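A minimal sketch of that approach, assuming the header row gets parsed like any other line so its first field ends up in [Date] as the literal string "Date":

```
filter {
  # if the "Date" column still contains the header text,
  # this is the header row, so drop the whole event
  if [Date] == "Date" {
    drop { }
  }
}
```

This works on any Logstash version, since it only uses the standard conditional syntax and the drop filter.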

Would removing the columns option and adding the autodetect_column_names option work as well? Or is that not recommended?
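That combination would look roughly like this (a sketch, not tested here — and note the assumption that both options are available, which depends on the csv filter version installed):

```
filter {
  csv {
    separator => ","
    # take column names from the first line instead of a hard-coded list
    autodetect_column_names => true
    # skip any subsequent line that matches the detected header
    skip_header => true
  }
}
```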

It works fine 🙂

Hello,
Thank you for your response. I get the following error when I try to use that plugin:

[2018-12-19T10:43:53,193][ERROR][logstash.filters.csv ] Unknown setting 'skip_header' for csv
[2018-12-19T10:43:53,203][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Something is wrong with your configuration."}

What version are you on?
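You can check the exact version of the installed filter from the Logstash home directory with:

```
bin/logstash-plugin list --verbose logstash-filter-csv
```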

I'm on Version: 5.6.12

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.