Thanks for the reply.
My configuration is as follows:
input {
  file {
    # The file input on Windows needs forward slashes in these paths
    path => "C:/Application/logstash/script/data/trx.csv"
    start_position => "beginning"
    sincedb_path => "C:/Application/logstash/trx.read"
  }
}
filter {
  csv {
    separator => ","
    autodetect_column_names => true
    #remove_field => ["host","path"]
  }
  # Note: both date filters write to @timestamp by default (the second
  # overwrites the first), and @timestamp is removed in the mutate below,
  # so these have no net effect unless a target is set.
  date {
    match => [ "TRXDATE", "yyyyMMdd" ]
  }
  date {
    match => [ "TRXTIME", "HHmmss" ]
  }
  mutate {
    remove_field => ["message","path","@version","@timestamp"]
  }
}
output {
  elasticsearch {
    template => "C:/Application/logstash/script/template/trxdetail-template.json"
    template_name => "trx-template"
    template_overwrite => true
    hosts => ["XXX.XXX.XXX:9200"]
    user => "XXXXX"
    password => "XXXXX"
    index => "trxdetail"
    document_type => "trx"
    document_id => "%{TRXDATE}-%{TRXFLOW}-%{TRXFLOWNO}"
  }
}
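To show what the date patterns and the document_id template do with the sample values below, here is a small Python sketch (standalone, not part of Logstash; the field values are copied from the example line, and Python's strptime formats stand in for the Joda patterns "yyyyMMdd" and "HHmmss"):

```python
from datetime import datetime

# Sample field values taken from the example CSV line in this post.
event = {"TRXDATE": "20100902", "TRXTIME": "092900",
         "TRXFLOW": "EXXXXXXXXXXXX", "TRXFLOWNO": "1"}

# "yyyyMMdd" and "HHmmss" correspond to these strptime formats:
trx_date = datetime.strptime(event["TRXDATE"], "%Y%m%d")
trx_time = datetime.strptime(event["TRXTIME"], "%H%M%S").time()
print(trx_date.date(), trx_time)   # 2010-09-02 09:29:00

# document_id => "%{TRXDATE}-%{TRXFLOW}-%{TRXFLOWNO}" expands by
# plain field substitution, giving a deterministic _id per row:
doc_id = "{TRXDATE}-{TRXFLOW}-{TRXFLOWNO}".format(**event)
print(doc_id)   # 20100902-EXXXXXXXXXXXX-1
```

Because the _id is built from the row's own fields, re-running the pipeline over the same file overwrites documents instead of duplicating them.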
An example input line is:
ID,CUSNO,CUSACC,CUSNAME,OPPCUSNO,OPPACC,OPPNAME,OPPACCFLAG,CDFLAG,AMT,TRXDATE,TRXTIME,TRXFLOW,TRXFLOWNO,ACCCODE,ACCNAME,BANKNO,BANKNAME,BRANCHNO,BRANCHNAME,TELLER,TERMINAL,CUSWILL,BUSICODE,BUSINAME,BUSICMMCODE,BUSICMM,BUSITECHCMM
"2017XXXXXXXXXXXXXXXXXXXXX","0115XXXXXXXXXXX","62XXXXXXXXXXXXXXXXXX","userXXXX",(null),"X7380XXXXXXXXXXXXXX","XXXXXXXXXXXXXXXX","inner acct","credit",619.00,20100902,"092900","EXXXXXXXXXXXX",1,"2XXXXXXXXX","TEST ACCOUNT","0XXXXXXXXXXXX","XXXXXXXXXXXX","0XXXXXXXXXXXXXX","XXXXXXXXXXXXXXXXXX","ECXXXXXX",(null),(null),"2TTTTTTTT","POSTRX","AXXXX","","PDXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX 1"
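As a sanity check that the header and the sample line pair up the way `autodetect_column_names => true` would pair them (first line becomes the column names), here is a small Python sketch using the stdlib csv module on the two lines above:

```python
import csv
import io

# Header and sample row copied verbatim from this post.
text = (
    'ID,CUSNO,CUSACC,CUSNAME,OPPCUSNO,OPPACC,OPPNAME,OPPACCFLAG,CDFLAG,AMT,'
    'TRXDATE,TRXTIME,TRXFLOW,TRXFLOWNO,ACCCODE,ACCNAME,BANKNO,BANKNAME,'
    'BRANCHNO,BRANCHNAME,TELLER,TERMINAL,CUSWILL,BUSICODE,BUSINAME,'
    'BUSICMMCODE,BUSICMM,BUSITECHCMM\n'
    '"2017XXXXXXXXXXXXXXXXXXXXX","0115XXXXXXXXXXX","62XXXXXXXXXXXXXXXXXX",'
    '"userXXXX",(null),"X7380XXXXXXXXXXXXXX","XXXXXXXXXXXXXXXX","inner acct",'
    '"credit",619.00,20100902,"092900","EXXXXXXXXXXXX",1,"2XXXXXXXXX",'
    '"TEST ACCOUNT","0XXXXXXXXXXXX","XXXXXXXXXXXX","0XXXXXXXXXXXXXX",'
    '"XXXXXXXXXXXXXXXXXX","ECXXXXXX",(null),(null),"2TTTTTTTT","POSTRX",'
    '"AXXXX","","PDXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'
    'XXXXXXXX 1"\n'
)

# DictReader uses the first line as field names, like the csv filter does.
row = next(csv.DictReader(io.StringIO(text)))
print(len(row))                               # 28 columns, matching the header
print(row["TRXDATE"], row["TRXTIME"])         # 20100902 092900
print(row["OPPCUSNO"])                        # (null)  -- a literal string, not a real null
```

One thing this makes visible: the unquoted `(null)` values come through as the literal string "(null)", so they would be indexed as text unless they are cleaned up (for example with a mutate or ruby filter) before the elasticsearch output.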