Hi, I've spent too much time on this and would really appreciate any help. My CSV file is loading as one record.
My data is CSV:
Number,Category,Assignment group,Technology,Assigned to,Opened,O_Month,Opened Day,Opened Date,Opened Time,Duration,Durn in hrs,Floor,TimeWindow,Closed,C_Month,Priority,Short description,Configuration item,Attribute,Env
INC0403021,Application,IT MS-SQL Admin,MS_SQL,Adam Smith,2019-06-30 23:05:55,Jun,Sunday,06/30/19,11:05 PM,28531,7.9,10:00 PM,10 PM - 12 AM,1/0/1900,Jan,3 - Low,Control-M DS: DMS_XXXsqldba0040_YYYY_DL_XXX_P - RC 1 - Ended not OK 6/30/2019 11:05:08 PM,Database - SQL Server,Backup,Prod
INC0403009,Application,IT MS-SQL Admin,MS_SQL,Adam Smith,2019-06-30 19:33:47,Jun,Sunday,06/30/19,7:33 PM,41202,11.4,6:00 PM,6 PM - 8 PM,1/0/1900,Jan,3 - Low,Control-M DS: DS_XXXsqldba0015_TRANSLOG_BCK_PROD - RC 1 - Ended not OK 6/30/2019 7:13:53 PM,Database - SQL Server,Backup,Prod
This is the Logstash pipeline:
input {
  beats {
    port => 5045
    ssl => true
    ssl_certificate_authorities => ["xxxxxx.pem"]
    ssl_certificate => "/etc/logstash/SSL/logstash_xxxxx.pem"
    ssl_key => "/etc/logstash/SSL/logstash_xxxxx.key"
  }
}
filter {
  if "inc" in [tags] {
    csv {
      columns => ["Number","Category","Assignment group","Technology","Assigned to","Opened","O_Month","Opened Day","Opened Date","Opened Time","Duration","Durn in hrs","Floor","TimeWindow","Closed","C_Month","Priority","Short description","Configuration item","Attribute","Env"]
      skip_header => true
    }
    # The date filter parses the Opened string into a date field so the Kibana chart plots by date
    date {
      match => [ "Opened", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "America/New_York"
      target => "Opened"
    }
  }
}
output {
  if "inc" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      index => "inc-%{+YYYY.MM.dd}"
    }
  }
}