I'm an ELK n00b and would like some assistance setting up data collection from our PBX. It generates reports in CSV format over a socket on TCP port 2001. How do I set up Logstash to connect to that socket and parse the data?
Example Data via Telnet:
NAS DISPOSITION SPREADSHEET,DEMO1,All Correlation Sets are Selected,JSchwartz_Elastic_Logstash_Test,01/27/2016,19:40:00,01/27/2016,19:41:00,ISUP,0,43552,1028,0,0,15,0,487,971,2360,2,0,10,31,0,7,0,5,6,24,1,3,0,326,182,6,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,2,0,85,0,2,11,0,0,491,0,0,0,0,0
Logstash config:
filter {
  csv {
    columns => [
      "Report_Name", "UnitID", "Links", "Filtername",
      "Sdate", "Stime", "Edate", "Etime", "Protocol",
      "Data1", "Data2", "Data3", "Data4", "Data5", "Data6", "Data7", "Data8",
      "Data9", "Data10", "Data11", "Data12", "Data13", "Data14", "Data15", "Data16",
      "Data17", "Data18", "Data19", "Data20", "Data21", "Data22", "Data23", "Data24",
      "Data25", "Data26", "Data27", "Data28", "Data29", "Data30", "Data31", "Data32",
      "Data33", "Data34", "Data35", "Data36", "Data37", "Data38", "Data39", "Data40",
      "Data41", "Data42", "Data43", "Data44", "Data45", "Data46", "Data47", "Data48",
      "Data49", "Data50", "Data51", "Data52", "Data53", "Data54", "Data55", "Data56"
    ]
    separator => ","
    remove_field => ["Filtername"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "teknoadmin_disposition"
  }
  stdout { codec => rubydebug }
}
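What I think is missing is the input section. My guess, based on the tcp input plugin, is something like the sketch below — `mode => "client"` (so Logstash dials out to the PBX instead of listening) and the placeholder hostname are my assumptions:

```
input {
  tcp {
    # Connect out to the PBX's report socket rather than listening.
    mode => "client"
    host => "pbx.example.com"   # assumption: replace with the PBX's address
    port => 2001
    codec => line               # one CSV report per line
  }
}
```

Is that the right approach for a socket that Logstash has to connect into, or is there a better plugin for this?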
Thank you!