Telnet data feed: Logstash into Elasticsearch

I'm a n00b to ELK and would like some assistance setting up data collection from our PBX. It generates reports in CSV format over a TCP socket on port 2001. How do I set up Logstash to connect to the socket and parse the data?

Example Data via Telnet:

NAS DISPOSITION SPREADSHEET,DEMO1,All Correlation Sets are Selected,JSchwartz_Elastic_Logstash_Test,01/27/2016,19:40:00,01/27/2016,19:41:00,ISUP,0,43552,1028,0,0,15,0,487,971,2360,2,0,10,31,0,7,0,5,6,24,1,3,0,326,182,6,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,2,0,85,0,2,11,0,0,491,0,0,0,0,0
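As a quick sanity check on the column mapping (done outside Logstash with plain `awk`, nothing PBX-specific), the sample line splits into 65 comma-separated fields, which matches the 9 header columns plus Data1 through Data56 in the config below:

```shell
line='NAS DISPOSITION SPREADSHEET,DEMO1,All Correlation Sets are Selected,JSchwartz_Elastic_Logstash_Test,01/27/2016,19:40:00,01/27/2016,19:41:00,ISUP,0,43552,1028,0,0,15,0,487,971,2360,2,0,10,31,0,7,0,5,6,24,1,3,0,326,182,6,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,2,0,85,0,2,11,0,0,491,0,0,0,0,0'

# count the fields and print the protocol column (field 9)
printf '%s\n' "$line" | awk -F',' '{print NF, $9}'
# prints: 65 ISUP
```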

Logstash config:

filter {
  csv {
    columns => [
      "Report_Name", "UnitID", "Links", "Filtername",
      "Sdate", "Stime", "Edate", "Etime", "Protocol",
      "Data1", "Data2", "Data3", "Data4", "Data5", "Data6", "Data7", "Data8",
      "Data9", "Data10", "Data11", "Data12", "Data13", "Data14", "Data15", "Data16",
      "Data17", "Data18", "Data19", "Data20", "Data21", "Data22", "Data23", "Data24",
      "Data25", "Data26", "Data27", "Data28", "Data29", "Data30", "Data31", "Data32",
      "Data33", "Data34", "Data35", "Data36", "Data37", "Data38", "Data39", "Data40",
      "Data41", "Data42", "Data43", "Data44", "Data45", "Data46", "Data47", "Data48",
      "Data49", "Data50", "Data51", "Data52", "Data53", "Data54", "Data55", "Data56"
    ]
    separator => ","
    remove_field => ["Filtername"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "teknoadmin_disposition"
  }
  stdout { codec => rubydebug }
}
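Note that the config above has no input section. If the PBX (or some intermediary) were pushing data to Logstash, a minimal listening input might look like the following; port 5000 is an arbitrary choice, not something from the original post:

```conf
input {
  tcp {
    port => 5000        # arbitrary listening port, pick one that is free
    mode => "server"    # Logstash listens and waits for a connection
  }
}
```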

Thank you!

So... the PBX dumps that output to any client connecting to port 2001, then closes the connection? Or what's the exact behavior?

The system has a socket buffer. For example, if you have PuTTY connected, it will keep displaying the data until the connection is dropped. If you let it run for a while without a client connected, it will buffer the data until the next TCP connection happens.

The data always streams, and the report generation interval can be set from 1 minute up to 1 day.

Is the assumption here that you'd need to connect to a socket from LS to read this data? If so, LS can't do that; it can only open a socket to listen on.

Though I think a socket input plugin would be handy; this isn't the first time I have seen this sort of use case.
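One caveat worth checking: depending on your Logstash version, the tcp input plugin may also support a client mode, in which Logstash opens the outbound connection itself rather than listening. If your plugin version has it, something along these lines could dial the PBX directly (the hostname is a placeholder; port 2001 is from the original post; verify that `mode => "client"` exists in your installed logstash-input-tcp before relying on this):

```conf
input {
  tcp {
    host => "pbx.example.com"   # placeholder for your PBX address
    port => 2001
    mode => "client"            # Logstash connects out instead of listening
  }
}
```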

LS does not have to feed a socket. My PBX is the "server": just like an HTTP feed, it accepts any connection and sends the data to the client.

The connection stays open as long as the report writer is generating a report. When you connect to the socket, it sends all the data in the buffer and then keeps sending new data to the client. If the client disconnects, the server buffers until the next client connection is made. The connection never closes until you tell the report writer to stop generating the report, and it is one-to-one: only one client app can connect to that socket while a session is active.

Kinda like a simple TCP chat app, except you are seeing all the raw data via a single socket as a moderator. You cannot provide input; you just listen to the stream of data via an admin socket port.

If needed I can post a YouTube example of how the feed works to better visualize what's going on. Thank you for the support. If I can make this work, it will let me use Kibana to create a management dashboard of report data.

I'm sure others have raw TCP streams from other systems that need to be consumed. Any thoughts on the problem?

So Logstash does not act like netcat: the data has to be pushed to a socket that LS is listening on?

What if a Linux netcat pipe sent the data to Logstash?
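That could work: let netcat hold the client connection to the PBX and pipe its output into Logstash's stdin input, keeping the existing csv filter and elasticsearch output unchanged. A sketch, with a placeholder hostname and a Logstash path that depends on your install:

```conf
# pbx_pipe.conf -- run as:
#   nc <pbx-host> 2001 | bin/logstash -f pbx_pipe.conf
input { stdin { } }
# ...then the filter {} and output {} sections from the config above
```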