Logstash TCP Client configuration

I'm looking for a better example of connecting to a TCP server and saving the results of a CSV stream to a file or directly into Elasticsearch. Does anyone have a good example to work with?

Sample of CSV Data:
1ADLAA1,GSM,1458229111.998,I,SAI ,I,003.030.004,007.128.007, 529R ,BN900002,00,3502, 140445570240, 5119550011305718, , ,007,006,000,000,09,ITU Transaction Begin,ITU Completion Invoke,

LS won't connect to a TCP server; it can only listen on a port.

I found the solution to my own problem. I was able to connect to a socket and collect the data from the system. Here is my config:

input {
  tcp {
    mode => "client"
    host => "192.168.25.93"
    port => 5001
  }
}

filter {
  csv {
    columns => ['unit','prot','time','dir','msg','dir msu','opc','dpc','link','linkset','slc','corr','ogt','dgt','cgpc','cdpc','cgssn','cdssn','cgtt','cdtt','sccpmsg','package','component']
    #convert => { "column1" => "string" }
    separator => ","
  }
  # additional filters can go here
}

#output { stdout { codec => rubydebug } }
output {
  elasticsearch {
    # note: Elasticsearch index names must be lowercase
    index => "peg%{IndexType}-%{+YYYY.MM.dd}"
  }
}
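As a sanity check on the column list, splitting the sample line shows that the trailing comma produces one extra empty field beyond the 23 named columns. A quick Python sketch (outside Logstash, just to verify the mapping):

```python
# The sample line from the stream, trailing comma included
line = ("1ADLAA1,GSM,1458229111.998,I,SAI ,I,003.030.004,007.128.007,"
        " 529R ,BN900002,00,3502, 140445570240, 5119550011305718, , ,"
        "007,006,000,000,09,ITU Transaction Begin,ITU Completion Invoke,")

# Same column names as in the csv filter above
columns = ['unit','prot','time','dir','msg','dir msu','opc','dpc','link',
           'linkset','slc','corr','ogt','dgt','cgpc','cdpc','cgssn','cdssn',
           'cgtt','cdtt','sccpmsg','package','component']

fields = line.split(",")
print(len(columns), len(fields))  # the split yields one trailing empty field

# Pair columns with trimmed values; zip drops the trailing empty field
record = dict(zip(columns, (f.strip() for f in fields)))
print(record["opc"], record["dpc"])
```

The csv filter will likewise attach the unnamed trailing field to the event (as an auto-named column), which is harmless but worth knowing when you look at the documents in Elasticsearch.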

then run it with:

/opt/logstash/bin/logstash -f stub.conf

and once it connects to the server, the incoming CSV lines will flow through the filters.
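If you want to test the client config without the real feed, a throwaway TCP server that accepts one connection, sends the sample line, and closes can stand in for it. A minimal Python sketch (the port matches the config above; the helper name `serve_once` is my own):

```python
import socket

# The sample CSV line, newline-terminated so the tcp input emits one event
SAMPLE = (b"1ADLAA1,GSM,1458229111.998,I,SAI ,I,003.030.004,007.128.007,"
          b" 529R ,BN900002,00,3502, 140445570240, 5119550011305718, , ,"
          b"007,006,000,000,09,ITU Transaction Begin,ITU Completion Invoke,\n")

def serve_once(host="0.0.0.0", port=5001, payload=SAMPLE):
    """Accept one connection (e.g. the Logstash tcp client) and stream the payload."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            conn.sendall(payload)

if __name__ == "__main__":
    serve_once()
```

Start this first, then start Logstash with the config pointed at the machine running it; the event should show up in the output within a few seconds.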