I have files in GCS; each file contains JSON lines.
Example file:
{"msg": "hello world", "level": "info", "timestamp": "2017-09-01T00:00"}
{"msg": "some error", "level": "error", "timestamp": "2017-09-02T00:00"}
{"msg": "success", "level": "info", "timestamp": "2017-09-03T00:00"}
...
I am iterating over all the files in GCS, downloading each one, and I'm trying to send its contents to Logstash so that each log line ends up as a document in Elasticsearch.
here is my logstash conf:
input {
  tcp {
    port => 5000
  }
}
output {
  elasticsearch {
    hosts => "https://xxxxxxxxxxx.us-central1.gcp.cloud.es.io:9243"
    user => "elastic"
    password => "xxxxxxx"
    index => "app-log-%{+YYYY.MM.dd}"
  }
}
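If TCP is the right input, I suspect I'd also need a codec so each line is parsed as JSON rather than stored as a raw string. A sketch of what I mean (using `json_lines`, which splits the stream on newlines and parses each line as JSON — whether this is the right codec here is my assumption):

```
input {
  tcp {
    port => 5000
    codec => json_lines
  }
}
```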
Now, how do I send the file contents to Logstash?
Should I use a different input than tcp — maybe Filebeat?
Is there any demo code in Node.js (or another language) for sending the files?