New ES user. Got the ELK stack built. Now I'm importing just north of 100 GB of log files, from 2014.
They are, of course, gzip'd.
Can I do this with the 'file' input? Or do I need to throw it at ES with curl?
Or use input stdin:
zcat file | bin/logstash -f logstash.conf
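For the pipe above to work, `logstash.conf` needs an input block that reads from stdin. A minimal sketch of such a config (the elasticsearch hosts value is an assumption for illustration, not from this thread):

```
# logstash.conf — minimal sketch for the zcat-pipe approach.
input {
  stdin { }              # read uncompressed lines piped in by zcat
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # adjust to your cluster
  }
}
```

Since zcat accepts multiple arguments, a shell glob lets one Logstash run consume the whole archive, e.g. `zcat logs-2014/*.gz | bin/logstash -f logstash.conf` (directory name hypothetical).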
Thank you, David.