Why is it so hard to just grab a file and upload to Elasticsearch?

So, I was wondering about this today: why is it so hard to just grab a log file and punch it into Elasticsearch? I understand needing to set up Filebeat listeners, other Beats, some Logstash pipelines and so on... but what if you just have a really big Apache log, or a huge CSV file, that you simply want to take a quick look at in Elastic?

I found one "easy" way to do this with .evtx files, but for everything else... nope. Tips? Suggestions? Pointers?

Have you tried the File Import Wizard in the Data Visualizer?

It's a "Basic Feature" under Machine Learning...

It's not made for huge files, but it's a quick way to get some data in and take a look...



I have not! But sadly that's not an option for me at work: the DLP solution here prevents any file from being uploaded via the browser, even if the destination is localhost:5601...

I finally got people to configure the DLP solution the right way and was able to use the File Import Wizard! Awesome, @stephenb! Thanks a lot!

Is there a way around the 100MB limit?

Another way to ingest data is to use one of the language clients for Elasticsearch. Granted, you'll need to write some code, but it is a flexible approach.


Yeah... I'm not good at programming in general, though I can do some bash/batch =p I was trying to write a bash script using tshark + the _bulk API to load some pcaps, but for some reason, after I deleted the packt-* index the first time, nothing would show up no matter how many times I tried again, or how long curl spent trying to -XPUT my JSON files into Elasticsearch...

((edit)) Kudos to me: I'm working on Windows and forgot that tshark would write packets.json with CRLF line endings instead of just LF. ((/edit))
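For anyone who hits the same wall: the bulk endpoint expects newline-delimited JSON terminated by plain LF, so stripping the carriage returns is enough. A tiny sketch of the fix (the sample content and file names are placeholders standing in for tshark's actual output, e.g. from `tshark -T ek`):

```shell
# Stand-in for tshark output produced on Windows, with CRLF line endings:
printf '{"index":{}}\r\n{"proto":"tcp"}\r\n' > packets.json

# Strip the carriage returns so the file is plain-LF NDJSON:
tr -d '\r' < packets.json > packets.lf.json

# packets.lf.json can now be sent to the bulk endpoint, e.g.:
#   curl -H 'Content-Type: application/x-ndjson' \
#        -XPOST 'localhost:9200/_bulk' --data-binary @packets.lf.json
```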

Oh, I was following this article, btw!

I described a recipe using Logstash here: https://www.elastic.co/blog/enriching-your-postal-addresses-with-the-elastic-stack-part-1

You can also use Filebeat and an ingest node pipeline with this plugin


David, I have to say it: I loved that post. LOVED it. The way you start slowly and build things up, tearing them apart and explaining what's going on and why you built each piece the way you did, is just AMAZING. I wish more people would follow this approach!

I've tried using Logstash a couple of times, but even following the documentation was kind of hard for me, when the configuration files just appear out of nowhere and BAM, do this! I'm always left wondering: why? What is each of those parts doing? I don't know if there's something wrong with my head, but it's really hard for me. Your post, on the other hand... BEAUTIFUL. Especially the "Writing the Logstash Pipeline" part.

((edit)) I had no idea until now that I could cat a file and pipe it into Logstash to load it into Elasticsearch!!! :scream: ((/edit))
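A minimal sketch of that cat-into-Logstash trick, assuming a simple CSV (the file name, column names, and index name here are all made-up placeholders):

```conf
# pipeline.conf — read from stdin, parse CSV, write to Elasticsearch
input  { stdin { } }
filter { csv { columns => ["ip", "timestamp", "status"] } }
output { elasticsearch { hosts => ["http://localhost:9200"] index => "mydata" } }
```

Then something like `cat access.csv | bin/logstash -f pipeline.conf` should stream the whole file in, no Beats required.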

Thank you so very much!

