How to store and search file content using Elasticsearch

My requirement is an FAQ in my application where I need to upload FAQ documents and later search and find these documents using keywords in a search column. I would like to know how to store and search file content in Elasticsearch. Currently I am using Firebase and Node.js for my application. I have created an account on your hosted Elasticsearch. I couldn't find any documentation or proper instructions on your page for implementing file upload and search. Kindly provide a solution.

There's no file upload in elasticsearch.

You can use the ingest attachment plugin.
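Note that the ingest attachment plugin is not bundled with Elasticsearch by default; installing it could look like this (assuming a standard installation layout — the node needs a restart afterwards):

```
# Run from the Elasticsearch home directory, then restart the node.
bin/elasticsearch-plugin install ingest-attachment
```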

There's an example here:

PUT _ingest/pipeline/attachment
{
  "description" : "Extract attachment information",
  "processors" : [
    {
      "attachment" : {
        "field" : "data"
      }
    }
  ]
}

PUT my_index/_doc/my_id?pipeline=attachment
{
  "data": "e1xydGYxXGFuc2kNCkxvcmVtIGlwc3VtIGRvbG9yIHNpdCBhbWV0DQpccGFyIH0="
}

GET my_index/_doc/my_id

The data field is simply the Base64 representation of your binary file.

You can also use FSCrawler. There's a tutorial to help you get started, and FSCrawler exposes a REST service which can be used for file upload.
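For reference, uploading a file through FSCrawler's REST service could look like this (assuming the default endpoint on port 8080 — check the FSCrawler documentation for your version):

```
# POST a document to FSCrawler's REST upload endpoint (default address shown).
curl -F "file=@faq-document.pdf" "http://127.0.0.1:8080/fscrawler/_upload"
```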

In the above explanation the file can be retrieved by its ID. Can we also retrieve the file by its content?

Elasticsearch has a huge range of search queries:
Hope you find what you are looking for.
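For example, with the ingest attachment pipeline above, the extracted text lands in the `attachment.content` field, so a full-text search over the uploaded files could be (index name is a placeholder):

```
GET my_index/_search
{
  "query": {
    "match": {
      "attachment.content": "lorem ipsum"
    }
  }
}
```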

Is it possible to use Logstash to search files stored in Firebase? If yes, please guide me on the pipeline structure and share a documentation link, as I am a little confused by it.

Logstash is not used for searching. It is used to insert data into Elasticsearch. You use Elasticsearch for searching. You can send search requests through Kibana, Postman, or curl.

So by using Logstash I can insert files from Firebase and later search them in Elasticsearch?


Logstash uses a pipeline structure, right? Can you please provide the implementation guide for Logstash and a valid pipeline structure?
Here is the Logstash documentation:
What a pipeline can look like:

input {
  elasticsearch {
    hosts => "localhost"
    index => "Index"
  }
}

filter {
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [""]
    document_type => "indexName"
    index => "IndexName"
  }
}
Good luck

thank you

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.