Receiving data via HTTPS or Ingest Pipelines


I've been searching on the forums and haven't been able to find an answer to the problem I'm facing.

The issue I'm facing:
I am trying to send data via a REST API to an Elastic Cloud instance. I'm sure I've seen this work before at a previous company, but I didn't set it up and it was on an older version of Elasticsearch. The data being sent will be in JSON format. I have looked at the documentation for Watcher HTTP and also for Ingest Pipelines, but neither makes it clear whether it is what I need.

My question is: what is the correct way to send JSON data via a RESTful API to Elasticsearch? If anyone can point me towards the documentation or a guide, that would be appreciated. Our cluster is on the latest version of Elasticsearch.

Thank you!

Hi @sc12, welcome to the community and thanks for using Elastic Cloud.

First, just to get it off the table: Watcher is not the proper method to ingest data; it is used to alert on data. Let's come back to that after you get some data flowing.

2nd: Yes, Elasticsearch is just a REST HTTPS endpoint that is used to ingest data into, or query data from, Elasticsearch.

3rd: Ingest pipelines may or may not be needed. Think of them as the Transform part of ETL: if you want to parse, normalize, or enrich the incoming data, you can do it there.
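As a rough sketch of what "transform on the way in" looks like (the pipeline name and field names here are made up for illustration), an ingest pipeline defined in Kibana Dev Tools might be:

```
PUT _ingest/pipeline/my-example-pipeline
{
  "description": "Example transform: normalize a field and stamp the ingest time",
  "processors": [
    { "lowercase": { "field": "some_field" } },
    { "set": { "field": "ingested_at", "value": "{{_ingest.timestamp}}" } }
  ]
}
```

Documents indexed with `?pipeline=my-example-pipeline` would then pass through those processors before being stored.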

Finally there are many ways to send data to Elasticsearch because of point 2 above.

There are many prebuilt language clients; see here.

And there are many different tools that have native connectors to Elasticsearch including Elastic's own Beats, Elastic Agent and Logstash.

And you can use Postman... or even curl to test.
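For example, a minimal curl test against a hypothetical Elastic Cloud endpoint (the URL, credentials, and index name below are all placeholders you would replace with your own) could look like:

```
curl -u elastic:<password> \
  -X POST "https://<your-deployment>.es.io:9243/test-index/_doc?pretty" \
  -H "Content-Type: application/json" \
  -d '{"message": "hello from curl"}'
```

A successful response returns the generated document `_id` and creates the index if it does not exist yet.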

Perhaps if you told us a bit more about the source and type of the data, and what you want to accomplish, we might be able to provide some suggestions.

Thank you @stephenb for the clear explanation.

My knowledge of Elasticsearch is modest at best, but I understand what you mean. My initial understanding was that our developers could send a JSON file with data to an endpoint, and we could process it in Elasticsearch and build visualizations. They are using an HTTP sender and the request body is JSON.

So in theory here is what they will be pushing to us:

```
{
  "SupplierID": "Supplier name",
  "TransKey": "null/Md5",
  "Type": "CSV",
  "InterfaceName": "InBound/outbound",
  "ChannelName": "ChannelName",
  "Tags": { "tag1": "val1", "tag2": "val2" },
  "LogMsg": "Actual log message"
}
```


Yes, and you will want to create a mapping and index template so the fields are stored and treated the way you want. It looks like you have mostly keyword and/or text data.

You can literally just POST that using the document API, and it will create an index; you can then take a look at the default mapping, which you will want to clean up.
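A minimal sketch of that POST in Kibana Dev Tools, using your sample document (the index name `transfer-logs` is just an example):

```
POST transfer-logs/_doc
{
  "SupplierID": "Supplier name",
  "TransKey": "null/Md5",
  "Type": "CSV",
  "InterfaceName": "InBound/outbound",
  "ChannelName": "ChannelName",
  "Tags": { "tag1": "val1", "tag2": "val2" },
  "LogMsg": "Actual log message"
}
```

Afterwards, `GET transfer-logs/_mapping` shows the mapping Elasticsearch inferred, which you can use as a starting point for your own.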

You can create the template and mapping via the API or through Kibana -> Stack Management -> Index Management.

That will be a pretty simple template and mapping.

The template defines a matching index pattern and sets the index settings, and the mapping sets the field types... you can do it all in one request.
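A hedged sketch of doing both in one request (the template name, index pattern, and field type choices are assumptions based on the sample document above; adjust to taste):

```
PUT _index_template/transfer-logs-template
{
  "index_patterns": ["transfer-logs*"],
  "template": {
    "settings": { "number_of_shards": 1 },
    "mappings": {
      "properties": {
        "SupplierID":    { "type": "keyword" },
        "TransKey":      { "type": "keyword" },
        "Type":          { "type": "keyword" },
        "InterfaceName": { "type": "keyword" },
        "ChannelName":   { "type": "keyword" },
        "Tags":          { "type": "object" },
        "LogMsg":        { "type": "text" }
      }
    }
  }
}
```

Any index whose name matches `transfer-logs*` created after this will pick up these settings and mappings automatically.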

Then, after you get that right, you will want to think about how long you want to store the data... have a look at Index Lifecycle Management (ILM).
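For instance, a simple ILM policy that deletes data after 30 days might look like this (the policy name and retention period are placeholders, not a recommendation):

```
PUT _ilm/policy/transfer-logs-policy
{
  "policy": {
    "phases": {
      "hot":    { "actions": {} },
      "delete": { "min_age": "30d", "actions": { "delete": {} } }
    }
  }
}
```

You would then reference the policy from the index template's settings so new indices pick it up.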

I would start simple and work your way through.


Thanks for sending these through. I've been working through this and have created an index with the relevant mappings. In terms of how to then send the data, that's where I'm now stumped: I can't install Filebeat/Logstash on the server, so the only way I can send the data out is via an API. In my head I know what it looks like, but I can't for the life of me find it in the documentation or settings.

Here are the raw REST APIs

The Document APIs are for inserting single or bulk documents.
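A sketch of both shapes (the index name is an example; the bulk body is newline-delimited JSON, with an action line before each document):

```
# Single document
POST transfer-logs/_doc
{ "SupplierID": "Supplier name", "LogMsg": "first message" }

# Multiple documents in one request
POST _bulk
{ "index": { "_index": "transfer-logs" } }
{ "SupplierID": "Supplier name", "LogMsg": "second message" }
{ "index": { "_index": "transfer-logs" } }
{ "SupplierID": "Supplier name", "LogMsg": "third message" }
```

For any real volume of data, the bulk endpoint is much more efficient than one request per document.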

Of course, if you are a Python or Java (etc.) person, there are language clients; I posted the link above.

Or use Postman or curl etc... anything that can talk REST.


Appreciate the help @stephenb this has answered my questions and given me much food for thought (and knowledge)!