The best way to achieve CSV imports?


(Nick Harper) #1

Hi

I currently use Sphinx and import hundreds of product feeds into this.

I am now trying to get this working with Elasticsearch, but I need to figure out a couple of things:

  1. I have mapped the product ID to the Elasticsearch document ID. I assume that going forward an import will just update the existing record instead of creating a new one?
  2. How do I remove old products that are no longer in the feeds? I was thinking of adding an "updated" field set to the current date and time whenever a product is added or updated, then using some sort of cron job to remove anything older than a week.
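For point 2, one option would be Elasticsearch's delete_by_query API with a range filter on that "updated" field. A minimal sketch of the request body (the field name "updated" and index name "products" are my assumptions, not fixed names):

```python
import json

def stale_products_query(max_age_days=7):
    # delete_by_query body matching documents whose "updated" field
    # (assumed name) is older than max_age_days, using ES date math.
    return {
        "query": {
            "range": {
                "updated": {"lt": f"now-{max_age_days}d"}
            }
        }
    }

print(json.dumps(stale_products_query()))
# A cron job could send this as:
#   POST /products/_delete_by_query   (index name "products" is an assumption)
```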

At the moment I am testing Embulk to import the CSV files, following this method: http://www.embulk.org/docs/recipe/scheduled-csv-load-to-elasticsearch-kibana5.html

Any pointers would be great.


(Mark Walkom) #2
  1. Yep.
  2. Or have an "active" field to save you having to remove things; it also lets you see product history.
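One way to flip that flag after each import run is update_by_query with a small script. A sketch of the request body, assuming an "active" boolean and the same "updated" timestamp field (both names are assumptions):

```python
import json

def deactivate_stale_query(import_started_at):
    # update_by_query body: set "active" (assumed field) to false for
    # products whose "updated" timestamp predates this import run.
    return {
        "script": {
            "source": "ctx._source.active = false",
            "lang": "painless",
        },
        "query": {
            "range": {"updated": {"lt": import_started_at}}
        },
    }

print(json.dumps(deactivate_stale_query("2024-01-01T00:00:00Z")))
# Sent as: POST /products/_update_by_query  (index name is an assumption)
```

Searches would then filter on active:true instead of relying on deletion.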

#3

You can also use Logstash to import CSV data into Elasticsearch.
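For example, a minimal Logstash pipeline might look like this (the file path, column names, and index name are placeholders you would change for your feeds):

```conf
input {
  file {
    path => "/path/to/products.csv"      # assumed path
    start_position => "beginning"
    sincedb_path => "/dev/null"          # re-read the file on each run
  }
}
filter {
  csv {
    separator => ","
    columns => ["product_id", "name", "price"]   # assumed column names
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "products"
    document_id => "%{product_id}"   # maps the product ID to _id, as in point 1
  }
}
```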


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.