Import old postgres data to elasticsearch


(Luvpreet Singh) #1

I have a lot of data in my postgres database (on a remote server). It is the data of the past year, and I want to push it to elasticsearch now.

The data has a time field in this format: 2016-09-07 19:26:36.817039+00.

I want this to become the time field (@timestamp) in elasticsearch, so that I can view it in kibana and build visualizations over the last year.

I need help on how to push all this data efficiently. I cannot figure out how to get all of it out of postgres.

I know we can ingest data via the jdbc input plugin, but I think I cannot create my @timestamp field with that.

I also know about zombodb, but I am not sure whether it lets me supply my own time field.

Also, the dataset is large, so I am looking for an efficient solution.

Suggestions on how I can do this are welcome.


(David Pilato) #2

What do you mean by jdbc "plugin"?


(Luvpreet Singh) #3

Read this (the Logstash jdbc input plugin documentation).


(David Pilato) #4

So it's a logstash question. I moved your thread to #logstash.

Not sure why:

  • you think you need a @timestamp field
  • you think you can't get it from the db

The Logstash date filter plugin should probably help here.
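
For example, something like this (untested; your_table, created_at, the index name, and the connection details are placeholders you would replace) reads rows with the jdbc input and sets @timestamp from your postgres column:

  input {
    jdbc {
      jdbc_connection_string => "jdbc:postgresql://your.host:5432/yourdb"
      jdbc_user => "user"
      jdbc_password => "password"
      jdbc_driver_library => "/path/to/postgresql-jdbc.jar"
      jdbc_driver_class => "org.postgresql.Driver"
      statement => "SELECT * FROM your_table"
    }
  }

  filter {
    # Parse the postgres timestamp (e.g. 2016-09-07 19:26:36.817039+00)
    # into @timestamp. The fractional-seconds/offset pattern may need
    # tweaking, and if the jdbc input already returns the column as a
    # timestamp object you may not need this filter at all.
    date {
      match => ["created_at", "yyyy-MM-dd HH:mm:ss.SSSSSSZ"]
      target => "@timestamp"
    }
  }

  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "postgres-archive"
    }
  }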


(Luvpreet Singh) #5

Actually, I found a simple (not the best) solution.

I came to know that elasticsearch accepts Python datetime values (the client serializes them to ISO 8601 dates).

I used the Python client for elasticsearch, and the psycopg2 library to fetch the data from postgres.

There were around 40 columns and 55,000 rows of data. Fetching and writing all of it took almost 3 minutes. The solution was simple and straightforward.
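
For anyone else who needs this, here is a minimal sketch of the approach (the connection details, your_table, the created_at column, and the index name are placeholders; older elasticsearch versions may also require a _type in each action):

  import psycopg2
  from elasticsearch import Elasticsearch
  from elasticsearch.helpers import bulk

  # Placeholder connection details -- replace with your own.
  pg = psycopg2.connect(host="your.db.host", dbname="yourdb",
                        user="user", password="password")
  es = Elasticsearch(["http://localhost:9200"])

  def generate_actions():
      # A named (server-side) cursor streams rows instead of loading
      # the whole table into memory.
      with pg.cursor(name="export") as cur:
          cur.execute("SELECT * FROM your_table")  # placeholder table
          columns = None
          for row in cur:
              if columns is None:
                  # description is only populated once rows come back
                  columns = [col[0] for col in cur.description]
              doc = dict(zip(columns, row))
              # psycopg2 returns timestamptz columns as Python datetimes;
              # the client serializes them to ISO 8601, which elasticsearch
              # indexes as a date.
              doc["@timestamp"] = doc.pop("created_at")  # placeholder column
              yield {"_index": "postgres-archive", "_source": doc}

  # helpers.bulk batches the documents into _bulk requests for efficiency.
  success, _ = bulk(es, generate_actions())
  print("indexed %d documents" % success)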


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.