Is there a way to bulk upload without Logstash? Tried elasticsearch-py and stuck

All,

Is there a way to bulk upload without Logstash? For example, let's say it would be impossible for me to set up a Logstash server, but I had the ability to run an AWS Lambda function to push to hosted Elasticsearch?

I am trying the following but am completely stuck:

from elasticsearch import Elasticsearch
from elasticsearch import helpers
import json
from datetime import datetime
import cx_Oracle

# Connect to Oracle and confirm the connection works
con = cx_Oracle.connect('user/password@host/service')
print(con.version)

# Pull the rows created in the last 10 minutes
sql = "select field1, field2 from table_a where os.datetimestamp_created >= sysdate - interval '10' minute"

cur = con.cursor()
cur.execute(sql)
columns = [column[0] for column in cur.description]

print(columns)
results = []

# Turn each row into a dict keyed by column name
for row in cur.fetchall():
    results.append(dict(zip(columns, row)))

print(results)

# Client for the hosted Elasticsearch cluster
es = Elasticsearch(
    ['https://eshost.us-east-1.aws.found.io:9243'],
    http_auth=('elastic', 'password')
)

# This is where I'm stuck -- how do I bulk index `results` into Elasticsearch?

cur.close()
con.close()

I'd appreciate any help; I'm stuck. Can Lambda be used? If not, why not? What alternatives do I have?

Hey,

Yes, it is possible. You might want to take a look at this blog post and the tools used in there.
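
In the meantime, here is a rough sketch of how the bulk helper from elasticsearch-py could pick up where your script stops. The index name my-oracle-index is just a placeholder, and I'm assuming `results` is the list of dicts you already build from the Oracle cursor; adjust both to your own mapping.

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch(
    ['https://eshost.us-east-1.aws.found.io:9243'],
    http_auth=('elastic', 'password')
)

def generate_actions(rows):
    # One action per row; the bulk helper reads the target index from each action.
    # 'my-oracle-index' is a placeholder index name -- use your own.
    # On 6.x and earlier clusters you would also need a '_type' field here.
    for row in rows:
        yield {
            '_index': 'my-oracle-index',
            '_source': row,
        }

# 'results' is the list of dicts built from the Oracle cursor in your script above
success, errors = helpers.bulk(es, generate_actions(results), raise_on_error=False)
print('%d documents indexed, %d errors' % (success, len(errors)))

The same code can run inside an AWS Lambda function: put the Oracle query and the helpers.bulk() call inside your handler, bundle the elasticsearch and cx_Oracle dependencies in the deployment package, and make sure the function has network access to both the database and the cluster endpoint.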

--Alex
