Has anyone tried outputting log data to PostgreSQL?
Right now my output is Elasticsearch, and I want every event that is output to Elasticsearch to also be output to PostgreSQL.
Have you looked at the logstash-output-jdbc plugin?
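If you go that route, installation is typically bin/logstash-plugin install logstash-output-jdbc, and you also need the PostgreSQL JDBC driver jar available on the Logstash machine. A minimal sketch of what the output might look like (untested; the connection details, table name, and driver jar path below are placeholders, and option names can vary by plugin version):

output {
  jdbc {
    # Placeholder connection details; adjust for your environment.
    connection_string => "jdbc:postgresql://localhost:5432/mydb?user=postgres&password=secret"
    # Path to the PostgreSQL JDBC driver jar you downloaded (placeholder path).
    driver_jar_path => "/opt/postgresql-jdbc.jar"
    driver_class => "org.postgresql.Driver"
    # The statement array maps event fields onto the ? placeholders in order.
    statement => [ "INSERT INTO logs (message, received_at) VALUES(?, ?)", "message", "@timestamp" ]
  }
}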
Not yet. Gonna look into it. Thank you @magnusbaeck!
Hello @magnusbaeck. Have you tried using this plugin? I have some questions about it. Thank you.
I haven't used it but maybe I or someone else can help out anyway.
I already tried the plugin, but I'm getting null values in my PostgreSQL DB.
output {
  elasticsearch {
    hosts => ["192.168.200.254:9200"]
    index => "awsph_logstash-%{+YYYY.MM.dd}"
  }
  jdbc {
    connection_string => 'jdbc:postgresql://192.168.200.2:5432/Sample?user=postgres&password=Welcome123'
    statement => [ "INSERT INTO logstashdata (index_id, index_name, doc_type) VALUES(?, ?, ?)", "_id", "_index", "_type" ]
  }
  stdout {
    codec => rubydebug
  }
}
I think the problem is that I can't get the values of fields like the id, the index, and the doc type.
Is there a way to pre-declare the data and turn it into (maybe) a variable, so that I can use it in my INSERT statement?
Or is there any workaround for this that you may know of?
Thank you!
I tried using the data inside the "_source" array and that works. But the _id, _type, and _index are still not working.
Still need help with this.
But the _id, _type, and _index are still not working.
Correct, because events don't have any such fields (unless you add them yourself).
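One way to add them yourself: generate an id with the uuid filter, use it as the document_id in the elasticsearch output, and reference the same field in the jdbc statement; the index name can be "pre-declared" as a regular field with mutate. A rough sketch (untested; the field names doc_id and doc_index are made up for this example):

filter {
  # Generate our own id so Logstash knows it before the event is sent anywhere.
  uuid {
    target => "doc_id"
  }
  mutate {
    # Pre-declare the index name as a regular event field.
    add_field => { "doc_index" => "awsph_logstash-%{+YYYY.MM.dd}" }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.200.254:9200"]
    index => "%{doc_index}"
    document_id => "%{doc_id}"
  }
  jdbc {
    connection_string => 'jdbc:postgresql://192.168.200.2:5432/Sample?user=postgres&password=Welcome123'
    statement => [ "INSERT INTO logstashdata (index_id, index_name) VALUES(?, ?)", "doc_id", "doc_index" ]
  }
}

That way both Elasticsearch and PostgreSQL see the same id and index name, because both come from fields you set on the event itself.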
Thanks for your reply @magnusbaeck!
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.