View Query in PostgreSQL agent in dashboard

Hi community
I need to display in a dashboard the results of two specific queries against the databases of all the agents I receive data from (this project, among other things, is why I got involved with Elastic).
I have set up the default PostgreSQL integration, but I don't know if that is the correct approach, since I understand that Elasticsearch cannot read data from PostgreSQL directly, and I need to run these two queries to obtain the data:

select * from cc_usertoinvoiceview;
select * from cc_version;

How do you recommend I do this?
Thanks!

Hey

I'm not sure you are asking the right question or using the right tool.

Are you saying that you want to fetch the content of 2 tables into Elasticsearch?
If so, I don't see the relationship with "agents" which are here to collect logs and metrics...

Maybe you want to use the PostgreSQL connector?

Or you can also send your data from your application layer to Elasticsearch within the same transaction where you are sending the data to PostgreSQL.

Could you tell us more about what you are looking for?

Hi @dadoonet, and thanks for the reply.

I didn't explain myself well, sorry, I'll do it again.
(All the machines that I am going to detail below are managed by me)

  1. I have a server running Elasticsearch + Kibana.
  2. Several remote agents/machines in production, connected to the Fleet Server on that same server.
  3. Each of those servers runs PostgreSQL.
  4. I need to run two queries against the databases on all those servers and display the results in a Kibana dashboard.
  5. As I said, I need to "paint" the data from those queries graphically on a dashboard.

I hope I have explained myself better, I attached a diagram.
Thank you.

I still don't see how "select * from cc_usertoinvoiceview;" relates to Elastic Fleet Server...
Maybe someone else has a better understanding than me and could help.

@dadoonet It is an example of one of the queries I want to run against the PostgreSQL databases on my agents/remote machines.
What I want is to run a specific query against those databases, get the results back, and have that data displayed on a dashboard.
I understand that you might not follow me... that's fair.
The same as with any integration, but with specific queries.

For example, these integrations are meant to visualize metrics and data from the databases of the remote agents/servers connected to my Elastic deployment.

(image attached)

Collecting PostgreSQL logs is not related to collecting the content of the database itself.
That's why it's a different use case and a different tool.

In short:

  • If you need logs or metrics (because you are a DBA, for example), use Elastic Agent/Fleet with the PostgreSQL integration.
  • If you need to collect data from a database (because you are an end user or a developer), use the PostgreSQL connector, or build your own tooling in the application that writes the data to the database.

2 different ways for 2 different use cases.


Once again thanks for the response.
I need exactly what I mentioned in my first post.
Perform two specific queries against those databases, which return the result, and display that result on a dashboard.
To give an example, these databases store, in one of their tables, the number of users connected to the application that uses the database.
By running a query against that database (just as an example; I'm not saying it is exactly select * from cc_usertoinvoiceview;), I get the number of connected users as a result.
I simply want to view that data in a dashboard in Kibana.
Thank you.

So you need to run, every X minutes/seconds, a script which fetches that information from the database.
Then build a json document and send it to Elasticsearch.
Then you can visualize that information in Kibana.

To perform the first steps, you can:

  • write your own code / script
  • use Logstash
  • use the PostgreSQL connector
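The first option (your own script) can be sketched in a few lines. The transformation step is shown below with sample data standing in for a real query result; the table, column names, and values are made up for illustration, and the actual database call (e.g. via a PostgreSQL client library such as psycopg2) is only indicated in a comment:

```python
import json
from datetime import datetime, timezone

def rows_to_docs(columns, rows):
    """Turn query result rows into JSON-ready documents,
    stamping each with the collection time."""
    now = datetime.now(timezone.utc).isoformat()
    return [dict(zip(columns, row), **{"@timestamp": now}) for row in rows]

# Sample data standing in for a real query result; in the actual
# script the rows would come from the database, for example:
#   cur.execute("SELECT * FROM cc_version")   # via psycopg2
columns = ["version", "released"]
rows = [("9.5", "2023-01-01")]

docs = rows_to_docs(columns, rows)
print(json.dumps(docs[0]))
```

Schedule the script with cron (or similar) at the interval you need, and send each batch of documents to Elasticsearch.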

Once again @dadoonet , thanks for the help.
I understand that it is not possible to execute the query from the PostgreSQL integration itself, right? Is that option not viable?
On the other hand, given your experience, of options 2 and 3 that you gave me, which one do you recommend most?
Thank you.

My own preference is: 1, 3, 2.


As extra information and another contribution to the possible solution.

Some other ideas:

Send your existing data to Elasticsearch. That means:

  • Read from the database (SELECT * FROM table)
  • Convert each record to a JSON document
  • Send the JSON documents to Elasticsearch, preferably using the _bulk API

Logstash can help with that. But I'd recommend modifying the application layer if possible and sending data to Elasticsearch in the same "transaction" as you are sending your data to the database.
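The _bulk API expects an NDJSON body: one action line followed by one source line per document, with a trailing newline. A minimal sketch of building such a payload (the index name "pg-usage" and the sample documents are made up for illustration):

```python
import json

def bulk_body(index, docs):
    """Build an NDJSON payload for the Elasticsearch _bulk API:
    one action line, then one source line, per document."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # _bulk requires a trailing newline

docs = [{"connected_users": 42}, {"connected_users": 7}]
body = bulk_body("pg-usage", docs)
print(body)
```

The resulting string can be POSTed to the cluster's `_bulk` endpoint with the `Content-Type: application/x-ndjson` header.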

I shared most of my thoughts there: https://david.pilato.fr/blog/2015-05-09-advanced-search-for-your-legacy-application/

Have also a look at this "live coding" recording.


Thank you so much @dadoonet
Now I will review all the documentation, but then would you recommend Logstash more? Would it be easier to implement?

As I said earlier, my personal preference order is (1 is the highest preference, 3 the lowest):

  1. write your own code / script
  2. use the PostgreSQL connector
  3. use Logstash

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.