LaaKii (Laa Kii)
February 26, 2020, 8:11am
I have already indexed a simple SQL table with an id and a couple of other columns like name, etc.
But when it comes to indexing a table with foreign keys, I'm quite lost as to how to configure Logstash.
My database is similar to this:
Table1{
id int,
userName varchar,
userNickname varchar,
idTable2 int,
idTable3 int
}
Table2{
id int,
address varchar
}
Table3{
id int,
macAdress varchar,
ipAdress varchar
}
So what's the best way to index a table that contains foreign keys? I can't be the first one trying this, or am I missing something?
I found a similar question, but there wasn't any answer to it:
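For the flat case above, the approach I'm considering is to denormalize the foreign keys with a JOIN in the jdbc input's statement, so each Elasticsearch document already carries the address and MAC/IP columns as plain fields. A rough sketch of the pipeline I have in mind (driver path, connection string, credentials and index name are placeholders, and I'm not sure this is the recommended way):

input {
  jdbc {
    # placeholder driver and connection details
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # resolve the foreign keys in SQL so each row already
    # contains the related Table2/Table3 columns
    statement => "SELECT t1.id, t1.userName, t1.userNickname, t2.address, t3.macAdress, t3.ipAdress
                  FROM Table1 t1
                  JOIN Table2 t2 ON t2.id = t1.idTable2
                  JOIN Table3 t3 ON t3.id = t1.idTable3"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "table1"
    # reuse the primary key so re-runs update documents instead of duplicating them
    document_id => "%{id}"
  }
}

That would give one flat document per Table1 row, with the joined columns as top-level fields instead of the foreign key ids.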
Hi,
I'd like to inject data from SQL databases into Elasticsearch.
Let's say I have 3 tables: Company, Department and Employee.
Each company has several departments and each department has several employees.
This example has two levels of nested information.
[DBMS ER Diagram]
I'd like to build and index a Company document like this:
{
"comp_name": "World Company",
"comp_address": " 123 blablabla street...",
"departments": [
{
"dep_id": 1,
"dep_name": "Human ressources",
…
Thanks in advance!
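For that nested case, the closest thing to an answer I've found is Logstash's aggregate filter, which can collect the joined department rows into a single company document before it is sent to Elasticsearch. A rough sketch (the comp_name, comp_address, dep_id and dep_name fields come from the example above, while comp_id and everything else is my assumption), which also relies on the jdbc statement being ordered by company and the pipeline running with a single worker so rows arrive grouped:

filter {
  aggregate {
    # group the joined rows by company id (comp_id is an assumed column from the JOIN)
    task_id => "%{comp_id}"
    code => "
      map['comp_name'] ||= event.get('comp_name')
      map['comp_address'] ||= event.get('comp_address')
      map['departments'] ||= []
      map['departments'] << { 'dep_id' => event.get('dep_id'), 'dep_name' => event.get('dep_name') }
      event.cancel()
    "
    # emit the accumulated company document when rows for the next company start
    push_previous_map_as_event => true
    timeout => 5
  }
}

That should produce the departments array from the example document; I assume employees could be nested one level further in the same way, but I haven't tried it.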
system (system)
Closed March 25, 2020, 8:11am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.