Key of Elasticsearch

Hi,
I would like to connect my DB to ES through Logstash. My DB doesn't have a single unique key; instead, a combination of columns acts as the key. For example, if the DB has columns A, B, C, and D, the combination of A, B, and C is unique, even though the values of the individual columns (A, B, and C) can be duplicated.
The same applies to a JSON file: it has no single unique attribute, but a combination of multiple attributes serves as the key (same as the DB case).

How can I index this kind of DB or JSON file into ES? If anyone has an idea, please let me know as soon as possible, because I need to finish my project...
Thanks,

You will need to combine (concatenate) the different fields using a filter in the Logstash pipeline. Be aware of a potential performance decrease depending on what your new IDs look like.
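One common way to do this is with the fingerprint filter, which hashes several source fields into a single value that can then be used as the document ID. A minimal sketch, assuming your columns are named A, B, and C (the index name and host are placeholders; adjust them to your setup):

```
filter {
  fingerprint {
    # Combine the columns that together form the unique key
    source => ["A", "B", "C"]
    concatenate_sources => true
    method => "SHA1"
    target => "[@metadata][generated_id]"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
    # Use the hash of A+B+C as the document ID,
    # so re-ingesting the same row updates the same document
    document_id => "%{[@metadata][generated_id]}"
  }
}
```

Storing the generated ID under [@metadata] keeps the hash out of the indexed document itself.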

Thank you for your help. I have another question: can two documents that are in different types but the same index have the same unique ID? For example,
Type A
Document: "id":"abc", ...
Type B
Document: "id":"abc", ...

No worries!

No, it's not possible to my knowledge. In any case, I guess it's good to avoid it for obvious reasons.

IDs are unique within a type and index. These are different docs, but both have an ID of "1":

DELETE test
PUT test/foo/1
{
    "msg":"Hello"
}
PUT test/bar/1
{
    "msg":"goodbye"
}
GET test/bar/1
GET test/foo/1

Great to know Mark! Thanks for pointing it out!

Thank you so much. It is very helpful!