Index data - tokenizer

Hi, I need to tokenize an index's data column into a new index. Please help me achieve that.

Can you provide an example of what you are looking to achieve?

Hi Christian,

Suppose this is my data in the index:

Index: data_index
[
  {
    "id": "123",
    "data": "This is data One"
  },
  {
    "id": "124",
    "data": "This is data Two"
  }
]

Now I want to save my data in a new index like this:

Index: token_index
[
  {
    "id": "123",
    "data": [
      "This",
      "is",
      "data",
      "One"
    ]
  },
  {
    "id": "124",
    "data": [
      "This",
      "is",
      "data",
      "Two"
    ]
  }
]

If you map your data field as text, it will be tokenised. If the default analyzer does not give the correct results, you can customise it.
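As a minimal sketch, a mapping along these lines would tokenise the field (index and field names are taken from your example; the whitespace analyzer is my assumption here, chosen because it keeps the original casing, whereas the default standard analyzer would also lowercase "This" to "this"):

```
PUT token_index
{
  "mappings": {
    "properties": {
      "data": {
        "type": "text",
        "analyzer": "whitespace"
      }
    }
  }
}
```

One caveat: the mapping controls how the field is tokenised inside the inverted index, but the stored _source of each document still contains the original string, not an array of tokens.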

Actually, I am new to Elasticsearch. Can you give me an example of how to achieve this?

I'm wondering if you should take a look at the _analyze API.
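It lets you ask an analyzer to tokenise a sample string without indexing anything, so you can see the token array you would get. A sketch using your sample text (again assuming the whitespace analyzer; the standard analyzer would return lowercased tokens):

```
POST _analyze
{
  "analyzer": "whitespace",
  "text": "This is data One"
}
```

The response lists the tokens "This", "is", "data", "One", which matches the array shape you showed for token_index.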