That's why your put mapping call is failing: you cannot update an existing field.
Is there any solution to add an extra property to the mapping?
If you mean adding a new field, that will work.
If you mean changing an existing field, you need to reindex.
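For example, adding a brand-new field to an existing mapping is allowed (a sketch against the ES 6.x put mapping API; `test`, `doc`, and `new_field` are placeholder names):

```
PUT test/_mapping/doc
{
  "properties": {
    "new_field": {
      "type": "text",
      "analyzer": "Custom_Casing_Analyzer"
    }
  }
}
```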
"analyzer": "Custom_Casing_Analyzer",
"search_analyzer": "Custom_Casing_Analyzer",
this analyzer need to add (No need to change existing mappings)
how i can add this?
You MUST reindex.
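A sketch of that flow, with placeholder index names: create a new index with the desired analyzer in its settings, then copy the documents across with the `_reindex` API:

```
POST _reindex
{
  "source": { "index": "old_index" },
  "dest":   { "index": "new_index" }
}
```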
While creating an index, is it possible to set the mapping for all columns with that analyzer?
Yes.
How can we do that?
ElasticsearchResponse<Stream> rRes = objClient.IndicesCreate<Stream>(strIndex, @"
{
    ""settings"": {
        ""analysis"": {
            ""analyzer"": {
                ""Custom_Casing_Analyzer"": {
                    ""type"": ""custom"",
                    ""tokenizer"": ""keyword"",
                    ""filter"": [
                        ""lowercase""
                    ]
                }
            }
        }
    }
}");
For this entity I have 10 columns; how do I set the analyzer for those 10 columns?
DELETE test
PUT test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "Custom_Casing_Analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [
            "lowercase"
          ]
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "foo1": {
          "type": "text",
          "analyzer": "Custom_Casing_Analyzer"
        },
        "foo2": {
          "type": "text",
          "analyzer": "Custom_Casing_Analyzer"
        }
      }
    }
  }
}
foo1 and foo2 are the column names, right?
Is there any way to apply the analyzer without giving column names (apply to all)?
Because the number of columns in the index is large.
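One way to avoid listing every column (a sketch, assuming ES 6.x single-type mappings as in the example above) is a dynamic template that maps every string field to text with the custom analyzer; `all_strings` is just a template name:

```
PUT test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "Custom_Casing_Analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "dynamic_templates": [
        {
          "all_strings": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "text",
              "analyzer": "Custom_Casing_Analyzer"
            }
          }
        }
      ]
    }
  }
}
```

With this in place, any new string field indexed into the document gets the analyzer automatically, without being listed in the mapping.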
Thanks, it worked.
But for one index, when adding data, I got the following issue:
Document contains at least one immense term in field="Template.Files.de.tCertificateTemplateHTML" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[60, 109, 101, 116, 97, 32, 104, 116, 116, 112, 45, 101, 113, 117, 105, 118, 61, 34, 99, 111, 110, 116, 101, 110, 116, 45, 116, 121, 112, 101]...', original message: bytes can be at most 32766 in length; got 40669
How can I increase that limit?
I have some template data as HTML, and that string is too long.
Ok. So I'm going to answer without knowing a lot. My answer might be wrong.
Why do you want to use the keyword tokenizer for HTML content?
That data contains client data, so I'm not able to share it.
Any tokenizer is fine; no search will happen on that column.
For the other string columns I need the settings and mappings above.
If no search is happening on that field, then don't index it. https://www.elastic.co/guide/en/elasticsearch/reference/6.2/mapping-index.html
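For example (a sketch with placeholder index and field names; the real field in the error is nested under `Template.Files.de`), the HTML field can be mapped with `"index": false` so it is never analyzed and the immense-term error goes away:

```
PUT test
{
  "mappings": {
    "doc": {
      "properties": {
        "tCertificateTemplateHTML": {
          "type": "text",
          "index": false
        }
      }
    }
  }
}
```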
Sorry,
I need to get the complete data for that index.
I don't run any match queries on that column,
but I need that column in the results too.
The document will look something like this, but the length is longer than this:
That's exactly what I proposed. A field with "index": false is not searchable, but it is still stored in _source and returned in results.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.