Migrate from NEST 6.6 to the Elasticsearch client 8

I've recently upgraded my Elasticsearch server from 6.6 to 8.5 (quite a jump, I know, but it was long overdue). The main process that creates/updates indexes is written in C# and previously used NEST.

I originally asked this question on Stack Overflow as well (c# - Migrate NEST 6.6 to Elasticsearch client 8. Unclear on how to define analyzers - Stack Overflow). My issue is that I am trying to convert the following NEST code:

    index => index.Settings(settings => settings
        .Analysis(analysis => analysis
            .TokenFilters(tokenFilter => tokenFilter
                .Synonym("synonym", syn => syn.SynonymsPath("analysis/synonym.txt")))
            .Analyzers(analyzers => analyzers
                .Custom("mycustom", cust => cust
                    .Filters("stop", "synonym")
                    .Tokenizer("standard")))))

to the new client.
I tried:

    Client.Indices.Create(index => index.Settings(settings => settings
        .Analysis(analysis => analysis
            .Filter(tokenFilter => tokenFilter.Add(
                "synonym", new TokenFilter(new TokenFilterDefinitions(
                    // This is where I start getting lost
                    new Dictionary<string, ITokenFilterDefinition> {
                        { "synonym", new SynonymTokenFilter() { // What are the keys meant to be?
                            SynonymsPath = "analysis/synonym.txt"
                        } }
                    }))))
            .Analyzer(analyzers => analyzers
                .Custom("mycustom", cust => cust.Filter(new[] { "stop", "synonym" }))))));

because this is the only way I could get the code to compile (and it looked sort of sensible), but it produces the following request (fragment):

    "filter": {
      "synonym": {
        "synonym": {
          "synonyms_path": "analysis/synonym.txt",
          "type": "synonym"
        }
      }
    }
Logically I would expect to be able to get around this by doing:

    tokenFilter.Add("synonym", new SynonymTokenFilter() {
        SynonymsPath = "analysis/synonym.txt"
    })

but SynonymTokenFilter does not match the type signature required by Add(string, TokenFilter), so I don't know what else to try.
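Until the generated token-filter types are fixed, one stopgap is to bypass the typed descriptors entirely and create the index with raw JSON (for example via Kibana Dev Tools, curl, or the client's low-level transport). Assuming a placeholder index name, the create-index request body would be:

```json
{
  "settings": {
    "analysis": {
      "filter": {
        "synonym": {
          "type": "synonym",
          "synonyms_path": "analysis/synonym.txt"
        }
      },
      "analyzer": {
        "mycustom": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["stop", "synonym"]
        }
      }
    }
  }
}
```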

Hi, @apokryfos.

Thanks for raising this. Unfortunately, this is a blocking bug, and I've opened an issue for us to review the code-generation of the token filter types.
