Running a getAllIndices call for indices created using the default token_chars parameter throws an error #918

Closed
kved96 opened this issue Dec 13, 2024 · 2 comments



kved96 commented Dec 13, 2024

Java API client version: 8.16.1
Java version: 17
Elasticsearch version: 8.14.3

Problem description

I'm running into an issue when using the method below to get all the indices in the cluster, after creating an index with an ngram tokenizer that leaves the token_chars parameter at its default (i.e. not set explicitly). The specific settings I'm using at create time are:

{
  "index.analysis.analyzer.test_ngram_analyzer.tokenizer": "ngram_tokenizer",
  "index.analysis.test_ngram_analyzer.filter": "lowercase",
  "index.analysis.test_ngram_analyzer.type": "custom",
  "index.analysis.tokenizer.ngram_tokenizer.type": "ngram",
  "index.analysis.tokenizer.ngram_tokenizer.min_gram": 1,
  "index.analysis.tokenizer.ngram_tokenizer.max_gram": 2
}
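
For reference, here is a rough sketch of how an equivalent index could be created with the typed Java API client builders (the index name ngram-test and the exact builder structure are illustrative, not taken verbatim from my setup):

// Assumes an existing ElasticsearchClient named esClient, as in the calls below.
esClient.indices().create(c -> c
        .index("ngram-test") // illustrative name
        .settings(s -> s
                .analysis(a -> a
                        .analyzer("test_ngram_analyzer", an -> an
                                .custom(ca -> ca
                                        .tokenizer("ngram_tokenizer")
                                        .filter("lowercase")))
                        .tokenizer("ngram_tokenizer", t -> t
                                .definition(d -> d
                                        .ngram(n -> n
                                                .minGram(1)
                                                .maxGram(2))))))); // token_chars left at its default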

Then using the get indices endpoint like below:

esClient.indices()
                .get(r -> r.index("_all")
                        .allowNoIndices(true)
                        .expandWildcards(ExpandWildcard.Open, ExpandWildcard.Closed))
                .result()

the call throws:

status: 200, [es/indices.get] Failed to decode response","throwable2_message":"Error deserializing co.elastic.clients.elasticsearch._types.analysis.TokenizerDefinition: co.elastic.clients.util.MissingRequiredPropertyException: Missing required property 'NGramTokenizer.tokenChars' 

I would expect this to be patched by #877, but it doesn't look like it is.

@l-trotta
Contributor

Hello @kved96, I'm sorry but I really can't reproduce the issue. Using client version 8.16.1, I created an index in Kibana like this:

PUT ngram-test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "test_ngram_analyzer": {
          "tokenizer": "ngram_tokenizer"
        }
      },
      "test_ngram_analyzer": {
        "filter" : "lowercase",
        "type": "custom"
      },
      "tokenizer": {
        "ngram_tokenizer": {
          "type": "ngram",
          "min_gram": 1,
          "max_gram": 2
        }
      }
    }
  }
}

Performing the call you provided simply returns the settings of all indices.
I have a request: could you try getting that specific index, like this?

esClient.indices()
                .get(r -> r.index("ngram-test") // or your index name
                        .allowNoIndices(true)
                        .expandWildcards(ExpandWildcard.Open, ExpandWildcard.Closed))
                .result()

Since you're getting all indices, could the problem be caused by another index rather than the newly created one?
Also I have to ask, since this is pretty odd: are you certain the client version is 8.16.1 and that it isn't clashing with another version on the classpath?
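
If it helps, here is a minimal sketch of one way to check which client version actually ends up on the runtime classpath (the class name ClientVersionCheck is made up for illustration, and the manifest-based version can be null when running from unpacked classes, e.g. inside an IDE):

import co.elastic.clients.elasticsearch.ElasticsearchClient;
import java.security.CodeSource;

public class ClientVersionCheck {
    public static void main(String[] args) {
        // Implementation-Version from the elasticsearch-java jar manifest, if present.
        System.out.println("elasticsearch-java version on classpath: "
                + ElasticsearchClient.class.getPackage().getImplementationVersion());
        // The jar location can also reveal an unexpected version being resolved.
        CodeSource source = ElasticsearchClient.class.getProtectionDomain().getCodeSource();
        System.out.println("loaded from: " + (source != null ? source.getLocation() : "unknown"));
    }
}

Running that with the application's real classpath (for example through your build tool) should show whether an older elasticsearch-java jar is being picked up.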

Thank you for your patience.


kved96 commented Dec 17, 2024

Hey @l-trotta, I just re-ran my test and it looks like something cached was keeping my client version at 8.14.3. I can confirm that this is fixed!

kved96 closed this as completed Dec 17, 2024