This builder doesn't allow terms that are larger than 1,000 characters

Hello,

Kibana 7.6.2
Elasticsearch 7.6.2
Ubuntu 18.04

When I try to filter on a keyword field in Discover in Kibana, I get this error:

{
  "took": 10,
  "timed_out": false,
  "_shards": {
    "total": 14,
    "successful": 12,
    "skipped": 0,
    "failed": 2,
    "failures": [
      {
        "shard": 0,
        "index": "crawlpr-2020.04.17",
        "node": "DkbjPH_JS-mW78Bv_er3iA",
        "reason": {
          "type": "illegal_argument_exception",
          "reason": "This builder doesn't allow terms that are larger than 1,000 characters, got java.lang.RuntimeException: Unable to parse raw availability [Tijdelijk niet beschikbaar (web)]\n\tat com.workit.crawl.medimarket.data.fields.offers.main.fields.availability.AvailabilityParser.parseItemAvailability(AvailabilityParser.java:50)\n\tat com.workit.crawl.medimarket.data.fields.offers.main.fields.availability.AvailabilityParser.apply(AvailabilityParser.java:38)\n\tat com.workit.crawl.medimarket.data.fields.offers.main.fields.availability.AvailabilityParser.apply(AvailabilityParser.java:15)\n\tat com.workit.crawl.parsing.api.java8.logging.ParsingLogger.apply(ParsingLogger.java:40)\n\tat com.workit.crawl.parsing.api.java8.logging.Logging.applyAndLog(Logging.java:51)\n\tat com.workit.crawl.parsing.api.java8.model.offer.OfferParserTemplate.apply(OfferParserTemplate.java:90)\n\tat com.workit.crawl.medimarket.data.fields.offers.main.MainOfferParser.apply(MainOfferParser.java:40)\n\tat com.workit.crawl.medimarket.data.fields.offers.OffersParser.apply(OffersParser.java:24)\n\tat com.workit.crawl.medimarket.data.fields.offers.OffersParser.apply(OffersParser.java:13)\n\tat com.workit.crawl.parsing.api.java8.logging.ParsingLogger.apply(ParsingLogger.java:40)\n\tat com.workit.crawl.parsing.api.java8.logging.Logging.applyAndLog(Logging.java:51)\n\tat com.workit.crawl.parsing.api.java8.model.offer.OfferSpecificationParsingTemplate.apply(OfferSpecificationParsingTemplate.java:112)\n\tat com.workit.crawl.medimarket.data.OfferSpecificationsParser.extractOfferSpecification(OfferSpecificationsParser.java:56)\n\tat com.workit.crawl.medimarket.data.OfferSpecificationsParser.handleOfferSpecification(OfferSpecificationsParser.java:37)\n\tat com.workit.crawl.medimarket.data.OfferSpecificationsParser.apply(OfferSpecificationsParser.java:30)\n\tat com.workit.crawl.medimarket.page.OfferPageParser.parse(OfferPageParser.java:29)\n\tat com.workit.crawl.medimarket.page.OfferPageParser.parse(OfferPageParser.java:12)\n\tat com.workit.crawl.medimarket.page.PageParser.applyFirstMatchedParser(PageParser.java:13)\n\tat com.workit.crawl.medimarket.plugins.CrawlOfferAction.parseResponseAndStoreResult(CrawlOfferAction.java:70)\n\tat com.workit.crawl.medimarket.plugins.CrawlOfferAction.doAction(CrawlOfferAction.java:50)\n\tat com.workit.crawl.processor.service.ActionExecutor.executeAction(ActionExecutor.java:76)\n\tat com.workit.crawl.processor.ProcessorManager.doAction(ProcessorManager.java:253)\n\tat com.workit.crawl.processor.ProcessorManager.executeAction(ProcessorManager.java:206)\n\tat com.workit.crawl.processor.ProcessorManager.processActionForFoundOperationAndTarget(ProcessorManager.java:168)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundSite(ProcessorManager.java:180)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundOperation(ProcessorManager.java:149)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundJob(ProcessorManager.java:128)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundExecution(ProcessorManager.java:114)\n\tat com.workit.crawl.processor.ProcessorManager.process(ProcessorManager.java:97)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.onDelivery(ActionMessageProcessorUsingChannel.java:444)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleMessage(ActionMessageProcessorUsingChannel.java:436)\n\tat 
com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleDelivery(ActionMessageProcessorUsingChannel.java:315)\n\tat com.rabbitmq.client.impl.ConsumerDispatcher$5.run(ConsumerDispatcher.java:144)\n\tat com.rabbitmq.client.impl.ConsumerWorkService$WorkPoolRunnable.run(ConsumerWorkService.java:99)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\n"
        }
      }
    ]
  },
  "hits": {
    "total": 169,
    "max_score": null,
    "hits": []
  },
  "aggregations": {
    "2": {
      "buckets": [
        {
          "key_as_string": "2020-04-17T15:03:30.000+02:00",
          "key": 1587128610000,
          "doc_count": 8
        },
        {
          "key_as_string": "2020-04-17T15:04:00.000+02:00",
          "key": 1587128640000,
          "doc_count": 13
        },
        {
          "key_as_string": "2020-04-17T15:04:30.000+02:00",
          "key": 1587128670000,
          "doc_count": 7
        },
        {
          "key_as_string": "2020-04-17T15:05:00.000+02:00",
          "key": 1587128700000,
          "doc_count": 7
        },
        {
          "key_as_string": "2020-04-17T15:07:00.000+02:00",
          "key": 1587128820000,
          "doc_count": 5
        },
        {
          "key_as_string": "2020-04-17T15:08:30.000+02:00",
          "key": 1587128910000,
          "doc_count": 7
        },
        {
          "key_as_string": "2020-04-17T15:09:00.000+02:00",
          "key": 1587128940000,
          "doc_count": 10
        },
        {
          "key_as_string": "2020-04-17T15:09:30.000+02:00",
          "key": 1587128970000,
          "doc_count": 12
        },
        {
          "key_as_string": "2020-04-17T15:10:00.000+02:00",
          "key": 1587129000000,
          "doc_count": 9
        },
        {
          "key_as_string": "2020-04-17T15:10:30.000+02:00",
          "key": 1587129030000,
          "doc_count": 5
        },
        {
          "key_as_string": "2020-04-17T15:11:30.000+02:00",
          "key": 1587129090000,
          "doc_count": 4
        },
        {
          "key_as_string": "2020-04-17T15:12:00.000+02:00",
          "key": 1587129120000,
          "doc_count": 6
        },
        {
          "key_as_string": "2020-04-17T15:12:30.000+02:00",
          "key": 1587129150000,
          "doc_count": 9
        },
        {
          "key_as_string": "2020-04-17T15:13:00.000+02:00",
          "key": 1587129180000,
          "doc_count": 4
        },
        {
          "key_as_string": "2020-04-17T15:13:30.000+02:00",
          "key": 1587129210000,
          "doc_count": 7
        },
        {
          "key_as_string": "2020-04-17T15:14:30.000+02:00",
          "key": 1587129270000,
          "doc_count": 8
        },
        {
          "key_as_string": "2020-04-17T15:15:00.000+02:00",
          "key": 1587129300000,
          "doc_count": 11
        },
        {
          "key_as_string": "2020-04-17T15:15:30.000+02:00",
          "key": 1587129330000,
          "doc_count": 5
        },
        {
          "key_as_string": "2020-04-17T15:16:00.000+02:00",
          "key": 1587129360000,
          "doc_count": 9
        },
        {
          "key_as_string": "2020-04-17T15:16:30.000+02:00",
          "key": 1587129390000,
          "doc_count": 9
        },
        {
          "key_as_string": "2020-04-17T15:17:00.000+02:00",
          "key": 1587129420000,
          "doc_count": 14
        }
      ]
    }
  }
}

But if I try the same query in Dev Tools, it works:

GET crawlpr-*/_search
{
  "query": {
    "match_phrase": {
      "stackTrace": {
        "query": "java.lang.RuntimeException: Unable to parse raw availability [Tijdelijk niet beschikbaar (web)]\n\tat com.workit.crawl.medimarket.data.fields.offers.main.fields.availability.AvailabilityParser.parseItemAvailability(AvailabilityParser.java:50)\n\tat com.workit.crawl.medimarket.data.fields.offers.main.fields.availability.AvailabilityParser.apply(AvailabilityParser.java:38)\n\tat com.workit.crawl.medimarket.data.fields.offers.main.fields.availability.AvailabilityParser.apply(AvailabilityParser.java:15)\n\tat com.workit.crawl.parsing.api.java8.logging.ParsingLogger.apply(ParsingLogger.java:40)\n\tat com.workit.crawl.parsing.api.java8.logging.Logging.applyAndLog(Logging.java:51)\n\tat com.workit.crawl.parsing.api.java8.model.offer.OfferParserTemplate.apply(OfferParserTemplate.java:90)\n\tat com.workit.crawl.medimarket.data.fields.offers.main.MainOfferParser.apply(MainOfferParser.java:40)\n\tat com.workit.crawl.medimarket.data.fields.offers.OffersParser.apply(OffersParser.java:24)\n\tat com.workit.crawl.medimarket.data.fields.offers.OffersParser.apply(OffersParser.java:13)\n\tat com.workit.crawl.parsing.api.java8.logging.ParsingLogger.apply(ParsingLogger.java:40)\n\tat com.workit.crawl.parsing.api.java8.logging.Logging.applyAndLog(Logging.java:51)\n\tat com.workit.crawl.parsing.api.java8.model.offer.OfferSpecificationParsingTemplate.apply(OfferSpecificationParsingTemplate.java:112)\n\tat com.workit.crawl.medimarket.data.OfferSpecificationsParser.extractOfferSpecification(OfferSpecificationsParser.java:56)\n\tat com.workit.crawl.medimarket.data.OfferSpecificationsParser.handleOfferSpecification(OfferSpecificationsParser.java:37)\n\tat com.workit.crawl.medimarket.data.OfferSpecificationsParser.apply(OfferSpecificationsParser.java:30)\n\tat com.workit.crawl.medimarket.page.OfferPageParser.parse(OfferPageParser.java:29)\n\tat com.workit.crawl.medimarket.page.OfferPageParser.parse(OfferPageParser.java:12)\n\tat com.workit.crawl.medimarket.page.PageParser.applyFirstMatchedParser(PageParser.java:13)\n\tat com.workit.crawl.medimarket.plugins.CrawlOfferAction.parseResponseAndStoreResult(CrawlOfferAction.java:70)\n\tat com.workit.crawl.medimarket.plugins.CrawlOfferAction.doAction(CrawlOfferAction.java:50)\n\tat com.workit.crawl.processor.service.ActionExecutor.executeAction(ActionExecutor.java:76)\n\tat com.workit.crawl.processor.ProcessorManager.doAction(ProcessorManager.java:253)\n\tat com.workit.crawl.processor.ProcessorManager.executeAction(ProcessorManager.java:206)\n\tat com.workit.crawl.processor.ProcessorManager.processActionForFoundOperationAndTarget(ProcessorManager.java:168)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundSite(ProcessorManager.java:180)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundOperation(ProcessorManager.java:149)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundJob(ProcessorManager.java:128)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundExecution(ProcessorManager.java:114)\n\tat com.workit.crawl.processor.ProcessorManager.process(ProcessorManager.java:97)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.onDelivery(ActionMessageProcessorUsingChannel.java:444)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleMessage(ActionMessageProcessorUsingChannel.java:436)\n\tat 
com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleDelivery(ActionMessageProcessorUsingChannel.java:315)\n\tat com.rabbitmq.client.impl.ConsumerDispatcher$5.run(ConsumerDispatcher.java:144)\n\tat com.rabbitmq.client.impl.ConsumerWorkService$WorkPoolRunnable.run(ConsumerWorkService.java:99)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\n"      }
    }
  }
}

The same keyword filter in the Discover view works in Elasticsearch 5.6.0 / Kibana 5.6.0.

Why do I get this error?

Is it the indexing that failed, or the query? Is the failure in Elasticsearch or in Kibana?

Thank you in advance.

I forgot to include the mapping:

{
  "crawlpr-2020.04.17" : {
    "mappings" : {
      "dynamic" : "false",
      "_meta" : { },
      "_source" : {
        "includes" : [ ],
        "excludes" : [ ]
      },
      "dynamic_templates" : [ ],
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "keyword"
        },
        "actionClass" : {
          "type" : "keyword"
        },
        "allowRedirection" : {
          "type" : "boolean"
        },
        "attempt" : {
          "type" : "long"
        },
        "attemptStatus" : {
          "type" : "keyword"
        },
        "breadCrumbsFound" : {
          "type" : "long"
        },
        "breadCrumbsStored" : {
          "type" : "long"
        },
        "breadcrumb" : {
          "type" : "keyword"
        },
        "captchaProvider" : {
          "type" : "keyword"
        },
        "captchaType" : {
          "type" : "keyword"
        },
        "className" : {
          "type" : "keyword"
        },
        "configuredProxyName" : {
          "type" : "keyword"
        },
        "cookieRejected" : {
          "type" : "boolean"
        },
        "cookieSpec" : {
          "type" : "keyword"
        },
        "createTaskError" : {
          "type" : "keyword"
        },
        "createTaskResult" : {
          "type" : "keyword"
        },
        "directive" : {
          "type" : "keyword"
        },
        "downloadUrl" : {
          "type" : "keyword"
        },
        "downloadedFileCount" : {
          "type" : "long"
        },
        "entityType" : {
          "type" : "keyword"
        },
        "error" : {
          "type" : "keyword"
        },
        "exception" : {
          "type" : "keyword"
        },
        "executionId" : {
          "type" : "keyword"
        },
        "fileStored" : {
          "type" : "boolean"
        },
        "filterField" : {
          "type" : "keyword"
        },
        "filterToken" : {
          "type" : "keyword"
        },
        "filterType" : {
          "type" : "keyword"
        },
        "filtered" : {
          "type" : "boolean"
        },
        "host" : {
          "type" : "keyword"
        },
        "hostAndThreadName" : {
          "type" : "keyword"
        },
        "jobId" : {
          "type" : "keyword"
        },
        "jobName" : {
          "type" : "keyword"
        },
        "linksFound" : {
          "type" : "long"
        },
        "linksStored" : {
          "type" : "long"
        },
        "logId" : {
          "type" : "keyword"
        },
        "message" : {
          "type" : "keyword"
        },
        "method" : {
          "type" : "keyword"
        },
        "noFileStorageReason" : {
          "type" : "keyword"
        },
        "offerSpecificationId" : {
          "type" : "keyword"
        },
        "offersFound" : {
          "type" : "long"
        },
        "offersStored" : {
          "type" : "long"
        },
        "operationId" : {
          "type" : "keyword"
        },
        "pageInaccessible" : {
          "type" : "boolean"
        },
        "pageUrl" : {
          "type" : "keyword"
        },
        "parentOperationId" : {
          "type" : "keyword"
        },
        "parentTargetUrl" : {
          "type" : "keyword"
        },
        "pluginName" : {
          "type" : "keyword"
        },
        "pluginVersion" : {
          "type" : "keyword"
        },
        "proxy" : {
          "type" : "keyword"
        },
        "proxyConfiguration" : {
          "type" : "keyword"
        },
        "proxyHostAndPort" : {
          "type" : "keyword"
        },
        "proxyName" : {
          "type" : "keyword"
        },
        "proxyRemoteClient" : {
          "type" : "keyword"
        },
        "pushResult" : {
          "type" : "keyword"
        },
        "pushType" : {
          "type" : "keyword"
        },
        "redirect" : {
          "type" : "boolean"
        },
        "redirectedFrom" : {
          "type" : "keyword"
        },
        "remoteHost" : {
          "type" : "keyword"
        },
        "remotePushDurationInMillis" : {
          "type" : "long"
        },
        "remoteServiceId" : {
          "type" : "keyword"
        },
        "remoteServiceName" : {
          "type" : "keyword"
        },
        "remoteServiceUrl" : {
          "type" : "keyword"
        },
        "remoteSiteId" : {
          "type" : "keyword"
        },
        "remoteSiteName" : {
          "type" : "keyword"
        },
        "requestBody" : {
          "type" : "keyword"
        },
        "requestHeaders" : {
          "properties" : {
            "Host" : {
              "type" : "keyword"
            },
            "JSESSIONID" : {
              "type" : "keyword"
            },
            "User-Agent" : {
              "type" : "keyword"
            }
          }
        },
        "resolutionResult" : {
          "type" : "keyword"
        },
        "resolutionTime" : {
          "type" : "keyword"
        },
        "resource" : {
          "type" : "keyword"
        },
        "responseCode" : {
          "type" : "long"
        },
        "responseHeaders" : {
          "properties" : {
            "Server" : {
              "type" : "keyword"
            }
          }
        },
        "responseTime" : {
          "type" : "long"
        },
        "scheme" : {
          "type" : "keyword"
        },
        "sessionEndReason" : {
          "type" : "keyword"
        },
        "sessionId" : {
          "type" : "keyword"
        },
        "sessionUseTimes" : {
          "type" : "long"
        },
        "sessionWillContinue" : {
          "type" : "boolean"
        },
        "siteDomain" : {
          "type" : "keyword"
        },
        "siteId" : {
          "type" : "keyword"
        },
        "siteName" : {
          "type" : "keyword"
        },
        "siteSessionId" : {
          "type" : "keyword"
        },
        "stackTrace" : {
          "type" : "keyword"
        },
        "storedFileSizeInBytes" : {
          "type" : "long"
        },
        "success" : {
          "type" : "boolean"
        },
        "tags" : {
          "type" : "keyword"
        },
        "targetUrl" : {
          "type" : "keyword"
        },
        "triggerReasons" : {
          "type" : "keyword"
        },
        "tunnelOpened" : {
          "type" : "boolean"
        },
        "type" : {
          "type" : "keyword"
        },
        "uid" : {
          "type" : "keyword"
        },
        "url" : {
          "type" : "keyword"
        },
        "urlFound" : {
          "type" : "keyword"
        }
      }
    }
  }
}

It is the same mapping on both of my clusters, Elasticsearch 5 and 7.
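
For reference, stackTrace is a plain keyword field with no ignore_above, so each full stack trace is indexed as a single very long term. I don't know if that is related, but here is a sketch of an alternative mapping I could try on a new index (crawlpr-test is just an example name): stackTrace as text for phrase matching, plus a keyword sub-field capped at 1,000 characters for exact filtering and aggregations.

PUT crawlpr-test
{
  "mappings": {
    "properties": {
      "stackTrace": {
        "type": "text",
        "fields": {
          "raw": {
            "type": "keyword",
            "ignore_above": 1000
          }
        }
      }
    }
  }
}

With ignore_above, values longer than 1,000 characters would simply not be indexed in the stackTrace.raw sub-field, while the analyzed text field would still be searchable with match_phrase.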

Can you take a look at the browser requests and see if any of the requests are failing? It would be helpful if you could post a HAR file of the failure.

Looking back over the requests, I suspect this might be a character-escaping issue.

Thanks,
Matt

Hi,

Sorry for the delay.

Request:

http://pral-cpl-kibana01.workit.fr:5601/elasticsearch/crawlpr-*/_search?rest_total_hits_as_int=true&ignore_unavailable=true&ignore_throttled=true&preference=1588079944342&timeout=30000ms

Payload:

{"version":true,"size":500,"sort":[{"@timestamp":{"order":"desc","unmapped_type":"boolean"}}],"aggs":{"2":{"date_histogram":{"field":"@timestamp","fixed_interval":"30s","time_zone":"Europe/Paris","min_doc_count":1}}},"stored_fields":[""],"script_fields":{},"docvalue_fields":[{"field":"@timestamp","format":"date_time"}],"_source":{"excludes":[]},"query":{"bool":{"must":[],"filter":[{"match_all":{}},{"match_phrase":{"stackTrace":"java.lang.RuntimeException: java.lang.NullPointerException\n\tat com.workit.crawl.processor.service.ActionExecutor.executeAction(ActionExecutor.java:90)\n\tat com.workit.crawl.processor.ProcessorManager.doAction(ProcessorManager.java:253)\n\tat com.workit.crawl.processor.ProcessorManager.executeAction(ProcessorManager.java:206)\n\tat com.workit.crawl.processor.ProcessorManager.processActionForFoundOperationAndTarget(ProcessorManager.java:168)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundSite(ProcessorManager.java:180)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundOperation(ProcessorManager.java:149)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundJob(ProcessorManager.java:128)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundExecution(ProcessorManager.java:114)\n\tat com.workit.crawl.processor.ProcessorManager.process(ProcessorManager.java:97)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.onDelivery(ActionMessageProcessorUsingChannel.java:447)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleMessage(ActionMessageProcessorUsingChannel.java:439)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleDelivery(ActionMessageProcessorUsingChannel.java:319)\n\tat com.rabbitmq.client.impl.ConsumerDispatcher$5.run(ConsumerDispatcher.java:144)\n\tat com.rabbitmq.client.impl.ConsumerWorkService$WorkPoolRunnable.run(ConsumerWorkService.java:99)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.lang.NullPointerException\n\tat com.workit.crawl.soriana.plugins.CrawlOfferAction.doAction(CrawlOfferAction.java:52)\n\tat com.workit.crawl.processor.service.ActionExecutor.executeAction(ActionExecutor.java:79)\n\t... 16 more\n"}},{"range":{"@timestamp":{"gte":"2020-04-28T13:12:54.008Z","lte":"2020-04-28T13:27:54.008Z","format":"strict_date_optional_time"}}}],"should":[],"must_not":[]}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"":{}},"fragment_size":2147483647}}Preformatted text

Response:

{"took":8,"timed_out":false,"_shards":{"total":14,"successful":12,"skipped":0,"failed":2,"failures":[{"shard":0,"index":"crawlpr-2020.04.28","node":"DkbjPH_JS-mW78Bv_er3iA","reason":{"type":"illegal_argument_exception","reason":"This builder doesn't allow terms that are larger than 1,000 characters, got java.lang.RuntimeException: java.lang.NullPointerException\n\tat com.workit.crawl.processor.service.ActionExecutor.executeAction(ActionExecutor.java:90)\n\tat com.workit.crawl.processor.ProcessorManager.doAction(ProcessorManager.java:253)\n\tat com.workit.crawl.processor.ProcessorManager.executeAction(ProcessorManager.java:206)\n\tat com.workit.crawl.processor.ProcessorManager.processActionForFoundOperationAndTarget(ProcessorManager.java:168)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundSite(ProcessorManager.java:180)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundOperation(ProcessorManager.java:149)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundJob(ProcessorManager.java:128)\n\tat com.workit.crawl.processor.ProcessorManager.processForFoundExecution(ProcessorManager.java:114)\n\tat com.workit.crawl.processor.ProcessorManager.process(ProcessorManager.java:97)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.onDelivery(ActionMessageProcessorUsingChannel.java:447)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleMessage(ActionMessageProcessorUsingChannel.java:439)\n\tat com.workit.crawl.processor.queue.dynamic.channel.ActionMessageProcessorUsingChannel.handleDelivery(ActionMessageProcessorUsingChannel.java:319)\n\tat com.rabbitmq.client.impl.ConsumerDispatcher$5.run(ConsumerDispatcher.java:144)\n\tat com.rabbitmq.client.impl.ConsumerWorkService$WorkPoolRunnable.run(ConsumerWorkService.java:99)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.lang.NullPointerException\n\tat com.workit.crawl.soriana.plugins.CrawlOfferAction.doAction(CrawlOfferAction.java:52)\n\tat com.workit.crawl.processor.service.ActionExecutor.executeAction(ActionExecutor.java:79)\n\t... 
16 more\n"}}]},"hits":{"total":56,"max_score":null,"hits":},"aggregations":{"2":{"buckets":[{"key_as_string":"2020-04-28T15:12:30.000+02:00","key":1588079550000,"doc_count":1},{"key_as_string":"2020-04-28T15:13:30.000+02:00","key":1588079610000,"doc_count":6},{"key_as_string":"2020-04-28T15:15:00.000+02:00","key":1588079700000,"doc_count":6},{"key_as_string":"2020-04-28T15:15:30.000+02:00","key":1588079730000,"doc_count":1},{"key_as_string":"2020-04-28T15:17:00.000+02:00","key":1588079820000,"doc_count":6},{"key_as_string":"2020-04-28T15:17:30.000+02:00","key":1588079850000,"doc_count":3},{"key_as_string":"2020-04-28T15:18:30.000+02:00","key":1588079910000,"doc_count":7},{"key_as_string":"2020-04-28T15:19:00.000+02:00","key":1588079940000,"doc_count":2},{"key_as_string":"2020-04-28T15:19:30.000+02:00","key":1588079970000,"doc_count":1},{"key_as_string":"2020-04-28T15:20:00.000+02:00","key":1588080000000,"doc_count":1},{"key_as_string":"2020-04-28T15:20:30.000+02:00","key":1588080030000,"doc_count":2},{"key_as_string":"2020-04-28T15:21:00.000+02:00","key":1588080060000,"doc_count":2},{"key_as_string":"2020-04-28T15:21:30.000+02:00","key":1588080090000,"doc_count":2},{"key_as_string":"2020-04-28T15:22:00.000+02:00","key":1588080120000,"doc_count":2},{"key_as_string":"2020-04-28T15:22:30.000+02:00","key":1588080150000,"doc_count":1},{"key_as_string":"2020-04-28T15:23:00.000+02:00","key":1588080180000,"doc_count":4},{"key_as_string":"2020-04-28T15:23:30.000+02:00","key":1588080210000,"doc_count":2},{"key_as_string":"2020-04-28T15:24:30.000+02:00","key":1588080270000,"doc_count":1},{"key_as_string":"2020-04-28T15:25:00.000+02:00","key":1588080300000,"doc_count":4},{"key_as_string":"2020-04-28T15:25:30.000+02:00","key":1588080330000,"doc_count":2}]}}}

Thanks in advance.

Regards,

Laurent

org.elasticsearch.transport.RemoteTransportException: [pral-cplels-data03.workit.fr][10.10.3.9:9300][indices:data/read/search[phase/fetch/id]]
Caused by: java.lang.IllegalArgumentException: This builder doesn't allow terms that are larger than 1,000 characters

It seems it is Elasticsearch that refuses my query, but I don't know why. The same query works on my Elasticsearch 5.6 cluster.
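
One difference I can see between this Discover request and the Dev Tools query that works is that Discover also sends a highlight section, and the failure is reported from the fetch phase. This is only a guess, but re-running the working query with a similar highlight block might show whether highlighting the very long stackTrace keyword is what trips the 1,000-character limit. Sketch below; the placeholder string has to be replaced with the same long stack trace value as in the query above, and it highlights only the stackTrace field rather than the fields pattern Kibana sends:

GET crawlpr-*/_search
{
  "query": {
    "match_phrase": {
      "stackTrace": "<same long stack trace value as in the query above>"
    }
  },
  "highlight": {
    "pre_tags": ["@kibana-highlighted-field@"],
    "post_tags": ["@/kibana-highlighted-field@"],
    "fields": { "stackTrace": {} },
    "fragment_size": 2147483647
  }
}

If that reproduces the error, turning off the doc_table:highlight setting under Management > Advanced Settings in Kibana should stop Discover from sending the highlight block, which would at least confirm the trigger (I have not verified that this is the intended fix).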

I'm having trouble reproducing the problem. Are you running any plugins? Can you share your elasticsearch.yml and kibana.yml files? (Remove any sensitive info.)

Hello,

No plugins.

elasticsearch.yml

cluster.name: pral-cplels-cluster

path:
    data: /var/lib/elasticsearch
    logs: /var/log/elasticsearch
    
node.name: ${HOSTNAME}

network.host: 0.0.0.0

discovery.seed_hosts:
        -  ***
 
node.master: true
node.voting_only: false 
node.data: true
node.ingest: true
node.ml: false
xpack.ml.enabled: false
cluster.remote.connect: false

bootstrap.memory_lock: true

kibana.yml

server.host: "0.0.0.0"
elasticsearch.hosts: [***]

Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.