Hi,
I'm facing an issue. I have indexed 200 records, and in each record the body field contains a huge amount of text, nearly 1000+ pages, so there are very long strings in one field. I have tried increasing the heap size up to 16 GB in the JVM options (-Xms16g and -Xmx16g). I have also set `index.highlight.max_analyzed_offset` higher than required, but I'm still not able to resolve this. Is it an out-of-memory issue?
Please help me out with this.
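(For reference, `index.highlight.max_analyzed_offset` is a dynamic index setting, so it can be raised on a live index. A minimal sketch, assuming a hypothetical index named `my-index`; the value must be at least as large as the longest analyzed field you want to highlight:)

```
PUT /my-index/_settings
{
  "index.highlight.max_analyzed_offset": 2000000
}
```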
Hi @rdeshmukh, welcome to the discussion boards!
Are there any errors in your server logs? Based on your screenshot, it appears you're running on Elastic Cloud -- if you can't access the logs, I'd suggest opening a support ticket so that the appropriate engineers can take a closer look at this.
Hi @Larry_Gregory,
Can we insert a very large string, approximately 5,000,000 (5000K) characters, into one field while indexing a record? I'm able to insert it, but while fetching the data in Kibana I'm getting a request timeout. I'm using AWS Elasticsearch. Can you please help me out with this? I really need it; I've been searching for a solution for the past month.
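(For context on the timeout itself: in a self-managed Kibana, the time Kibana waits for Elasticsearch is controlled by `elasticsearch.requestTimeout` in `kibana.yml`, which defaults to 30 seconds. A minimal sketch of raising it; note that this thread doesn't confirm whether the managed AWS service exposes this setting:)

```yaml
# kibana.yml
# Milliseconds to wait for responses from Elasticsearch.
# Default is 30000 (30 s); raising it can help with very large documents.
elasticsearch.requestTimeout: 120000
```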
I'd be happy to help if you're able to retrieve logs from the Kibana instance.
Also, if you click on the Inspect link on the right side of the Discover window, you can see the request that Kibana makes to Elasticsearch. Can you try to run that request against Elasticsearch directly, and report back?
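(For illustration, a Discover request is typically a `_search` call that looks something like the sketch below; this assumes the same hypothetical index name `my-index`, and the real request copied from Inspect will include the actual query, sort, and time range. It can be run from the Dev Tools console or any HTTP client pointed at the Elasticsearch endpoint:)

```
GET /my-index/_search
{
  "size": 10,
  "query": {
    "match_all": {}
  }
}
```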