[MachineLearning/DataVisualizer/ImportData] - File too large (?)

Hi there,
Thanks in advance for your time.

I was trying out the new machine learning tool and it worked well for a first test file.
Then I tried with a much larger file (header of 9 fields, 45,120 lines of data) and the "Data analyzing load" never ended.
So I split this file, keeping only the first 15,000 lines, and it worked fine, but after adding the next 1,000 lines I got the same behavior again ...
I then tried to check whether something was wrong between lines 15,000 and 16,000, but I am able to load a file containing lines 14,000 to 17,000 ...

The only issue I can think of now is that my file is too large (?), even though it is well under 100 MB (2.9 MB).
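The splitting steps described above can be sketched with standard shell tools. This is only a sketch: the name `data.csv` and the synthetic generator at the top are assumptions standing in for the real file.

```shell
# Stand-in for the real file: a 9-field header plus 45,120 data lines
{ echo 'field1;field2;field3;field4;field5;field6;field7;field8;field9'
  yes 'aaa;bbb;ccc;dddd;eeee;ffff;gggg;hhhh;iiii' | head -n 45120; } > data.csv

# Header plus the first 15,000 data lines
head -n 15001 data.csv > part1.csv

# Header plus only data lines 14,000-17,000 (file lines 14,001-17,001)
{ head -n 1 data.csv; sed -n '14001,17001p' data.csv; } > window.csv
```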


I've seen this "never-ending" behavior when there were odd characters in the file (for example, DOS-style newline characters like ^M).

I would look at the file closely in a text editor that can show hidden symbols (for example, using :set list in vi) and make sure there are no stray characters in there. Keep in mind that the $ symbol vi shows for a newline is fine.
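A quick way to confirm and strip those DOS-style carriage returns from the command line is sketched below; the small `dos.csv` sample is an assumption standing in for the real file.

```shell
# Small sample with DOS (CRLF) line endings, standing in for the real file
printf 'aa;bb;cc;dd\r\nee;ff;gg;hh\r\n' > dos.csv

# Count the lines containing a carriage return (^M); non-zero means CRLF endings
grep -c $'\r' dos.csv

# Strip every carriage return into a cleaned copy
tr -d '\r' < dos.csv > unix.csv
```

Where it is installed, `dos2unix` does the same job in one step.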

Hi @richcollier, thanks a lot for your answer.
I thought you had found my issue, but after removing every single ^M I get the same result ...
Using :set list in vi, I can confirm that all my lines now look like aa;bb;cc;dd$
I am still able to import a file with my first 15,000 lines, but I cannot figure out what's wrong here.

I also tried with another test file: a header of field1;field2;field3;field4;field5;field6;field7;field8;field9 and 45,000 data lines like aaa;bbb;ccc;dddd;eeee;ffff;gggg;hhhh;iiii.
Same wrong behavior.
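One more sanity check worth running on a semicolon-delimited file like this is verifying that every row has the same number of fields. A sketch, where `sample.csv` and its contents are assumptions for illustration:

```shell
# Sample with one malformed row (8 fields instead of 9)
{ yes 'a;b;c;d;e;f;g;h;i' | head -n 5
  echo 'a;b;c;d;e;f;g;h'; } > sample.csv

# Print each distinct field count and how often it occurs;
# a clean file shows exactly one line of output here
awk -F';' '{print NF}' sample.csv | sort -n | uniq -c
```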


Hmmm... I'm not sure how we can help unless we have your problematic file (or an anonymized version of it) to reproduce the issue. Any way to provide it?

Are you able/allowed to download my test file from my Google Drive here: https://drive.google.com/open?id=1s8Iz-kALilsBfayipATKSqlU9MwBKTwk

If not, how would you like me to share it with you?

Thanks! I was able to download the file and it imported perfectly!


Feeling stupid ... haha.
I've tried again but nothing changed.
Is there any configuration to modify/enable on the server? Any logs to check?
Thanks for your time.

Perhaps enabling the developer mode/console in your browser to see where things might be getting stuck would be helpful?

Are you doing this locally (i.e. running ES/Kibana on your local machine), or is this on Elastic Cloud, or is the destination system far away (network-wise)? Is there possibly a proxy/firewall in place that disallows uploads over a certain size? Just trying to think outside the box here....

Thanks again for your very helpful answer.
As you suggested, I opened the network console of my browser and found the issue.
It has nothing to do with Kibana but with our network: the request comes back with HTTP status 413 - Request Entity Too Large.
Issue closed!
Thanks for your help and time.
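For anyone else landing here with the same HTTP 413: a reverse proxy in front of Kibana often enforces a request-body size cap. With nginx, for example, the default `client_max_body_size` is 1 MB, which would reject a 2.9 MB upload exactly like this. A sketch of a hypothetical proxy block raising that limit (the host name is an assumption; 5601 is Kibana's default port):

```nginx
server {
    listen 443 ssl;
    server_name kibana.example.com;          # hypothetical host

    location / {
        # Default is 1m; raise it so multi-megabyte CSV uploads pass through
        client_max_body_size 20m;
        proxy_pass http://localhost:5601;    # Kibana's default port
    }
}
```

Other proxies and load balancers have equivalent knobs, so it is worth checking whichever one sits between the browser and Kibana.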

