System.OutOfMemoryException thrown while indexing files as attachments

Hi all,

I'm getting an error while using IndexMany to index documents as attachments. Yesterday, when I published the application on the server, it worked fine, but when I try to run it from Visual Studio it throws this error. Can someone help me solve this?

    string path = @"D:\ElasticShare\";
    List<string> filesList = new List<string>();
    List<Document> list = new List<Document>();
    int counter = 1;

    try
    {
        string[] filePaths = Directory.GetFiles(path, "*", SearchOption.AllDirectories);
        filesList.AddRange(filePaths);
    }
    catch (Exception ex) { /* log and handle */ }

    try
    {
        foreach (string file in filesList)
        {
            Attachment attach = new Attachment
            {
                Name = Path.GetFileNameWithoutExtension(file),
                Content = Convert.ToBase64String(File.ReadAllBytes(file)),
                ContentType = Path.GetExtension(file)
            };

            var doc = new Document
            {
                Id = counter++,
                Title = Path.GetFileNameWithoutExtension(file),
                FilePath = Path.GetFullPath(file), // added to get the path of the file
                File = attach
            };
            list.Add(doc);
        }
    }
    catch (Exception e) { /* log and handle */ }

    try
    {
        var response = client.IndexMany(list, "reader"); // throwing an error at this place
    }
    catch (Exception ex) { /* log and handle */ }

Error Details

Please let me know if additional details are required.

Are you trying to index one single big document containing all those attachments?

If you have a lot of files, I guess this is generating a very big bulk request which does not fit in memory.

I'd reduce the bulk size and send maybe 5 or 10 documents only at a time.
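For example, something along these lines (a rough sketch, assuming the NEST client and the `client` and `list` variables from the snippet above; the batch size of 10 is arbitrary):

```csharp
// Rough sketch: send the documents in small batches instead of one big
// IndexMany call. Assumes `client` is a NEST ElasticClient and `list`
// is the List<Document> built earlier; batchSize of 10 is arbitrary.
const int batchSize = 10;

for (int i = 0; i < list.Count; i += batchSize)
{
    var batch = list.Skip(i).Take(batchSize).ToList();
    var response = client.IndexMany(batch, "reader");

    if (!response.IsValid)
    {
        // Look at response.ItemsWithErrors and response.OriginalException
        // to see which documents failed and why, then decide to retry or stop.
        break;
    }
}
```

Each request then stays small, so the client never has to hold one huge serialized bulk body in memory.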

Is this error happening on elasticsearch side or on your client side?

Hi @dadoonet

No, I'm adding all the documents to a list and indexing the list with IndexMany.
Each document refers to an attachment.
In this example I have 20 documents.

I was able to index 100 documents yesterday, and even now using the application published on the server, which is the same code as now. But all of a sudden it started throwing this error.

I think the error is happening on the elasticsearch side. I have attached a small snapshot of the error that is thrown. Please have a look at it.


Can you show elasticsearch logs?

I think this error is on your client.

No. I was asking for elasticsearch logs.

I guess you are running elasticsearch somewhere right?

Look in logs dir.

I think to get log details I need to enable slow logging, right? (Correct me if I'm wrong; I have never tried logging, so I'm not sure if I'm thinking in the right direction.)

But I can see elasticsearch.log, elasticsearch_index_indexing_slowlog.log and elasticsearch_index_search_slowlog.log files. The last two files are empty, whereas the first one shows something like this. (Not sure if this is what you are looking for. Sorry.)

elasticsearch.log is the right one. But please prefer copying the logs rather than posting screenshots. The </> button helps to format them.

So here, elasticsearch does not send any out of memory exception.

Definitely this is something which is happening on the client side.

Either add more memory to your process/machine (I don't know how you do this in .Net world) or reduce the size of the document you are sending to elasticsearch.

BTW: did you see fscrawler project?

Sorry about that. I had to post the screenshot because of the word limit in the reply body.

Either add more memory to your process/machine (I don't know how you do this in .Net world) or reduce the size of the document you are sending to elasticsearch.

Yeah will try to run the same code on different machine and check if it is working or not.

But when we are indexing individual documents, will reducing the size help?

Just started working on it. Your application's scope seems to be much bigger than my requirement, so I'm just taking hints from it and creating a small one. Moreover, I didn't understand how or where to start with your project :pensive:. Any pointers on how to use it in an existing project? TIA

I just saw that you added some documentation for the same. Thank you.
Will go through it and get back to you If I have any doubts regarding the same.

Yeah. I did that just some hours ago! :slight_smile:


The same code is working perfectly fine on a console application. :frowning:
I even compared the content using a text compare tool and everything is the same except the index name.
This is totally confusing now.

Your IDE is using much more memory I presume.

Oh, okay. Thanks @dadoonet. Will check on some other device then.

And one more thing: will the performance of the query depend on the platform we are working on? I think the console application is taking more time than the normal web application to fetch results. Anything to work on to improve the performance? (Will the number of shards and replicas affect performance? I'm going with the default number of 5 for each, which I don't use at all.)

The only number/field you should look at when you run your queries is "took".

Based on that you can see if your query is "slow" or not whatever your client is.
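With NEST, that value is exposed directly on the search response (a sketch; the `client`, index name and Document type are taken from the earlier snippets):

```csharp
// Rough sketch: `Took` is the time the query took on the Elasticsearch side,
// in milliseconds, independent of which client or IDE sent the request.
var searchResponse = client.Search<Document>(s => s
    .Index("reader")
    .Query(q => q.MatchAll()));

Console.WriteLine($"took: {searchResponse.Took} ms");
```

If `Took` is small but the application still feels slow, the time is being spent on the client side (serialization, network, rendering), not in Elasticsearch.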

I think, as you said, it's a problem with the IDE.
I was able to index 20 documents, but when I have some 50 documents the error is thrown near client.IndexMany(list, "indexName").
Still facing this issue. I don't understand how to solve the problem.

I'd probably reduce the size of the list (in dev) and send 10 docs at a time.
In prod, you can maybe set another value for this setting.
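Recent NEST versions also ship a BulkAll helper that does this partitioning for you (a rough sketch, reusing `client` and `list` from the earlier snippet; the size, retry and timeout values are arbitrary and should be tuned per environment):

```csharp
// Rough sketch: BulkAll partitions the documents into fixed-size bulk
// requests and retries on back-pressure. Size(10) here matches the
// "10 docs at a time" suggestion; raise it in prod.
var bulkAll = client.BulkAll(list, b => b
    .Index("reader")
    .Size(10)                              // documents per bulk request
    .BackOffRetries(2)                     // retry a rejected page twice
    .BackOffTime(TimeSpan.FromSeconds(5)) // wait between retries
);

// Block until all pages are indexed (or the timeout elapses).
bulkAll.Wait(TimeSpan.FromMinutes(10), next =>
{
    // next.Page is the zero-based page that was just indexed.
});
```

That way the batch size becomes a single knob you can set differently in dev and prod.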

But again I'm not a .Net guy so I don't know anything about your env :slight_smile:

Maybe @Martijn_Laarman has an idea?

You could also ask in the forum where the discussion about clients is happening.

Thanks David. But wouldn't it be the same when the size is set to a bigger number in prod?
Can I share this issue link in the forum you have posted?

Can I share this issue link in the forum link you have posted?

Of course. You are free to paste any link you want in a forum :wink:

Haha, ok thanks. Just wanted to know if this issue link is clear enough or whether I should make a new one.

Ha! I got your point now.

Well IMHO it's better to sum up the discussion.