I'm 100% sure it's not a PHP memory problem.
The PHP "memory_limit" setting is 128M in php.ini.
The script that breaks with the error "Couldnt connect to host
[123.123.123.123], Elasticsearch down?" uses only 20.75 MB,
so memory is not the cause of the problem.
This could be a memory problem in PHP itself, so it's worth finding out
whether that's the case. Inserting that many items into a single array,
for example, would also require a comparable amount of memory.
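One way to check this theory is to log PHP's peak memory usage as the batch progresses. This is a sketch, not the actual script; the `$es` client object and document fields are placeholders:

```php
<?php
// Sketch: report PHP's peak memory after every 1000 inserts to see
// whether usage grows with the number of indexed documents.
// $es stands in for the PHP Elasticsearch client from the original script.
for ($i = 0; $i < 10000; $i++) {
    $doc = array('id' => $i, 'title' => 'entry ' . $i); // placeholder document
    // $es->index($doc, $i);                            // client call omitted
    if ($i % 1000 === 0) {
        printf("%d: %.2f MB\n", $i, memory_get_peak_usage() / 1048576);
    }
}
```

If the reported number stays flat across batches, PHP-side memory growth can be ruled out.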
I'm using the latest Elasticsearch release, 0.12.0.
I have a problem inserting data into Elasticsearch.
I'm using the "elasticsearch" PHP client to insert the data.
The inserts are done in a loop: the PHP script inserts entries 0 - 10000
and reloads itself, then inserts entries 10000 - 20000, reloads again,
and continues in the same way. The script should work like this to the
end, about 190000 entries in total.
Inserting a single entry works fine. The problem is that the script
stops every time after inserting 28233 entries.
I'm 100% sure it's not a problem with the data of entry 28233, because
when starting from 10000 it goes through entry 28233 without any problem
and stops at entry 38233. So it always inserts exactly 28233 entries.
The PHP client returns this error:
Couldnt connect to host [123.123.123.123], Elasticsearch down?
(123.123.123.123 is a fake IP address.)
The problem doesn't generate any entry in the log file, and
Elasticsearch is not down after it happens.
Has anyone had the same problem?
Can anyone help me find the cause of this problem?
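For reference, the batched loop described above looks roughly like this. This is only a sketch: the client call, the `loadEntry` helper, and the script name are assumptions, not the actual code.

```php
<?php
// Sketch of the batching scheme: insert $batchSize entries, then the
// script "reloads itself" by redirecting to the next offset.
$batchSize = 10000;
$total     = 190000;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

for ($i = $offset; $i < min($offset + $batchSize, $total); $i++) {
    $doc = loadEntry($i);    // hypothetical: fetch entry $i from the source data
    // $es->index($doc, $i); // hypothetical client call that triggers the error
}

if ($offset + $batchSize < $total) {
    // Reload with the next offset so each request handles one batch.
    header('Location: insert.php?offset=' . ($offset + $batchSize));
}
```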
Contrary to query_string (the query type the q URL parameter maps to),
the term query is not analyzed; in your case it therefore searches for
the literal "kim*" token (note that this means it is not performing a
wildcard query).
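The difference can be seen by issuing both searches side by side. A sketch in PHP, where the host, index, and field names are assumptions:

```php
<?php
// Sketch (host, index, and field names assumed): the q URL parameter is
// parsed by query_string, which treats "kim*" as a wildcard pattern,
// while a term query matches the literal, unanalyzed token "kim*".
$base = 'http://localhost:9200/myindex/_search';

// Analyzed query_string search; * acts as a wildcard here:
$wildcard = file_get_contents($base . '?q=name:kim*');

// Unanalyzed term query; searches for the exact token "kim*":
$body = json_encode(array('query' => array('term' => array('name' => 'kim*'))));
$ctx  = stream_context_create(array('http' => array(
    'method'  => 'POST',
    'content' => $body,
)));
$literal = file_get_contents($base, false, $ctx);
```

The first request will match documents whose analyzed tokens start with "kim"; the second matches only documents containing the exact token "kim*", which an analyzed field almost never contains.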