Please help me: I insert logs into Elasticsearch, but it uses too much memory. How can I solve it? Thanks

Dear all:
I insert 10,000 logs into Elasticsearch; each log is about 2 MB and contains about 3,000 keys and values.
After inserting about 20,000 logs, Elasticsearch used about 30 GB of memory and became very slow, and it was hard to insert any more logs.
Could someone help me solve this? Thanks very much.

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email to elasticsearch+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/f57de01f-7c63-4d88-9bcc-80daf7cc6a1d%40googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.

Do you mean that you insert one very big JSON document into Elasticsearch, or do you insert line by line?

Maybe you could illustrate a bit more with a log example (please don't attach it, gist it)?

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr

On 8 January 2014 at 11:58:06, xjj210130@gmail.com (xjj210130@gmail.com) wrote:




The following is my log format:

{
  "user1": [{"costprice": "122"}, {"sellprice": "124"}, {"stock": "12"}, {"sell": "122"}, {}, {}],
  ...
  "product": [{}],
  "name": []
}

There is information for about 4,000 to 10,000 users, so a log may be 2 MB.
Thanks


Do you insert that using bulk?

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr

On 8 January 2014 at 12:29:33, xjj210130@gmail.com (xjj210130@gmail.com) wrote:



No, I insert the logs one by one, using Thrift to transport them. I set heap_size=30G, and after inserting 20,000 logs it used 30 GB of memory. I didn't change anything in elasticsearch.yml except the heap_size and the thrift frame setting (for most values I use the defaults). Thanks,


I am using Elasticsearch version 0.90.2.


That was not really my question. Are you using the BULK feature?

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr

On 8 January 2014 at 12:38:00, xjj210130@gmail.com (xjj210130@gmail.com) wrote:


No, I don't use bulk. Do you mean that using bulk might solve the problem? Thanks
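For context, the Elasticsearch bulk API takes a newline-delimited (NDJSON) body in which each action line is followed by its document. A minimal sketch of building such a body in Python; the index and type names (`logs`, `log`) are placeholders for illustration, not values from this thread:

```python
import json

def build_bulk_body(docs, index="logs", doc_type="log"):
    """Build an NDJSON body for the Elasticsearch _bulk endpoint:
    one action line per document, followed by the document itself."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(doc))
    # A bulk body must end with a trailing newline.
    return "\n".join(lines) + "\n"

docs = [{"user": "user1", "costprice": "122"},
        {"user": "user2", "sellprice": "124"}]
body = build_bulk_body(docs)
# body can then be POSTed to http://localhost:9200/_bulk
```

Batching many small documents per request amortizes per-request overhead, which is why the list usually suggests bulk for high-volume indexing.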


No. In that case I would probably have recommended decreasing the bulk size.
Are you searching as well, or only indexing?

BTW, you should upgrade to the latest version, 0.90.9.

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr

On 8 January 2014 at 12:47:00, xjj210130@gmail.com (xjj210130@gmail.com) wrote:


I only insert the logs into Elasticsearch. I will do the following work:
1: Write the data to Elasticsearch.
2: Then search the data.

Right now, when I insert the data into ES, it uses too much memory, and I wonder why.
Could you give me some suggestions? Thanks

I used jmap to inspect the process; the result is below (I changed heap_size to 1G to watch the memory use):

num	#instances	#bytes	Class description
1:	229353	18348240	java.util.WeakHashMap$Entry
2:	229353	12843768	java.util.WeakHashMap
3:	145045	8703384	org.elasticsearch.index.mapper.FieldMapper
4:	229353	7339296	java.lang.ref.ReferenceQueue
5:	235890	5661360	org.elasticsearch.common.collect.RegularImmutableMap$TerminalEntry
6:	229346	5504304	org.apache.lucene.util.CloseableThreadLocal
7:	57303	4125816	org.elasticsearch.index.mapper.core.LongFieldMapper
8:	85939	3836608	char
9:	155465	3731160	org.elasticsearch.common.collect.RegularImmutableMap$NonTerminalEntry
10:	229353	3669648	java.lang.ThreadLocal
11:	229353	3669648	java.lang.ref.ReferenceQueue$Lock
12:	229353	3669648	java.util.concurrent.atomic.AtomicInteger
13:	114662	3669184	org.elasticsearch.index.analysis.NamedAnalyzer
14:	28698	3518912	org.elasticsearch.common.collect.RegularImmutableMap$LinkedEntry
15:	145044	3481056	java.util.Arrays$ArrayList
16:	145044	3481056	org.elasticsearch.index.mapper.FieldMappers
17:	114620	2750880	org.elasticsearch.index.analysis.NumericLongAnalyzer
18:	52044	2081760	org.apache.lucene.document.FieldType
19:	85939	2062536	java.lang.String
20:	57499	1839968	org.elasticsearch.index.mapper.FieldMapper$Names
21:	114683	1834928	org.apache.lucene.analysis.Analyzer$PerFieldReuseStrategy
22:	114662	1834592	org.apache.lucene.analysis.Analyzer$GlobalReuseStrategy
23:	57493	1379832	org.elasticsearch.index.fielddata.FieldDataType
24:	57332	1375968	org.elasticsearch.index.mapper.core.NumberFieldMapper$1
25:	57303	1375272	org.elasticsearch.common.Explicit
26:	14321	1267344	byte
27:	37088	1186816	java.util.HashMap$Entry
28:	14300	915200	org.elasticsearch.index.mapper.object.ObjectMapper
29:	2180	660520	java.lang.Object
30:	14349	573960	org.elasticsearch.common.collect.RegularImmutableMap
31:	16458	526656	org.elasticsearch.common.collect.RegularImmutableList
32:	14314	343536	org.apache.lucene.index.Term
33:	14314	343536	org.apache.lucene.util.BytesRef
34:	14293	343032	org.elasticsearch.common.collect.RegularImmutableMap$EntrySet
35:	14293	343032	org.elasticsearch.common.collect.RegularImmutableAsList
36:	14293	343032	org.elasticsearch.common.collect.ImmutableMapValues
37:	8	279936	java.util.HashMap$Entry
38:	14314	229024	java.lang.Object
39:	14314	229024	org.elasticsearch.common.lucene.search.TermFilter
40:	2164	51936	org.elasticsearch.index.mapper.ObjectMappers
41:	1	16400	java.lang.String
42:	119	8568	org.elasticsearch.index.mapper.core.StringFieldMapper
43:	1	8208	org.elasticsearch.common.jackson.core.sym.CharsToNameCanonicalizer$Bucket
44:	28	1120	org.elasticsearch.common.collect.SingletonImmutableBiMap
45:	14	728	org.elasticsearch.index.mapper.RootMapper
46:	7	728	org.elasticsearch.index.mapper.DocumentMapper
47:	7	672	org.elasticsearch.index.mapper.internal.TimestampFieldMapper
48:	28	672	org.elasticsearch.common.collect.SingletonImmutableSet
49:	7	616	org.elasticsearch.index.mapper.internal.TTLFieldMapper
50:	7	560	org.elasticsearch.index.mapper.internal.SourceFieldMapper
51:	7	560	org.elasticsearch.index.mapper.internal.SizeFieldMapper
52:	7	504	org.elasticsearch.index.mapper.object.RootObjectMapper
53:	7	504	org.elasticsearch.index.mapper.internal.BoostFieldMapper
54:	21	504	org.elasticsearch.index.analysis.FieldNameAnalyzer
55:	14	448	java.util.concurrent.locks.ReentrantLock$NonfairSync
56:	7	392	org.elasticsearch.index.mapper.internal.UidFieldMapper
57:	7	392	org.elasticsearch.index.mapper.internal.IdFieldMapper
58:	7	392	org.elasticsearch.index.mapper.internal.AllFieldMapper
59:	7	392	org.elasticsearch.index.mapper.internal.RoutingFieldMapper
60:	7	392	org.elasticsearch.index.mapper.internal.IndexFieldMapper
61:	8	384	java.util.HashMap
62:	14	336	org.elasticsearch.index.analysis.NumericDateAnalyzer
63:	14	336	java.util.concurrent.CopyOnWriteArrayList
64:	7	336	org.elasticsearch.index.mapper.internal.TypeFieldMapper
65:	14	336	org.elasticsearch.index.analysis.NumericIntegerAnalyzer
66:	14	336	org.elasticsearch.index.analysis.NumericFloatAnalyzer
67:	14	336	org.elasticsearch.common.collect.ImmutableEntry
68:	7	280	org.elasticsearch.index.mapper.DocumentFieldMappers
69:	6	240	java.util.LinkedHashMap$Entry
70:	7	224	java.util.concurrent.ConcurrentHashMap$HashEntry
71:	7	224	org.elasticsearch.index.mapper.DocumentMapper$1
72:	14	224	java.util.concurrent.locks.ReentrantLock
73:	7	168	org.elasticsearch.common.joda.DateMathParser
74:	7	168	org.elasticsearch.index.mapper.internal.UidFieldMapper$1
75:	7	168	org.elasticsearch.common.text.StringAndBytesText
76:	6	144	org.elasticsearch.common.collect.SingletonImmutableList
77:	6	144	org.elasticsearch.common.collect.Tuple
78:	7	112	org.elasticsearch.common.compress.CompressedString
79:	7	112	org.elasticsearch.index.mapper.object.DynamicTemplate
80:	7	112	org.elasticsearch.index.mapper.internal.AnalyzerMapper
81:	4	96	org.elasticsearch.common.jackson.core.sym.CharsToNameCanonicalizer$Bucket
82:	1	72	java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask
83:	2	48	java.util.ArrayList
84:	3	48	java.util.HashMap$EntrySet
85:	1	48	java.util.concurrent.ConcurrentHashMap$HashEntry
86:	1	24	java.util.concurrent.Executors$RunnableAdapter
87:	1	16	org.elasticsearch.index.mapper.FieldMapperListener$Aggregator
Total:	3631635	119473232


The environment is the following:
-- Elasticsearch v0.90 (I also tried 0.90.9; the problem still exists).
-- Java version 1.7.0_45


Just wondering: do you hit the same RAM usage when inserting without Thrift? Could you test it?

Could you also gist the output of:

curl -XGET 'http://localhost:9200/_nodes?all=true&pretty=true'

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr

On 9 January 2014 at 07:11:33, xjj210130@gmail.com (xjj210130@gmail.com) wrote:


Thanks David.

Yes, I tested it with curl. If the JSON data is not too big, there is no problem. The test JSON format is the following:

{
  "name": ["user1", "user2", "user3", ...],
  "product": {},
  "price": {}
}

The difference between the two JSON documents is that the second one includes too many keys/values, like the following:

{
  "name": ["user1", "user2", "user3", ...],
  "product": {},
  "price": {},
  "attr": {
    "user1": [{"costprice": "122"}, {"sellprice": "124"}, {"stock": "12"}, {"sell": "122"}, {}, {}],
    "user2": [{"costprice": "122"}, {"sellprice": "124"}, {"stock": "12"}, {"sell": "122"}, {}, {}],
    "user3": [{"costprice": "122"}, {"sellprice": "124"}, {"stock": "12"}, {"sell": "122"}, {}, {}],
    ...
  }
}

There are more than 3,000 items under the attr key, so it uses too much memory.
Thanks again.
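Because each user name under attr becomes its own mapped field, every distinct user grows the index mapping. One common way to avoid this (an illustration, not something prescribed in this thread) is to restructure the document so the user name is a value rather than a key, keeping the set of field names fixed:

```python
def flatten_attr(doc):
    """Turn {"attr": {"user1": [...], "user2": [...]}} into a list of
    objects with fixed field names, so new users no longer add new
    fields to the Elasticsearch mapping."""
    flattened = []
    for user, entries in doc.get("attr", {}).items():
        merged = {"user": user}
        for entry in entries:  # each entry is a small dict like {"costprice": "122"}
            merged.update(entry)
        flattened.append(merged)
    out = dict(doc)
    out["attr"] = flattened
    return out

doc = {"name": ["user1", "user2"],
       "attr": {"user1": [{"costprice": "122"}, {"stock": "12"}],
                "user2": [{"sellprice": "124"}]}}
flat = flatten_attr(doc)
# flat["attr"] now holds one object per user,
# e.g. {"user": "user1", "costprice": "122", "stock": "12"}
```

With this shape the mapping contains only a handful of fields (user, costprice, sellprice, ...) instead of thousands of per-user keys.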

On Thursday, January 9, 2014 3:15:59 PM UTC+8, David Pilato wrote:

Just wondering if you are hitting the same RAM usage when inserting
without thrift?
Could you test it?

Could you gist as well what gives:

curl -XGET 'http://localhost:9200/_nodes?all=true&pretty=true'

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet https://twitter.com/dadoonet | @elasticsearchfrhttps://twitter.com/elasticsearchfr

Le 9 janvier 2014 at 07:11:33, xjj2...@gmail.com <javascript:> (
xjj2...@gmail.com <javascript:>) a écrit:

The env is following:
--elasticseasrch v0.90( i use 0.90.9 , the problem is still exist).
-- java version is 1.7.0_45

On Wednesday, January 8, 2014 6:58:02 PM UTC+8, xjj2...@gmail.com wrote:

Dear all:
I insert 10000 logs into elasticsearch; each log is about 2 MB, and there are about 3000 keys and values.
When I insert about 20000, it uses about 30 GB of memory, and then elasticsearch is very slow, and it's hard to insert logs.
Could someone help me solve it? Thanks very much.


I see. You probably have to merge mappings with very big mappings!

What is your application searching for? Logs? Users?

--
David ;-)
Twitter : @dadoonet / @elasticsearchfr / @scrutmydocs


Thanks David.

Now I don't use a mapping. I want to do the following with elasticsearch:

1: query product information according to some keys;
2: query different users' prices for a product (the product price differs per user);
3: query some products for a user (some users don't have some products);
..., and so on.

The number of users is more than 3000.
Thanks again.
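Queries like number 2 above become straightforward once user names are values instead of field names. A sketch of what the search body could look like (my assumption, not from the thread: it uses the Elasticsearch 0.90-era `filtered` query, and the field names `attr.user` etc. follow the restructured array layout, which is hypothetical):

```python
# Sketch (assumption): build a 0.90-style filtered-query body that selects
# the "attr" entries belonging to one user. The field names are hypothetical
# and assume "attr" was restructured into an array of {"user": ..., ...}.

def price_query_for_user(user):
    """Search body selecting documents whose attr entries mention this user."""
    return {
        "query": {
            "filtered": {
                "query": {"match_all": {}},
                "filter": {"term": {"attr.user": user}},
            }
        }
    }

body = price_query_for_user("user1")
# This dict would be serialized to JSON and sent to /index/_search,
# e.g. with curl -XPOST ... -d '<json>'.
```

One caveat: with a plain object array, filters can match across different array elements, so combining conditions on `attr.user` and `attr.sellprice` in one filter may need the `nested` mapping type and a nested filter to behave as expected.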