I'm hitting OOM errors while indexing a large amount of data with Solr. I know the usual advice is to split the index into shards, but that is already the case: I'm indexing into shards, and splitting further is not an option at this point. I'd like to understand what is going on, why this error occurs, and whether there is any way to handle it other than splitting further or allocating more memory.
I'd be disappointed if memory consumption is linear (or worse) in this scenario; I'm hoping it is sublinear.
My situation: I'm indexing documents containing random strings (so the term dictionary is very large). Each document has a few fields of 20-30 characters and one field of 200-500 characters. The index in each shard is about 250-260 GB, and each Solr instance serving it has about 4 GB of memory. When an OOM occurs, the Solr heap dump looks almost the same after a restart, so it is probably related not to indexing but to the Solr searcher. Just before the OOM, the biggest objects in the heap dump look like this:
<tree type="Heap walker - Biggest objects">
<object leaf="false" class="org.apache.solr.core.SolrCore" objectId="0xf02c" type="instance" retainedBytes="120456864" retainedPercent="97.4">
<outgoing leaf="false" class="org.apache.solr.search.SolrIndexSearcher" objectId="0xfb52" type="instance" retainedBytes="120383232" retainedPercent="97.3" referenceType="not specified" referenceName="[transitive reference]">
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1018e" type="instance" retainedBytes="8161688" retainedPercent="6.6" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10185" type="instance" retainedBytes="8148072" retainedPercent="6.6" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10188" type="instance" retainedBytes="8138232" retainedPercent="6.6" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10186" type="instance" retainedBytes="8129160" retainedPercent="6.6" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10191" type="instance" retainedBytes="8124608" retainedPercent="6.6" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1018a" type="instance" retainedBytes="8123144" retainedPercent="6.6" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10192" type="instance" retainedBytes="8100904" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10190" type="instance" retainedBytes="8097984" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1018b" type="instance" retainedBytes="8096160" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1018d" type="instance" retainedBytes="8081656" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10187" type="instance" retainedBytes="8042504" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1018c" type="instance" retainedBytes="8039336" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10189" type="instance" retainedBytes="8036952" retainedPercent="6.5" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1018f" type="instance" retainedBytes="7948568" retainedPercent="6.4" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10195" type="instance" retainedBytes="832448" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10196" type="instance" retainedBytes="830584" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10194" type="instance" retainedBytes="829232" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10197" type="instance" retainedBytes="828808" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10198" type="instance" retainedBytes="827312" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10199" type="instance" retainedBytes="824736" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x1019a" type="instance" retainedBytes="822608" retainedPercent="0.7" referenceType="not specified" referenceName="[transitive reference]"/>
<outgoing leaf="false" class="org.apache.lucene.index.ReadOnlySegmentReader" objectId="0x10193" type="instance" retainedBytes="783424" retainedPercent="0.6" referenceType="not specified" referenceName="[transitive reference]"/>
<cutoff objectCount="96" totalSizeBytes="534976" maximumSingleSizeBytes="87560"/>
</outgoing>
<cutoff objectCount="53" totalSizeBytes="73496" maximumSingleSizeBytes="40992"/>
</object>
<object leaf="false" class="org.mortbay.jetty.webapp.WebAppClassLoader" objectId="0xdf88" type="instance" retainedBytes="420208" retainedPercent="0.3"/>
<object leaf="false" class="org.apache.solr.core.SolrConfig" objectId="0xe5f5" type="instance" retainedBytes="184976" retainedPercent="0.1"/>
.....
A plain jmap dump shows the following:
Attaching to process ID 27000, please wait...
Debugger attached successfully.
Server compiler detected.
JVM version is 20.5-b03
using thread-local object allocation.
Parallel GC with 2 thread(s)
Heap Configuration:
MinHeapFreeRatio = 40
MaxHeapFreeRatio = 70
MaxHeapSize = 268435456 (256.0MB)
NewSize = 1310720 (1.25MB)
MaxNewSize = 17592186044415 MB
OldSize = 5439488 (5.1875MB)
NewRatio = 2
SurvivorRatio = 8
PermSize = 21757952 (20.75MB)
MaxPermSize = 85983232 (82.0MB)
Heap Usage:
PS Young Generation
Eden Space:
capacity = 31719424 (30.25MB)
used = 17420488 (16.61347198486328MB)
free = 14298936 (13.636528015136719MB)
54.92056854500258% used
From Space:
capacity = 26673152 (25.4375MB)
used = 10550856 (10.062080383300781MB)
free = 16122296 (15.375419616699219MB)
39.55608995892199% used
To Space:
capacity = 27000832 (25.75MB)
used = 0 (0.0MB)
free = 27000832 (25.75MB)
0.0% used
PS Old Generation
capacity = 178978816 (170.6875MB)
used = 168585552 (160.7757110595703MB)
free = 10393264 (9.911788940429688MB)
94.19302002757689% used
PS Perm Generation
capacity = 42008576 (40.0625MB)
used = 41690016 (39.758697509765625MB)
free = 318560 (0.303802490234375MB)
99.24167865152106% used
I don't see anything here that suggests how to handle this other than adding more RAM, which isn't a solution in the general case. I want to know what is going on, why the searcher and its ReadOnlySegmentReaders are taking up all the memory, whether they really have to, and whether there is anything I can do about it.
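One knob worth checking (a sketch, not a confirmed fix for this case): Solr versions of this era load a sampled subset of the term dictionary onto the heap for each segment reader, and `StandardIndexReaderFactory` in `solrconfig.xml` exposes a `setTermIndexDivisor` option that loads only every Nth sampled term, trading some lookup speed for a roughly N-fold smaller resident term index. The divisor value below is illustrative, not tuned:

```xml
<!-- solrconfig.xml: load only every 4th term-index entry per segment.
     The divisor value 4 is an example, not a recommendation. -->
<indexReaderFactory name="IndexReaderFactory"
                    class="org.apache.solr.core.StandardIndexReaderFactory">
  <int name="setTermIndexDivisor">4</int>
</indexReaderFactory>
```

Since the heap dump shows the memory retained by the `ReadOnlySegmentReader` instances themselves, shrinking their per-segment term index is the most direct lever short of adding RAM.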
Update: I tested with a smaller dictionary of about 150,000 words (not random ones) and reached an index size of about 350 GB without hitting an OOME, so this is not directly tied to index size; it seems to be more related to the size of the term dictionary (the number of unique terms). But I'd still like to understand what my limits are and how to work around them.
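The unique-terms hypothesis above can be put into a rough back-of-envelope form. Lucene's in-memory term index (pre-4.0 `TermInfosReader`) holds every 128th term on the heap by default, so resident size grows with the number of unique terms, not with raw index size. The per-entry overhead constant in this sketch is an assumption for illustration, not a measured value:

```java
// Back-of-envelope estimate of heap used by Lucene's in-memory term index.
// By default every 128th term is held on-heap; a term-index divisor of N
// keeps only every Nth of those. The 48-byte per-entry overhead is assumed.
public class TermIndexEstimate {
    static long termIndexRamBytes(long uniqueTerms, int avgTermLen, int indexDivisor) {
        long interval = 128L * indexDivisor;   // effective sampling interval
        long loaded = uniqueTerms / interval;  // entries kept in RAM
        long perEntry = avgTermLen * 2L + 48L; // 2-byte Java chars + assumed overhead
        return loaded * perEntry;
    }

    public static void main(String[] args) {
        // Hypothetical: ~1.28 billion unique random terms, avg 25 chars.
        long mb = termIndexRamBytes(1_280_000_000L, 25, 1) / (1024 * 1024);
        System.out.println("divisor 1: ~" + mb + " MB of heap");
        // Raising the divisor to 4 cuts the resident term index 4x:
        long mb4 = termIndexRamBytes(1_280_000_000L, 25, 4) / (1024 * 1024);
        System.out.println("divisor 4: ~" + mb4 + " MB of heap");
    }
}
```

Under these assumptions, hundreds of millions to billions of unique random terms alone can account for most of a 4 GB heap, which matches seeing an OOME with random strings but not with a 150,000-word dictionary at an even larger index size.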