DSE 3.2 SOLR FileNotFoundException

3

Just upgraded from 3.1 to DSE 3.2 and ran the upgrade following the guide; now this exception is showing up in the logs. When querying via Solr we find data missing, but the data is there when queried with cqlsh or the CLI.

ERROR [IndexPool work thread-6] 2013-11-18 22:32:18,748 AbstractSolrSecondaryIndex.java (line 912) _yaqn8_Lucene41_0.tip
java.io.FileNotFoundException: _yaqn8_Lucene41_0.tip
    at org.apache.lucene.store.bytebuffer.ByteBufferDirectory.fileLength(ByteBufferDirectory.java:129)
    at org.apache.lucene.store.NRTCachingDirectory.sizeInBytes(NRTCachingDirectory.java:158)
    at org.apache.lucene.store.NRTCachingDirectory.doCacheWrite(NRTCachingDirectory.java:289)
    at org.apache.lucene.store.NRTCachingDirectory.createOutput(NRTCachingDirectory.java:199)
    at org.apache.lucene.store.TrackingDirectoryWrapper.createOutput(TrackingDirectoryWrapper.java:62)
    at org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.<init>(CompressingStoredFieldsWriter.java:107)
    at com.datastax.bdp.cassandra.index.solr.CassandraStoredFieldsWriter.<init>(CassandraStoredFieldsWriter.java:25)
    at com.datastax.bdp.cassandra.index.solr.CassandraStoredFieldsFormat.fieldsWriter(CassandraStoredFieldsFormat.java:39)
    at org.apache.lucene.index.StoredFieldsProcessor.initFieldsWriter(StoredFieldsProcessor.java:86)
    at org.apache.lucene.index.StoredFieldsProcessor.finishDocument(StoredFieldsProcessor.java:119)
    at org.apache.lucene.index.TwoStoredFieldsConsumers.finishDocument(TwoStoredFieldsConsumers.java:65)
    at org.apache.lucene.index.DocFieldProcessor.finishDocument(DocFieldProcessor.java:274)
    at org.apache.lucene.index.DocumentsWriterPerThread.updateDocument(DocumentsWriterPerThread.java:274)
    at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:376)
    at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1485)
    at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:201)
    at com.datastax.bdp.cassandra.index.solr.CassandraDirectUpdateHandler2.addDoc(CassandraDirectUpdateHandler2.java:103)
    at com.datastax.bdp.cassandra.index.solr.AbstractSolrSecondaryIndex.doIndex(AbstractSolrSecondaryIndex.java:929)
    at com.datastax.bdp.cassandra.index.solr.AbstractSolrSecondaryIndex.doUpdateOrDelete(AbstractSolrSecondaryIndex.java:586)
    at com.datastax.bdp.cassandra.index.solr.ThriftSolrSecondaryIndex.updateColumnFamilyIndex(ThriftSolrSecondaryIndex.java:114)
    at com.datastax.bdp.cassandra.index.solr.AbstractSolrSecondaryIndex$3.run(AbstractSolrSecondaryIndex.java:896)
    at com.datastax.bdp.cassandra.index.solr.concurrent.IndexWorker.run(IndexWorker.java:38)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:724)

Also this:

ERROR 22:53:01,426 auto commit error...:org.apache.solr.common.SolrException: org.apache.solr.common.SolrException: Error opening new searcher
    at com.datastax.bdp.cassandra.index.solr.CassandraDirectUpdateHandler2.commit(CassandraDirectUpdateHandler2.java:318)
    at org.apache.solr.update.CommitTracker.run(CommitTracker.java:216)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.solr.common.SolrException: Error opening new searcher
    at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1457)
    at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1569)
    at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:557)
    at com.datastax.bdp.cassandra.index.solr.CassandraDirectUpdateHandler2.commit(CassandraDirectUpdateHandler2.java:276)
    ... 9 more
Caused by: java.io.FileNotFoundException: _xfgfw_Lucene41_0.tim
    at org.apache.lucene.store.bytebuffer.ByteBufferDirectory.fileLength(ByteBufferDirectory.java:129)
    at org.apache.lucene.store.NRTCachingDirectory.sizeInBytes(NRTCachingDirectory.java:158)
    at org.apache.lucene.store.NRTCachingDirectory.doCacheWrite(NRTCachingDirectory.java:289)
    at org.apache.lucene.store.NRTCachingDirectory.createOutput(NRTCachingDirectory.java:199)
    at org.apache.lucene.store.TrackingDirectoryWrapper.createOutput(TrackingDirectoryWrapper.java:62)
    at org.apache.lucene.codecs.lucene42.Lucene42FieldInfosWriter.write(Lucene42FieldInfosWriter.java:49)
    at org.apache.lucene.index.DocFieldProcessor.flush(DocFieldProcessor.java:88)
    at org.apache.lucene.index.DocumentsWriterPerThread.flush(DocumentsWriterPerThread.java:493)
    at org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:422)
    at org.apache.lucene.index.DocumentsWriter.flushAllThreads(DocumentsWriter.java:559)
    at org.apache.lucene.index.IndexWriter.getReader(IndexWriter.java:365)
    at org.apache.lucene.index.StandardDirectoryReader.doOpenFromWriter(StandardDirectoryReader.java:270)
    at org.apache.lucene.index.StandardDirectoryReader.doOpenIfChanged(StandardDirectoryReader.java:255)
    at org.apache.lucene.index.DirectoryReader.openIfChanged(DirectoryReader.java:250)
    at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1393)
    ... 12 more
4 Answers

5

This is a known issue, resolved in DSE 3.2.1.

We just released 3.2.1, which should fix your problem. Our developers were able to reproduce the stack trace and resolved the issue. We also fixed a problem where indexes were not processed correctly after a restart.


We lost data when going through the Solr interface. Does that mean I need to rebuild? - Russ Bradberry
Thanks for the info. This is exactly why I asked this question: http://stackoverflow.com/questions/20051932/cassandra-solr-rolling-upgrade - Russ Bradberry
Just to be clear, what's fixed in DSE 3.2.1 is the exception being thrown. The missing data needs to be investigated separately. - Sven Delmas
I see, okay. I'm doing a full reindex now. I'll let you know if that fixes it. - Russ Bradberry
The reindex fixed the missing data, but we still get the exceptions. Do these exceptions have any side effects? - Russ Bradberry

3
A workaround is to change your Solr config to use the following (we are working on a fix):
<directoryFactory name="DirectoryFactory" class="solr.StandardDirectoryFactory"/>
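For context, this element belongs in the core's solrconfig.xml; a minimal fragment showing the placement (surrounding elements elided, and the original NRT-caching factory shown commented out as an assumption about the prior config):

```xml
<config>
  <!-- Workaround: use the plain on-disk directory implementation -->
  <directoryFactory name="DirectoryFactory" class="solr.StandardDirectoryFactory"/>
  <!-- previously (assumed): class="solr.NRTCachingDirectoryFactory" -->
</config>
```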

Does this require a DSE restart or an index rebuild? - Russ Bradberry
Made the change but am still seeing the exceptions; is a restart or a reindex required? - Russ Bradberry
Rebuilding now, will let you know. - Russ Bradberry
After reindexing, the logs still show the errors. - Russ Bradberry
Is this the same stack trace as before (which one), and with the DirectoryFactory changed? - Sven Delmas

3

How do I do a full reindex with deletes? Just use nodetool rebuild_index my_ks my_cf my_ks.my_cf. - Russ Bradberry
Also, I'm not sure why things weren't flushed properly. I ran nodetool drain before restarting each node. - Russ Bradberry
The full reindex fixed the missing data, but we still get the exceptions. - Russ Bradberry
The exceptions should be harmless, just annoying in the logs. - Zanson
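The flush-then-reindex sequence from these comments can be sketched as a small per-node script; a minimal sketch, assuming the keyspace/table names from the comment above (my_ks, my_cf) and that nodetool is on the PATH. The commands are echoed rather than executed, since they must run against a live DSE node:

```shell
# Names taken from the comment above; substitute your own keyspace/table.
KEYSPACE=my_ks
TABLE=my_cf
# DSE Search secondary indexes are addressed as <keyspace>.<table> here.
INDEX="${KEYSPACE}.${TABLE}"

# Flush memtables to disk first, then rebuild the Solr secondary index.
# (echoed for safety; drop the 'echo' to actually run on a node)
echo "nodetool drain"
echo "nodetool rebuild_index ${KEYSPACE} ${TABLE} ${INDEX}"
```

Run on each node in turn; as noted in the comments, draining before a restart does not by itself guarantee the Solr index is consistent, so the rebuild_index step is the one that actually restored the missing data.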

0
If the problem still persists, the CF needs to be reindexed.

Content sourced from Stack Overflow.