
Lucene-2.2.0 Source Code Reading (16)


While studying the index deletion policy (IndexDeletionPolicy), the concept of a commit point (IndexCommitPoint) came up: at the appropriate moments, as the policy dictates, these commit points must be deleted.

So what exactly characterizes these commit points (IndexCommitPoint)?

IndexCommitPoint is the interface for an index commit point. Its definition is very simple:

package org.apache.lucene.index;

public interface IndexCommitPoint {

  /**
   * Get the segments file (named segments_N) associated with this commit point.
   * For example, when we instantiated an IndexWriter in an earlier test,
   * the indexing process produced such a segments file; see part (11) of
   * this series, where the generated segments file was segments_1, about 1 KB.
   */
  public String getSegmentsFileName();

  // Delete the segments file associated with this commit point
  public void delete();
}
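This small interface is the entire surface a deletion policy sees. As a sketch of how a policy uses it, the example below keeps only the newest commit by calling delete() on every other commit point it is handed. Note the interface and classes here (CommitPointView, KeepOnlyLastCommitSketch, FakeCommit) are illustrative stand-ins, not the real Lucene types, so the example runs without the Lucene jar:

```java
import java.util.List;

// Stand-in for org.apache.lucene.index.IndexCommitPoint (illustration only)
interface CommitPointView {
  String getSegmentsFileName();
  void delete();
}

class KeepOnlyLastCommitSketch {
  // Mirrors what KeepOnlyLastCommitDeletionPolicy does in its
  // onInit/onCommit callbacks: the commits list is kept sorted by
  // generation, so everything except the last entry is deleted.
  static void onCommit(List<CommitPointView> commits) {
    for (int i = 0; i < commits.size() - 1; i++) {
      commits.get(i).delete();
    }
  }
}

// A fake commit point that just records whether delete() was called
class FakeCommit implements CommitPointView {
  final String name;
  boolean deleted;
  FakeCommit(String name) { this.name = name; }
  public String getSegmentsFileName() { return name; }
  public void delete() { deleted = true; }
}
```

The real KeepOnlyLastCommitDeletionPolicy (the default policy, as we will see below) follows exactly this shape.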

The class that implements the IndexCommitPoint interface is CommitPoint. CommitPoint is a final class, and it is defined as an inner class of IndexFileDeleter. Commit points therefore exist only in the context of an IndexFileDeleter: an IndexFileDeleter instance can only be constructed once some index-file deletion policy has been chosen, and a deleter without a deletion policy would have no practical use, let alone any commit points (IndexCommitPoint) to manage.

The IndexWriter class defines the member field:

private IndexFileDeleter deleter;

In other words, instantiating an IndexWriter necessarily initializes an IndexFileDeleter. The writer's initialization is carried out by IndexWriter's init method; the class defines two overloads of init, declared as follows:

private void init(Directory d, Analyzer a, boolean closeDir, IndexDeletionPolicy deletionPolicy, boolean autoCommit)
    throws CorruptIndexException, LockObtainFailedException, IOException ;

private void init(Directory d, Analyzer a, final boolean create, boolean closeDir, IndexDeletionPolicy deletionPolicy, boolean autoCommit)
    throws CorruptIndexException, LockObtainFailedException, IOException;

Of these, the second init method is the important one: it is where the writer's initialization actually happens. The first init merely calls the static IndexReader method:

public static boolean indexExists(Directory directory) throws IOException

to check whether index files already exist in the specified index directory, and then indirectly calls the second init to initialize the IndexWriter.

The various IndexWriter constructors then call one of these two init methods, as appropriate, to perform their initialization.

In the second init method above, an IndexFileDeleter is instantiated according to the specified index-file deletion policy:

deleter = new IndexFileDeleter(directory, deletionPolicy == null ? new KeepOnlyLastCommitDeletionPolicy() : deletionPolicy, segmentInfos, infoStream);

Here infoStream is an instance of PrintStream, which extends FilterOutputStream; it is an output stream the deleter uses for diagnostic messages.

If deletionPolicy is null, i.e. the writer was constructed without an explicit deletion policy, KeepOnlyLastCommitDeletionPolicy is used as the default; otherwise the supplied deletionPolicy is used.

Since every IndexWriter is tied to an IndexFileDeleter in this way, it is worth examining the IndexFileDeleter class in detail. Its important inner class CommitPoint is deferred until afterwards:

package org.apache.lucene.index;

import org.apache.lucene.index.IndexFileNames;
import org.apache.lucene.index.SegmentInfos;
import org.apache.lucene.index.SegmentInfo;
import org.apache.lucene.store.Directory;

import java.io.IOException;
import java.io.PrintStream;
import java.util.Map;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.ArrayList;
import java.util.Collections;

//    Manages the deletion of index files in the Directory used while building the index

// NOTE: the caller must hold the write.lock before instantiating IndexFileDeleter

final class IndexFileDeleter {

// Files whose deletion failed (due to I/O errors, etc.) are kept in the
// deletable list so deletion can be retried later
private List deletable;

// Reference counts for index files: the key is the file name, the value
// is the number of times that file is currently referenced
private Map refCounts = new HashMap();

// Commit points currently present in the index directory
private List commits = new ArrayList();

// Files referenced at a checkpoint that did not produce a commit point;
// their reference counts are decremented at the next checkpoint/commit
private List lastFiles = new ArrayList();

// Commit points that the deletion policy has marked for deletion
private List commitsToDelete = new ArrayList();

private PrintStream infoStream;
private Directory directory;
private IndexDeletionPolicy policy;

void setInfoStream(PrintStream infoStream) {
    this.infoStream = infoStream;
}

private void message(String message) {
    infoStream.println(this + " " + Thread.currentThread().getName() + ": " + message);
}

//================ IndexFileDeleter constructor: begin ================

// Construct an IndexFileDeleter instance; the constructor does a substantial amount of initialization work
public IndexFileDeleter(Directory directory, IndexDeletionPolicy policy, SegmentInfos segmentInfos, PrintStream infoStream)
    throws CorruptIndexException, IOException {

    this.infoStream = infoStream;
    this.policy = policy;
    this.directory = directory;

    // First pass over the files in the index directory: register each
    // index file with an initial reference count of 0
    long currentGen = segmentInfos.getGeneration();    // generation (N) of the current segments_N file

    // Obtain the filter that recognizes Lucene index file names
    IndexFileNameFilter filter = IndexFileNameFilter.getFilter();

    String[] files = directory.list();
    if (files == null)
      throw new IOException("cannot read directory " + directory + ": list() returned null");

    CommitPoint currentCommitPoint = null;

    for(int i=0;i<files.length;i++) {

      String fileName = files[i];

      if (filter.accept(null, fileName) && !fileName.equals(IndexFileNames.SEGMENTS_GEN)) {

       // IndexFileNames.SEGMENTS_GEN is the constant "segments.gen"; see part (11) of this series for the generated segments.gen file

        // The file is a valid index file: register it with an initial reference count of 0
        getRefCount(fileName);

        if (fileName.startsWith(IndexFileNames.SEGMENTS)) {

          // This is a commit (segments or segments_N), and
          // it's valid (<= the max gen). Load it, then
          // incref all files it refers to:
          if (SegmentInfos.generationFromSegmentsFileName(fileName) <= currentGen) {
            if (infoStream != null) {
              message("init: load commit \"" + fileName + "\"");
            }
            SegmentInfos sis = new SegmentInfos();
            sis.read(directory, fileName);
            CommitPoint commitPoint = new CommitPoint(sis);
            if (sis.getGeneration() == segmentInfos.getGeneration()) {
              currentCommitPoint = commitPoint;
            }
            commits.add(commitPoint);
            incRef(sis, true);
          }
        }
      }
    }

    if (currentCommitPoint == null) {
      throw new CorruptIndexException("failed to locate current segments_N file");
    }

   // Sort the commit points by generation
    Collections.sort(commits);

    // Delete index files whose reference count is 0.
    Iterator it = refCounts.keySet().iterator();
    while(it.hasNext()) {
      String fileName = (String) it.next();
      RefCount rc = (RefCount) refCounts.get(fileName);
      if (0 == rc.count) {
        if (infoStream != null) {
          message("init: removing unreferenced file \"" + fileName + "\"");
        }
        deleteFile(fileName);
      }
    }

    // Give the deletion policy a chance to remove commits when the writer starts up
    policy.onInit(commits);

    // If the policy deleted the commit point the writer started from, checkpoint the
    // in-memory SegmentInfos so they no longer reference the deleted files
    if (currentCommitPoint.deleted) {
      checkpoint(segmentInfos, false);
    }
   
    deleteCommits();    // carry out the pending deletions
}

//================ IndexFileDeleter constructor: end ================

//   Remove the commit points the deletion policy marked for deletion: decref every
//   file each of them references, then compact the commits list
private void deleteCommits() throws IOException {

    int size = commitsToDelete.size();

    if (size > 0) {

      // First decref all files that had been referred to by
      // the now-deleted commits:
      for(int i=0;i<size;i++) {
        CommitPoint commit = (CommitPoint) commitsToDelete.get(i);
        if (infoStream != null) {
          message("deleteCommits: now remove commit \"" + commit.getSegmentsFileName() + "\"");
        }
        int size2 = commit.files.size();
        for(int j=0;j<size2;j++) {
          decRef((List) commit.files.get(j));
        }
        decRef(commit.getSegmentsFileName());
      }
      commitsToDelete.clear();

      // Now compact commits to remove deleted ones (preserving order):
      size = commits.size();
      int readFrom = 0;
      int writeTo = 0;
      while(readFrom < size) {
        CommitPoint commit = (CommitPoint) commits.get(readFrom);
        if (!commit.deleted) {
          if (writeTo != readFrom) {
            commits.set(writeTo, commits.get(readFrom));
          }
          writeTo++;
        }
        readFrom++;
      }

      while(size > writeTo) {
        commits.remove(size-1);
        size--;
      }
    }
}

/**
   * Scan the index directory for index files that are no longer referenced,
   * e.g. files left behind when an exception interrupted a complex operation,
   * and delete them to free disk space.
   */

public void refresh() throws IOException {
    String[] files = directory.list();
    if (files == null)
      throw new IOException("cannot read directory " + directory + ": list() returned null");
    IndexFileNameFilter filter = IndexFileNameFilter.getFilter();
    for(int i=0;i<files.length;i++) {
      String fileName = files[i];
      if (filter.accept(null, fileName) && !refCounts.containsKey(fileName) && !fileName.equals(IndexFileNames.SEGMENTS_GEN)) {
        // The file passed the filter but is unreferenced: it is a stale leftover, so delete it
        if (infoStream != null) {
          message("refresh: removing newly created unreferenced file \"" + fileName + "\"");
        }
        deleteFile(fileName);
      }
    }
}

/**
   * For the definition of "check point" see the IndexWriter comments.
   * Increfs the files referenced by the new SegmentInfos and decrefs the
   * files recorded at the previous checkpoint; if this checkpoint is a
   * commit, the deletion policy is also invoked, and the files of any
   * commits it removes are decref'd as well.
   */

public void checkpoint(SegmentInfos segmentInfos, boolean isCommit) throws IOException {

    if (infoStream != null) {
      message("now checkpoint \"" + segmentInfos.getCurrentSegmentFileName() + "\" [isCommit = " + isCommit + "]");
    }

    // Try again now to delete any previously un-deletable
    // files (because they were in use, on Windows):

    if (deletable != null) {
      List oldDeletable = deletable;
      deletable = null;
      int size = oldDeletable.size();
      for(int i=0;i<size;i++) {
        deleteFile((String) oldDeletable.get(i));
      }
    }

    // Incref the files:
    incRef(segmentInfos, isCommit);

    if (isCommit) {
      // Append to our commits list:
      commits.add(new CommitPoint(segmentInfos));

      // Tell policy so it can remove commits:
      policy.onCommit(commits);

      // Decref files for commits that were deleted by the policy:
      deleteCommits();
    }

    // DecRef old files from the last checkpoint, if any:
    int size = lastFiles.size();
    if (size > 0) {
      for(int i=0;i<size;i++) {
        decRef((List) lastFiles.get(i));
      }
      lastFiles.clear();
    }

    if (!isCommit) {
      // Save files so we can decr on next checkpoint/commit:
      size = segmentInfos.size();
      for(int i=0;i<size;i++) {
        SegmentInfo segmentInfo = segmentInfos.info(i);
        if (segmentInfo.dir == directory) {
          lastFiles.add(segmentInfo.files());
        }
      }
    }
}

void incRef(SegmentInfos segmentInfos, boolean isCommit) throws IOException {
    int size = segmentInfos.size();
    for(int i=0;i<size;i++) {
      SegmentInfo segmentInfo = segmentInfos.info(i);
      if (segmentInfo.dir == directory) {
        incRef(segmentInfo.files());
      }
    }

    if (isCommit) {
     
      // Since this is a commit point, also incref its
      // segments_N file:
      getRefCount(segmentInfos.getCurrentSegmentFileName()).IncRef();
    }
}

// Increment the reference count of every index file in the files list

private void incRef(List files) throws IOException {
    int size = files.size();
    for(int i=0;i<size;i++) {
      String fileName = (String) files.get(i);
      RefCount rc = getRefCount(fileName);
      if (infoStream != null) {
        message(" IncRef \"" + fileName + "\": pre-incr count is " + rc.count);
      }
      rc.IncRef();
    }
}

// Decrement the reference count of every index file in the files list

private void decRef(List files) throws IOException {
    int size = files.size();
    for(int i=0;i<size;i++) {
      decRef((String) files.get(i));
    }
}

// Decrement the reference count of a single index file

private void decRef(String fileName) throws IOException {
    RefCount rc = getRefCount(fileName);
    if (infoStream != null) {
      message(" DecRef \"" + fileName + "\": pre-decr count is " + rc.count);
    }
    if (0 == rc.DecRef()) {
      // The file's reference count has reached 0: it is no longer referenced
      // by any commit, so delete it
      deleteFile(fileName);
      refCounts.remove(fileName);
    }
}

void decRef(SegmentInfos segmentInfos) throws IOException {
    final int size = segmentInfos.size();
    for(int i=0;i<size;i++) {
      SegmentInfo segmentInfo = segmentInfos.info(i);
      if (segmentInfo.dir == directory) {
        decRef(segmentInfo.files());
      }
    }
}

// Get, or lazily create, the RefCount instance that tracks the named index file

private RefCount getRefCount(String fileName) {
    RefCount rc;
    if (!refCounts.containsKey(fileName)) {
      rc = new RefCount();
      refCounts.put(fileName, rc);
    } else {
      rc = (RefCount) refCounts.get(fileName);
    }
    return rc;
}

// Delete the index file fileName from the Directory directory

private void deleteFile(String fileName)
       throws IOException {
    try {
      if (infoStream != null) {    // if diagnostic output is enabled
        message("delete \"" + fileName + "\"");
      }
      directory.deleteFile(fileName);
    } catch (IOException e) {
      // Deletion failed
      if (directory.fileExists(fileName)) {

        // The file still remains in the index directory (for example, because
        // it is still in use on Windows); log the failure and retry the
        // deletion later

        if (infoStream != null) {
          message("IndexFileDeleter: unable to remove file \"" + fileName + "\": " + e.toString() + "; Will re-try later.");
        }
        if (deletable == null) {
          // Record the failed deletion in the deletable list
          deletable = new ArrayList();
        }
        deletable.add(fileName);
      }
    }
}

/**
   * Blindly delete the files used by the specific segments,
   * with no reference counting and no retry. This is only
   * currently used by writer to delete its RAM segments
   * from a RAMDirectory.
   */

public void deleteDirect(Directory otherDir, List segments) throws IOException {
    int size = segments.size();
    for(int i=0;i<size;i++) {
      List filestoDelete = ((SegmentInfo) segments.get(i)).files();
      int size2 = filestoDelete.size();
      for(int j=0;j<size2;j++) {
        otherDir.deleteFile((String) filestoDelete.get(j));
      }
    }
}

//  RefCount tracks the reference count of one index file; a file whose count
//  is 0 is not referenced by anything and should be deleted as garbage
final private static class RefCount {   

    int count;

    final private int IncRef() {    // increment the count
      return ++count;
    }

    final private int DecRef() {    // decrement the count
      return --count;
    }
}

}
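The incRef/decRef bookkeeping above can be reproduced in isolation. The sketch below (hypothetical class, not a Lucene type) keeps a name-to-count map and "deletes" a file the moment its count drops to zero, just as decRef(String) calls deleteFile when DecRef() returns 0:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal model of IndexFileDeleter's reference counting
class RefCountSketch {
  private final Map<String, Integer> counts = new HashMap<>();
  final List<String> deletedFiles = new ArrayList<>();

  // Corresponds to incRef(String): count starts at 0 on first sight
  void incRef(String file) {
    counts.merge(file, 1, Integer::sum);
  }

  // Corresponds to decRef(String): delete as soon as the count hits 0
  void decRef(String file) {
    int c = counts.get(file) - 1;
    if (c == 0) {
      counts.remove(file);              // refCounts.remove(fileName)
      deletedFiles.add(file);           // stands in for directory.deleteFile()
    } else {
      counts.put(file, c);
    }
  }
}
```

With two references held, one decRef leaves the file alive; the second makes it garbage and removes it, which is exactly the lifecycle a segment file goes through as commits referencing it are deleted.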

Now consider the inner class CommitPoint (a concrete implementation of the IndexCommitPoint interface) on its own:

/**
   * Holds the details of each commit point, so that the deletion policy can
   * be applied to them conveniently.
   * Implements Comparable so that commit points held in a list can be
   * sorted by generation.
   */

final private class CommitPoint implements Comparable, IndexCommitPoint {

    long gen;    // generation (N) of this commit's segments_N file
    List files;    // per-segment file lists for the segments that live in this index directory
    String segmentsFileName;    // name of this commit's segments_N file
    boolean deleted;    // set once this commit point has been marked for deletion

    public CommitPoint(SegmentInfos segmentInfos) throws IOException {
      segmentsFileName = segmentInfos.getCurrentSegmentFileName();
      int size = segmentInfos.size();    // segmentInfos is a vector of SegmentInfo instances
      files = new ArrayList(size);
      gen = segmentInfos.getGeneration();    // generation (N) of this commit's segments_N file
      for(int i=0;i<size;i++) {
        // Take each SegmentInfo from the segmentInfos vector
        SegmentInfo segmentInfo = segmentInfos.info(i);
        if (segmentInfo.dir == directory) {
          // The segment belongs to this index directory: add its files to the list
          files.add(segmentInfo.files());
        }
      }
    }

    /**
     * Get the segments_N file associated with this commit point.
     */

    public String getSegmentsFileName() {
      return segmentsFileName;
    }

    /**
     * Mark this commit point as deleted and queue it in commitsToDelete;
     * the actual deletion happens later in deleteCommits().
     */

    public void delete() {
      if (!deleted) {
        deleted = true;
        commitsToDelete.add(this);
      }
    }

    public int compareTo(Object obj) {
      CommitPoint commit = (CommitPoint) obj;
      if (gen < commit.gen) {
        return -1;
      } else if (gen > commit.gen) {
        return 1;
      } else {
        return 0;
      }
    }
}
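Commit points sort by generation via compareTo above, and deleteCommits() then compacts the sorted commits list with a read/write two-pointer pass so that the surviving entries keep their order without rebuilding the list. The same pattern on plain data (hypothetical helper class, not a Lucene type) looks like this:

```java
import java.util.List;

class CompactSketch {
  // Remove every element contained in `deleted`, in place, preserving the
  // relative order of the survivors; mirrors the readFrom/writeTo loop in
  // IndexFileDeleter.deleteCommits().
  static void compact(List<String> items, List<String> deleted) {
    int size = items.size();
    int readFrom = 0;
    int writeTo = 0;
    while (readFrom < size) {
      String item = items.get(readFrom);
      if (!deleted.contains(item)) {
        if (writeTo != readFrom) {
          items.set(writeTo, item);    // shift survivor left into the gap
        }
        writeTo++;
      }
      readFrom++;
    }
    // Trim the now-unused tail
    while (size > writeTo) {
      items.remove(size - 1);
      size--;
    }
  }
}
```

This is O(n) with no extra allocation, which is why IndexFileDeleter prefers it over removing elements from the middle of the list one by one.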

