I had already successfully integrated carrot2 with Solr 4.x and my customized Chinese tokenizer. But I ran into errors when following my earlier series of blogs (http://ylzhj02.iteye.com/blog/2152348) to adopt carrot2 into Solr 5.1.0.
The error is
org.carrot2.util.factory.FallbackFactory; Tokenizer for Chinese Simplified (zh_cn) is not available. This may degrade clustering quality of Chinese Simplified content. Cause: java.lang.NoSuchMethodError: org.apache.lucene.analysis.Tokenizer.<init>(Ljava/io/Reader;)V
The reason is that solr-5.2.1 adopted Lucene 5.1.0, while carrot2-3.10.0 was built against Lucene 4.6.0, so the jars are incompatible.
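For reference, the whole problem is a single API change: in Lucene 4.x a Tokenizer was constructed directly from a Reader, while Lucene 5.x removed that constructor and expects the reader to be attached with setReader(). A minimal illustration against the Lucene 5.x API (this is just a sketch; WhitespaceTokenizer stands in for any tokenizer, and lucene-core plus lucene-analyzers-common are assumed to be on the classpath):

import java.io.StringReader;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.core.WhitespaceTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class Lucene5TokenizerApiDemo {
    public static void main(String[] args) throws Exception {
        // Lucene 5.x: no-arg constructor; new WhitespaceTokenizer(reader) existed
        // only in Lucene 4.x, which is the constructor carrot2-3.10.0 still calls.
        Tokenizer t = new WhitespaceTokenizer();
        t.setReader(new StringReader("carrot2 clustering for solr"));
        CharTermAttribute term = t.addAttribute(CharTermAttribute.class);
        t.reset();
        while (t.incrementToken()) {
            System.out.println(term.toString());   // prints one token per line
        }
        t.end();
        t.close();
    }
}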
So the solution is to build the latest version of carrot2 (3.11.0), whose Lucene dependency is now 5.1.0:
#git clone git://github.com/carrot2/carrot2.git
#cd carrot2
step 1:
#vi core/carrot2-util-text/src/org/carrot2/text/linguistic/DefaultTokenizerFactory.java
add the import
import org.carrot2.text.linguistic.lucene.InokChineseTokenizerAdapter;
and change (around line 100)
map.put(LanguageCode.CHINESE_SIMPLIFIED,
    new NewClassInstanceFactory<ITokenizer>(ChineseTokenizerAdapter.class));
to
map.put(LanguageCode.CHINESE_SIMPLIFIED,
    new NewClassInstanceFactory<ITokenizer>(InokChineseTokenizerAdapter.class));
step 2:
write the adapter class and copy it into the lucene adapters package of the carrot2 source tree:
#vi InokChineseTokenizerAdapter.java
#cp chineseTokenizer/InokChineseTokenizerAdapter.java ./core/carrot2-util-text/src/org/carrot2/text/linguistic/lucene/
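InokChineseTokenizerAdapter is my own class, not something shipped with carrot2, so here is a rough outline of its shape (not the actual file). The important parts are that it implements org.carrot2.text.analysis.ITokenizer, so the NewClassInstanceFactory from step 1 can instantiate it through a no-arg constructor, and that it attaches the Reader with setReader() as Lucene 5.x requires; createJcsegTokenizer() is only a placeholder for the jcseg-core wiring:

package org.carrot2.text.linguistic.lucene;

import java.io.IOException;
import java.io.Reader;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.carrot2.text.analysis.ITokenizer;
import org.carrot2.text.util.MutableCharArray;

public class InokChineseTokenizerAdapter implements ITokenizer
{
    private Tokenizer tokenizer;        // jcseg-backed Lucene tokenizer
    private CharTermAttribute term;

    public short nextToken() throws IOException
    {
        // Everything jcseg emits is treated as an ordinary term here.
        if (tokenizer != null && tokenizer.incrementToken())
        {
            return ITokenizer.TT_TERM;
        }
        return ITokenizer.TT_EOF;
    }

    public void setTermBuffer(MutableCharArray array)
    {
        array.reset(term.buffer(), 0, term.length());
    }

    public void reset(Reader input) throws IOException
    {
        if (tokenizer == null)
        {
            tokenizer = createJcsegTokenizer();
            term = tokenizer.addAttribute(CharTermAttribute.class);
        }
        else
        {
            tokenizer.end();
            tokenizer.close();          // Lucene 5.x requires close() before the reader is replaced
        }
        // The reader goes in through setReader(), not the constructor,
        // which is exactly the call missing in carrot2-3.10.0.
        tokenizer.setReader(input);
        tokenizer.reset();
    }

    private Tokenizer createJcsegTokenizer()
    {
        // placeholder: build the jcseg-core 1.9.6 based Lucene Tokenizer here
        throw new UnsupportedOperationException("jcseg wiring omitted in this sketch");
    }
}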
step 3:
#mkdir lib/org.lionsoul.jcseg
lib/org.lionsoul.jcseg
├── build.properties
├── jcseg-core-1.9.6.jar
├── jcseg.LICENSE
└── META-INF
    └── MANIFEST.MF
The contents of these files are:
build.properties
bin.includes = META-INF/,\
jcseg-core-1.9.6.jar,\
jcseg.LICENSE
META-INF/MANIFEST.MF
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Jcseg Tokenizer
Bundle-SymbolicName: org.lionsoul.jcseg
Bundle-Version: 1.9.6
Bundle-ClassPath: jcseg-core-1.9.6.jar
Bundle-Vendor: INokNok Inc.
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
step 4:
modify build.xml, adding the jcseg entries to the following patternsets and the core target (surrounding lines shown for context):
141 <patternset id="lib.test">
142 <include name="core/**/*.jar" />
143 <include name="lib/**/*.jar" />
144 <include name="lib/org.lionsoul.jcseg/*.jar" />
145 <exclude name="lib/org.slf4j/slf4j-nop*" />
146 <include name="applications/carrot2-dcs/**/*.jar" />
147 <include name="applications/carrot2-webapp/lib/*.jar" />
148 <include name="applications/carrot2-benchmarks/lib/*.jar" />
149 </patternset>
173 <patternset id="lib.core">
174 <include name="lib/**/*.jar" />
175 <include name="lib/org.lionsoul.jcseg/*.jar" />
176 <include name="core/carrot2-util-matrix/lib/*.jar" />
177 <patternset refid="lib.core.excludes" />
178 </patternset>
180 <patternset id="lib.core.mini">
181 <include name="lib/**/mahout-*.jar" />
182 <include name="lib/**/jcseg*.jar" />
183 <include name="lib/**/mahout.LICENSE" />
184 <include name="lib/**/colt.LICENSE" />
185 <include name="lib/**/commons-lang*" />
186 <include name="lib/**/guava*" />
187 <include name="lib/**/jackson*" />
188 <include name="lib/**/lucene-snowball*" />
189 <include name="lib/**/lucene.LICENSE" />
190 <include name="lib/**/hppc-*.jar" />
191 <include name="lib/**/hppc*.LICENSE" />
192
193 <include name="lib/**/slf4j-api*.jar" />
194 <include name="lib/**/slf4j-nop*.jar" />
195 <include name="lib/**/slf4j.LICENSE" />
196
197 <include name="lib/**/attributes-binder-*.jar" />
198 </patternset>
199
906 <target name="core" depends="jar, jar.src, lib-no-jar.flattened" description="Builds Carrot2 Java API JAR with dependencies">
907 <delete dir="${api.dir}" failonerror="false" />
908 <mkdir dir="${api.dir}" />
909 <mkdir dir="${api.dir}/lib" />
910 <mkdir dir="${api.dir}/examples" />
911 <mkdir dir="${api.dir}/resources" />
912
913 <patternset id="carrot2.required">
914 <include name="**/jcseg*" />
915 <include name="**/commons-lang*" />
step 5:
#ant jar
#scp tmp/jar/carrot2-core-3.11.0-SNAPSHOT.jar root@192.168.0.135:/opt/solr/contrib/clustering/lib
carrot2-core-3.11.0-SNAPSHOT.jar
restart the solr server to test clustering
-----------------------------
An error happens:
org.apache.solr.common.SolrException; null:java.lang.RuntimeException: java.lang.NoClassDefFoundError: com/carrotsearch/hppc/ObjectHashSet
Solution:
#scp lib/com.carrotsearch.hppc/hppc-0.7.1.jar root@192.168.0.135:/opt/solr/contrib/clustering/lib/
hppc-0.7.1.jar
#rm -f /opt/solr/contrib/clustering/lib/hppc-0.5.2.jar
------
Another error is:
java.lang.RuntimeException: java.lang.IllegalAccessError: class com.carrotsearch.hppc.ObjectHashSet cannot access its superclass com.carrotsearch.hppc.AbstractObjectCollection
The reason is that there is an old hppc-0.5.2.jar in /opt/solr/server/webapps/solr.war
So the solution is to:
#cd /opt/solr/server/solr-webapp/webapp
#rm -f WEB-INF/lib/hppc-0.5.2.jar
#cp hppc-0.7.1.jar WEB-INF/lib
#jar cf solr.war ./
#mv solr.war /opt/solr/server/webapps
Restart solr and the error disappears.