Downloading Hadoop with Git into a Local Eclipse Development Environment
Problem Scenario
Following the official guide at http://wiki.apache.org/hadoop/EclipseEnvironment, downloading Hadoop to your machine and building an Eclipse development environment takes only three commands:
$ git clone git://git.apache.org/hadoop-common.git
$ mvn install -DskipTests
$ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true
But when I ran the second command locally, it failed with the following error log:
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-common ---
[INFO] Executing tasks
main:
     [exec] target/compile-proto.sh: line 17: protoc: command not found
     [exec] target/compile-proto.sh: line 17: protoc: command not found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [2.389s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.698s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.761s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.729s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.353s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [1.998s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.227s]
[INFO] Apache Hadoop Common .............................. FAILURE [1.132s]
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] hadoop-yarn ....................................... SKIPPED
[INFO] hadoop-yarn-api ................................... SKIPPED
[INFO] hadoop-yarn-common ................................ SKIPPED
[INFO] hadoop-yarn-server ................................ SKIPPED
[INFO] hadoop-yarn-server-common ......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
[INFO] hadoop-yarn-server-tests .......................... SKIPPED
[INFO] hadoop-mapreduce-client ........................... SKIPPED
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED
[INFO] hadoop-yarn-applications .......................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
[INFO] hadoop-yarn-site .................................. SKIPPED
[INFO] hadoop-mapreduce-client-common .................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.483s
[INFO] Finished at: Mon Jan 30 22:57:23 GMT+08:00 2012
[INFO] Final Memory: 24M/81M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
At this point I had not yet analyzed the exact cause or found a solution, so I simply noted the error down.
Analysis
I re-ran the command with the -e switch to print the full error:
$ mvn install -DskipTests -e
This produced the detailed error output:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9.387s
[INFO] Finished at: Mon Jan 30 23:11:07 GMT+08:00 2012
[INFO] Final Memory: 19M/81M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 127
	at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
	... 19 more
Caused by: /Users/apple/Documents/Hadoop-common-dev/hadoop-common/hadoop-common-project/hadoop-common/target/antrun/build-main.xml:23: exec returned: 127
	at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
	at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
	at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
	at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:390)
	at org.apache.tools.ant.Target.performTasks(Target.java:411)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
	at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
	... 21 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
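The status in `exec returned: 127` is itself a strong clue: a POSIX shell exits with status 127 when a command cannot be found, which matches the `protoc: command not found` lines earlier in the log. A minimal sketch to confirm this behavior:

```shell
# A shell exits with status 127 when the requested command does not exist;
# this is the same status the Ant <exec> task reported back to Maven.
sh -c 'definitely-not-a-real-command' 2>/dev/null
echo "exit status: $?"   # prints: exit status: 127
```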
With this error output, tracking down a solution becomes much easier.
Following the hint in the log, I visited https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException, which gives this explanation:
Unlike many other errors, this exception is not generated by the Maven core itself but by a plugin. As a rule of thumb, plugins use this error to signal a problem in their configuration or the information they retrieved from the POM.
In other words: this error does not come from Maven itself; as a rule of thumb, a plugin throws this exception to signal a problem in its configuration or in the information it retrieved from the POM.
The next step was to check which plugins the Maven build of Hadoop was using.
The error log shows the build used the maven-antrun-plugin. Since the failure occurred while compiling hadoop-common, I looked into the pom.xml of the hadoop-common project and found the following:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>compile-proto</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo file="target/compile-proto.sh">
            PROTO_DIR=src/main/proto
            JAVA_DIR=target/generated-sources/java
            which cygpath 2> /dev/null
            if [ $? = 1 ]; then
              IS_WIN=false
            else
              IS_WIN=true
              WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
              WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
            fi
            mkdir -p $JAVA_DIR 2> /dev/null
            for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
            do
              if [ "$IS_WIN" = "true" ]; then
                protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
              else
                protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
              fi
            done
          </echo>
          <exec executable="sh" dir="${basedir}" failonerror="true">
            <arg line="target/compile-proto.sh"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>compile-test-proto</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo file="target/compile-test-proto.sh">
            PROTO_DIR=src/test/proto
            JAVA_DIR=target/generated-test-sources/java
            which cygpath 2> /dev/null
            if [ $? = 1 ]; then
              IS_WIN=false
            else
              IS_WIN=true
              WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
              WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
            fi
            mkdir -p $JAVA_DIR 2> /dev/null
            for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
            do
              if [ "$IS_WIN" = "true" ]; then
                protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
              else
                protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
              fi
            done
          </echo>
          <exec executable="sh" dir="${basedir}" failonerror="true">
            <arg line="target/compile-test-proto.sh"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>save-version</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <mkdir dir="${project.build.directory}/generated-sources/java"/>
          <exec executable="sh">
            <arg line="${basedir}/dev-support/saveVersion.sh ${project.version} ${project.build.directory}/generated-sources/java"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>generate-test-sources</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <mkdir dir="${project.build.directory}/generated-test-sources/java"/>
          <taskdef name="recordcc" classname="org.apache.hadoop.record.compiler.ant.RccTask">
            <classpath refid="maven.compile.classpath"/>
          </taskdef>
          <recordcc destdir="${project.build.directory}/generated-test-sources/java">
            <fileset dir="${basedir}/src/test/ddl" includes="**/*.jr"/>
          </recordcc>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>create-log-dir</id>
      <phase>process-test-resources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- TODO: there are tests (TestLocalFileSystem#testCopy) that fail if data
               TODO: from a previous run is present -->
          <delete dir="${test.build.data}"/>
          <mkdir dir="${test.build.data}"/>
          <mkdir dir="${hadoop.log.dir}"/>
          <copy toDir="${project.build.directory}/test-classes">
            <fileset dir="${basedir}/src/main/conf"/>
          </copy>
        </target>
      </configuration>
    </execution>
    <execution>
      <phase>pre-site</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <copy file="src/main/resources/core-default.xml" todir="src/site/resources"/>
          <copy file="src/main/xsl/configuration.xsl" todir="src/site/resources"/>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>
This is the Ant plugin configuration Maven uses in the pom.xml. It contains the line:
<echo file="target/compile-proto.sh">
This looked puzzling at first, but the HowToContribute article (http://wiki.apache.org/hadoop/HowToContribute) suggests the likely cause: Protocol Buffers was not installed locally, because the article states explicitly:
Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work.
So the plan was to install Protocol Buffers locally and then build again.
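Before re-running the build, it is worth confirming that protoc is actually visible on the PATH of the shell that Maven's Ant `<exec>` task will spawn — a minimal check sketch (the messages are my own wording, not from the Hadoop docs):

```shell
# Check whether the protoc compiler is on the PATH; a missing protoc is
# exactly what produces the "exec returned: 127" failure during the build.
if command -v protoc >/dev/null 2>&1; then
  echo "protoc found: $(protoc --version)"
else
  echo "protoc missing - install Protocol Buffers before building" >&2
fi
```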
As expected, once Protocol Buffers was installed locally, the remaining two commands ran to completion. All that was left was to import the projects under the directory into Eclipse as the official guide describes; after that the source can be studied and debugged in Eclipse.
Comments
#2
jnh
2014-03-27
Nicely written. The title has an extra "的".
Official docs: http://wiki.apache.org/hadoop/EclipseEnvironment
#1
camlelxy
2013-12-17
It didn't work for me with the OP's configuration. Following the docs:
Setup
-----
Install protobuf 2.5.0 (Download from http://code.google.com/p/protobuf/downloads/list)
- install the protoc executable (configure, make, make install)
- install the maven artifact (cd java; mvn install)
Then:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
The build only succeeded after adding this line.
Building on Ubuntu with two terminal windows open caused problems for me: after installing protobuf, the build in the hadoop-common directory must run in the same session.