Environment: Windows 7, 32-bit
1. Edit hadoop-1.0.3\src\contrib\build-contrib.xml and add <property name="version" value="1.0.3"/> and <property name="eclipse.home" value="d:\\eclipse"/>.
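For orientation, the two properties go directly inside the top-level <project> element of build-contrib.xml. The sketch below elides the file's real attributes and targets; the eclipse.home value is just the local Eclipse path used in this walkthrough:

```xml
<!-- src/contrib/build-contrib.xml (excerpt; existing attributes and
     targets elided) -->
<project>
  <!-- added so the plugin build can resolve ${version} and locate Eclipse -->
  <property name="version" value="1.0.3"/>
  <property name="eclipse.home" value="d:\\eclipse"/>
  <!-- ... rest of the file unchanged ... -->
</project>
```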
2. Run ant compile -logfile error.log in the hadoop-1.0.3 directory. The following errors appear:
compile-hdfs-classes:
[javac] D:\hadoop\hadoop-1.0.3\build.xml:576: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 1 source file to D:\hadoop\hadoop-1.0.3\build\classes
[javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:5: unclosed string literal
[javac] user="jackdministrator
[javac] ^
[javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:6: class, interface, or enum expected
[javac] ", date="Sat Jul 7 21:00:12 2012", url="",
[javac] ^
[javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:6: class, interface, or enum expected
[javac] ", date="Sat Jul 7 21:00:12 2012", url="",
[javac] ^
[javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:6: unclosed string literal
[javac] ", date="Sat Jul 7 21:00:12 2012", url="",
[javac] ^
[javac] 4 errors
Fix: edit D:\hadoop\hadoop-1.0.3\src\saveVersion.sh and change user=`whoami` to user="hadoop" (plain double quotes — keeping the backticks would make the shell try to execute a hadoop command). The likely cause is that whoami on Windows returns a DOMAIN\user value whose backslash corrupts the generated string literal in package-info.java.
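The edit amounts to a one-line substitution. Below is a sketch with sed, run against a scratch copy so it can be tried anywhere; apply the same expression to src/saveVersion.sh in the real tree. Note the replacement uses plain double quotes, not backticks:

```shell
# Demonstrate the fix on a scratch copy of the offending line;
# in practice run the same sed against src/saveVersion.sh.
work=$(mktemp -d)
printf 'user=`whoami`\n' > "$work/saveVersion.sh"
# swap the command substitution for a fixed literal user name
sed -i 's/user=`whoami`/user="hadoop"/' "$work/saveVersion.sh"
cat "$work/saveVersion.sh"
```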
3. Run ant again; it fails with:
D:\hadoop\hadoop-1.0.3\build.xml:618: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "D:\hadoop\hadoop-1.0.3\src\native"): CreateProcess error=2, The system cannot find the file specified
Fix: install autoconf, automake, and libtool in Cygwin.
Even with those packages installed the error persisted, so I switched to a real Linux environment for the build.
Environment: CentOS 5.6, x86_64
1. Edit hadoop-1.0.3/src/contrib/build-contrib.xml and add <property name="version" value="1.0.3"/> and <property name="eclipse.home" value="/download/eclipse"/>.
2. Run ant compile -logfile error.log in the hadoop-1.0.3 directory; the same autoreconf error appears (message as pasted from the Windows run):
D:\hadoop\hadoop-1.0.3\build.xml:618: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "D:\hadoop\hadoop-1.0.3\src\native"): CreateProcess error=2, ?????????
Fix: yum install autoconf automake libtool
3. Recompile; it passes. Then switch to hadoop-1.0.3/src/contrib/eclipse-plugin and run ant compile -logfile error.log; it passes as well.
If the build instead fails during ivy-download:
ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To: /home/hdpusr/workspace/hadoop-1.0.1/ivy/ivy-2.1.0.jar
[get] Error getting http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar to /home/hdpusr/workspace/hadoop-1.0.1/ivy/ivy-2.1.0.jar
BUILD FAILED
java.net.ConnectException: Connection timed out
the host cannot reach the Maven repository; build offline instead with: ant compile -Doffline=true
Before running ant jar, copy the following jars into the plugin's lib directory; otherwise Eclipse will throw this exception once the plugin is loaded:
An internal error occurred during: "Map/Reduce location status updater".
org/codehaus/jackson/map/JsonMappingException
[hadoop@jack lib]$ pwd
/data/soft/hadoop/build/contrib/eclipse-plugin/lib
[hadoop@jack lib]$ ls
commons-cli-1.2.jar commons-httpclient-3.0.1.jar hadoop-core.jar jackson-mapper-asl-1.8.8.jar
commons-configuration-1.6.jar commons-lang-2.4.jar jackson-core-asl-1.8.8.jar
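The staging step above can be scripted roughly as follows. HADOOP_HOME and the lib source directory are assumptions about the local layout, and the sketch runs against a scratch tree with placeholder files so it is safe to try anywhere; hadoop-core.jar arrives separately via step 4's copy:

```shell
# Scratch stand-in for the real tree (/data/soft/hadoop in the listing above)
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HADOOP_HOME/lib"
JARS="commons-cli-1.2.jar commons-httpclient-3.0.1.jar \
commons-configuration-1.6.jar commons-lang-2.4.jar \
jackson-core-asl-1.8.8.jar jackson-mapper-asl-1.8.8.jar"
for j in $JARS; do touch "$HADOOP_HOME/lib/$j"; done  # placeholders here

# Stage the plugin's runtime dependencies before running `ant jar`
PLUGIN_LIB="$HADOOP_HOME/build/contrib/eclipse-plugin/lib"
mkdir -p "$PLUGIN_LIB"
for j in $JARS; do cp "$HADOOP_HOME/lib/$j" "$PLUGIN_LIB/"; done
ls "$PLUGIN_LIB" | wc -l
```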
If connecting to the Map/Reduce location then fails like this, the plugin's Map/Reduce Master address is most likely pointing at a DataNode port rather than the JobTracker port:
Cannot connect to the Map/Reduce location: hadoop
java.io.IOException: Unknown protocol to DataNode: org.apache.hadoop.mapred.JobSubmissionProtocol
at org.apache.hadoop.hdfs.server.datanode.DataNode.getProtocolVersion(DataNode.java:1759)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
4. Copy hadoop-1.0.3\hadoop-core-1.0.3.jar into the hadoop-1.0.3\build directory, then run ant jar in hadoop-1.0.3\src\contrib\eclipse-plugin.
5. The compiled hadoop-eclipse-plugin-1.0.3.jar is written to hadoop-1.0.3\build\contrib\eclipse-plugin (see the attachment).
Inspect the plugin jar's configuration file and confirm it includes the jars copied earlier.
After copying the plugin into Eclipse, set the Hadoop installation directory in the menu:
Window --> Preferences --> Hadoop Map/Reduce
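The install itself is just a copy into Eclipse's plugins directory followed by a restart. A sketch using scratch stand-ins for both directories (in this walkthrough the real paths would be /download/eclipse and hadoop-1.0.3/build/contrib/eclipse-plugin):

```shell
ECLIPSE_HOME=$(mktemp -d)   # stand-in for the real Eclipse install dir
BUILD_DIR=$(mktemp -d)      # stand-in for build/contrib/eclipse-plugin
mkdir -p "$ECLIPSE_HOME/plugins"
touch "$BUILD_DIR/hadoop-eclipse-plugin-1.0.3.jar"   # placeholder jar
# drop the plugin into Eclipse, then restart Eclipse so it is loaded
cp "$BUILD_DIR/hadoop-eclipse-plugin-1.0.3.jar" "$ECLIPSE_HOME/plugins/"
ls "$ECLIPSE_HOME/plugins"
```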