Plugin name: hadoop2x-eclipse-plugin
Plugin repository: https://github.com/winghc/hadoop2x-eclipse-plugin
1. Download and unpack Hadoop 2.x from http://hadoop.apache.org/releases.html#Download (I used the pre-built binary package).
2. Download and unpack Eclipse (mine is version 4.4.1; other versions should work similarly).
3. Download the hadoop2x-eclipse-plugin source and unpack it to any directory you like; for convenience I will refer to that directory as "H2EP_HOME" below.
4. Building the plugin requires Ant, available from http://ant.apache.org/bindownload.cgi
Set the ANT_HOME environment variable to the Ant unpack directory, then add %ANT_HOME%\bin to the PATH environment variable (the setup on Linux is similar).
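For example, on Linux the two variables might be set like this (the Ant path below is a hypothetical example; point it at wherever you actually unpacked Ant):

```shell
# Hypothetical install location -- adjust to your own Ant unpack directory
export ANT_HOME=/opt/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
```

After this, running `ant -version` from any directory should print the Ant version.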
5. Open a command prompt and change into the "H2EP_HOME" directory.
6. Run: ant jar -Dversion=2.x.x -Dhadoop.version=2.x.x -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop
Set eclipse.home to the Eclipse installation directory,
set hadoop.home to the Hadoop unpack directory,
and replace 2.x.x with your actual Hadoop version number.
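As a concrete example, building against Hadoop 2.6.0 with Eclipse installed in /opt/eclipse and Hadoop unpacked to /usr/share/hadoop would look like this (version and paths are illustrative; substitute your own):

```shell
# Run from the "H2EP_HOME" directory; version and paths are examples only
ant jar -Dversion=2.6.0 -Dhadoop.version=2.6.0 \
    -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop
```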
7. The build hangs at the ivy-resolve-common step.
This happens because Ivy cannot resolve a few dependencies (their download locations have probably moved); the build does not actually need them.
Fix:
Edit "H2EP_HOME"\src\contrib\eclipse-plugin\build.xml
Find:
<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
and remove the depends attribute so that it reads:
<target name="compile" unless="skip.contrib">
8. Re-run the build command from step 6; it now fails with errors about jar files that cannot be copied.
Fix:
Edit "H2EP_HOME"\ivy\libraries.properties
and change the version numbers of the failing jars to match the jars under "HADOOP_HOME"\share\hadoop\common\lib.
Several jar versions may be mismatched, so this step may require multiple edits.
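For example, if the copy errors mention commons-lang and commons-collections, the corresponding lines in libraries.properties would be edited to match the jar file names found under "HADOOP_HOME"\share\hadoop\common\lib (the version numbers below are illustrative only):

```properties
# Illustrative versions -- use the numbers in your own Hadoop's lib directory
commons-lang.version=2.6
commons-collections.version=3.2.1
```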
9. Re-run the build command from step 6; this time it succeeds,
producing hadoop-eclipse-plugin-2.x.x.jar under "H2EP_HOME"\build\contrib\eclipse-plugin.
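A quick sanity check (the version number in the file name is an example; use your own) is to list the jar's contents and confirm the bundled libraries were packed in:

```shell
# Run from "H2EP_HOME"; should list the lib/*.jar entries bundled into the plugin
jar tf build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.6.0.jar | grep '^lib/'
```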
10. Copy hadoop-eclipse-plugin-2.x.x.jar into Eclipse's plugins directory and start Eclipse.
11. Open Window -> Preferences and find the Hadoop Map/Reduce page;
set "Hadoop installation directory" to the Hadoop installation directory.
12. Open Window -> Show View -> Other, find Map/Reduce Locations, and show that view.
13. In the Map/Reduce Locations view, right-click -> New Hadoop Location.
If nothing happens, check the Eclipse log (workspace\.metadata\.log); it reports:
java.lang.ClassNotFoundException: org.apache.commons.collections.map.UnmodifiableMap
Fix:
Edit "H2EP_HOME"\src\contrib\eclipse-plugin\build.xml
and add:
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
and add the following entry to the Bundle-ClassPath value in the <jar> element's manifest:
lib/commons-collections-${commons-collections.version}.jar,
14. Start Eclipse with eclipse.exe -clean (this clears the plugin cache; without it, the error from step 13 may reappear).
The complete build.xml ("H2EP_HOME"\src\contrib\eclipse-plugin\build.xml) is:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->
<project default="jar" name="eclipse-plugin">

  <import file="../build-contrib.xml"/>

  <path id="eclipse-sdk-jars">
    <fileset dir="${eclipse.home}/plugins/">
      <include name="org.eclipse.ui*.jar"/>
      <include name="org.eclipse.jdt*.jar"/>
      <include name="org.eclipse.core*.jar"/>
      <include name="org.eclipse.equinox*.jar"/>
      <include name="org.eclipse.debug*.jar"/>
      <include name="org.eclipse.osgi*.jar"/>
      <include name="org.eclipse.swt*.jar"/>
      <include name="org.eclipse.jface*.jar"/>
      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
      <include name="com.jcraft.jsch*.jar"/>
    </fileset>
  </path>

  <path id="hadoop-sdk-jars">
    <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
      <include name="hadoop*.jar"/>
    </fileset>
    <fileset dir="${hadoop.home}/share/hadoop/hdfs">
      <include name="hadoop*.jar"/>
    </fileset>
    <fileset dir="${hadoop.home}/share/hadoop/common">
      <include name="hadoop*.jar"/>
    </fileset>
  </path>

  <!-- Override classpath to include Eclipse SDK jars -->
  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <!--pathelement location="${hadoop.root}/build/classes"/-->
    <path refid="eclipse-sdk-jars"/>
    <path refid="hadoop-sdk-jars"/>
  </path>

  <!-- Skip building if eclipse.home is unset. -->
  <target name="check-contrib" unless="eclipse.home">
    <property name="skip.contrib" value="yes"/>
    <echo message="eclipse.home unset: skipping eclipse plugin"/>
  </target>

  <!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
  <!-- depends="init, ivy-retrieve-common" removed here (see step 7) -->
  <target name="compile" unless="skip.contrib">
    <echo message="contrib: ${name}"/>
    <javac
      encoding="${build.encoding}"
      srcdir="${src.dir}"
      includes="**/*.java"
      destdir="${build.classes}"
      debug="${javac.debug}"
      deprecation="${javac.deprecation}">
      <classpath refid="classpath"/>
    </javac>
  </target>

  <!-- Override jar target to specify manifest -->
  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/common">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/hdfs">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/lib/" verbose="true">
      <fileset dir="${hadoop.home}/share/hadoop/yarn">
        <include name="hadoop*.jar"/>
      </fileset>
    </copy>
    <copy todir="${build.dir}/classes" verbose="true">
      <fileset dir="${root}/src/java">
        <include name="*.xml"/>
      </fileset>
    </copy>
    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <!-- commons-collections dependency added here (see step 13) -->
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/>

    <!-- In the Bundle-ClassPath below, the commons-lang entry was changed to use
         ${commons-lang.version} (see step 8) and the commons-collections entry
         was added (see step 13). -->
    <jar
      jarfile="${build.dir}/hadoop-${name}-${version}.jar"
      manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
        <attribute name="Bundle-ClassPath"
                   value="classes/,
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-1.2.jar,
 lib/commons-configuration-1.6.jar,
 lib/commons-httpclient-3.1.jar,
 lib/commons-lang-${commons-lang.version}.jar,
 lib/commons-collections-${commons-collections.version}.jar,
 lib/jackson-core-asl-1.8.8.jar,
 lib/jackson-mapper-asl-1.8.8.jar,
 lib/slf4j-log4j12-1.7.5.jar,
 lib/slf4j-api-1.7.5.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar"/>
      </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <!--fileset dir="${build.dir}" includes="*.xml"/-->
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

</project>