The main tools involved: the hadoop-2.4.0-src.tar.gz source archive, Ant, Maven, JDK, GCC, CMake, and OpenSSL.
Step 1: install or upgrade the build-related system packages (to their latest versions):
yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel
wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz (source tarball)
tar -zxvf hadoop-2.4.0-src.tar.gz
wget http://apache.fayea.com/apache-mirror//ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar -xvf apache-ant-1.9.4-bin.tar.gz
wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
tar -xvf apache-maven-3.0.5-bin.tar.gz
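The profile entries below assume Ant sits at /home/hadoop/ant, Maven at /home/hadoop/maven, and FindBugs 2.0.3 at /home/hadoop/findbugs-2.0.3 (no FindBugs download step appears above; fetch findbugs-2.0.3 from its SourceForge page and unpack it there). One way to match that layout, assuming the archives were extracted under /home/hadoop:
mv apache-ant-1.9.4 /home/hadoop/ant
mv apache-maven-3.0.5 /home/hadoop/maven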
vi /etc/profile
export JAVA_HOME=/usr/java/jdk1.7.0_55
export JAVA_BIN=/usr/java/jdk1.7.0_55/bin
export ANT_HOME=/home/hadoop/ant
export MVN_HOME=/home/hadoop/maven
export FINDBUGS_HOME=/home/hadoop/findbugs-2.0.3
export PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MVN_HOME/bin:$FINDBUGS_HOME/bin
Apply the profile so the changes take effect:
source /etc/profile
Verify that the configuration succeeded:
ant -version
mvn -version
findbugs -version
Verification result: each command should print its version string.
Install protobuf (logged in as root):
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
protoc --version
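If protoc fails to start with a complaint about a missing libprotoc shared library, the usual cause is that /usr/local/lib (where make install places it by default) is not in the dynamic loader's search path. A common fix, assuming the default install prefix (the .conf file name is arbitrary):
echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf
ldconfig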
Install CMake (logged in as root):
wget http://www.cmake.org/files/v2.8/cmake-2.8.12.2.tar.gz
tar -zxvf cmake-2.8.12.2.tar.gz
cd cmake-2.8.12.2
./bootstrap
make
make install
cmake --version
(Another option: simply install it with yum install cmake.)
Compile Hadoop:
cd hadoop-2.4.0-src
mvn package -DskipTests -Pdist,native -Dtar
Maven now downloads all of the dependency packages and build plugins; this step takes a long time, so be patient.
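If Maven instead aborts with an OutOfMemoryError or a PermGen-space error, a widely used workaround (the sizes below are only a starting point, not values taken from this build) is to enlarge Maven's heap and retry:
export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=512m"
mvn package -DskipTests -Pdist,native -Dtar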
After the build succeeds, check whether the native libraries were compiled:
[root@master hadoop-2.4.1-src]# cd hadoop-dist/target/hadoop-2.4.1/lib/native/
[root@master native]# file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=0xba68c7f46259525c3aae4ebd99e1faf3b6c7e7a6, not stripped
This output indicates a successful build.
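If file ever reports "ELF 32-bit" here on a machine meant to run 64-bit Hadoop, the native code was built with the wrong toolchain; a quick sanity check of the host architecture:
uname -m    # should print x86_64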
Alternatively, a reactor summary like the following also indicates success:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [5.731s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.215s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.122s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.548s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.271s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.020s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [7.431s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.517s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.727s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:53.800s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [16.696s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.042s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [6:07.368s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [48.810s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [20.154s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [9.709s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.049s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.138s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:00.295s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:00.256s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.076s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [21.974s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [28.986s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [6.791s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [13.558s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.431s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [2.644s]
[INFO] hadoop-yarn-client ................................ SUCCESS [12.729s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.102s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.878s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.103s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.055s]
[INFO] hadoop-yarn-project ............................... SUCCESS [6.390s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.211s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [39.919s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [34.197s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [5.716s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [18.761s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [17.226s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [7.617s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.211s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.571s]
[INFO] hadoop-mapreduce .................................. SUCCESS [6.483s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [10.180s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.514s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [5.243s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [12.533s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [8.247s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [6.091s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [5.339s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [13.666s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [14.356s]
[INFO] Apache Hadoop Client .............................. SUCCESS [14.354s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.145s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [39.951s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [8.662s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.035s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:45.654s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:27.228s
[INFO] Finished at: Tue Sep 16 23:09:45 HKT 2014
[INFO] Final Memory: 67M/179M
[INFO] ------------------------------------------------------------------------
Error 1
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 46.796s
[INFO] Finished at: Wed Jun 04 13:28:37 CST 2014
[INFO] Final Memory: 36M/88M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:2.4.0: Failure to find org.apache.commons:commons-compress:jar:1.4.1 in https://repository.apache.org/content/repositories/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots.https has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
Solution:
The log above says Maven could not find "org.apache.commons:commons-compress:jar:1.4.1". Copying the jar from a local (Windows) Maven repository directly into the repository on the Linux machine resolved it.
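An alternative that avoids copying jars by hand, based on the log's own hint that "resolution will not be reattempted until the update interval ... has elapsed or updates are forced": delete the cached failure and force Maven to re-check the remote repositories (the path assumes the default local repository under ~/.m2):
rm -rf ~/.m2/repository/org/apache/commons/commons-compress
mvn package -DskipTests -Pdist,native -Dtar -U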
Error 2
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:16.693s
[INFO] Finished at: Wed Jun 04 13:56:31 CST 2014
[INFO] Final Memory: 48M/239M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, 没有那个文件或目录
[ERROR] around Ant part ...<exec dir="/home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:133 in /home/hadoop/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
Solution:
This was caused by cmake not being installed; install it as described in section 5.3.1 (preparing the build environment).
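Before re-running the build, confirm that cmake is now visible to the shell that launches Maven:
which cmake
cmake --version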
Error 3
The errors complain that files cannot be found and directories cannot be created; no matching reports turned up online. Based on experience, change the permissions on the source tree to 775 so the build can create files and directories, and make sure the Hadoop build directory has 2.5-4 GB of free space:
chmod -Rf 775 ./hadoop-2.4.0-src
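Since roughly 2.5-4 GB of free space is also needed, it is worth checking the filesystem that holds the source tree (the /data path matches the logs that follow; adjust it to your layout):
df -h /data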
main:
[mkdir] Created dir: /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/test-dir
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-pipes ---
[INFO] Executing tasks
Error 4
main:
[mkdir] Created dir: /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native
[exec] -- The C compiler identification is GNU 4.4.7
[exec] -- The CXX compiler identification is GNU 4.4.7
[exec] -- Check for working C compiler: /usr/bin/cc
[exec] -- Check for working C compiler: /usr/bin/cc -- works
[exec] -- Detecting C compiler ABI info
[exec] -- Detecting C compiler ABI info - done
[exec] -- Check for working CXX compiler: /usr/bin/c++
[exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
[exec] -- Detecting CXX compiler ABI info
[exec] -- Detecting CXX compiler ABI info - done
[exec] CMake Error at /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:108 (message):
[exec] Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the
[exec] system variable OPENSSL_ROOT_DIR (missing: OPENSSL_LIBRARIES
[exec] OPENSSL_INCLUDE_DIR)
[exec] Call Stack (most recent call first):
[exec] /usr/local/share/cmake-2.8/Modules/FindPackageHandleStandardArgs.cmake:315 (_FPHSA_FAILURE_MESSAGE)
[exec] /usr/local/share/cmake-2.8/Modules/FindOpenSSL.cmake:313 (find_package_handle_standard_args)
[exec] CMakeLists.txt:20 (find_package)
[exec]
[exec]
[exec] -- Configuring incomplete, errors occurred!
[exec] See also "/data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native/CMakeFiles/CMakeOutput.log".
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [13.745s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5.538s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [7.296s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.568s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [5.858s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.541s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [8.337s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.348s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.926s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:35.956s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [18.680s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.059s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:03.525s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [38.335s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [23.780s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [8.769s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.159s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.134s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:07.657s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:10.680s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.165s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [24.174s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [27.293s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [5.177s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [11.399s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.384s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [1.346s]
[INFO] hadoop-yarn-client ................................ SUCCESS [12.937s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.108s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [5.303s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.212s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.050s]
[INFO] hadoop-yarn-project ............................... SUCCESS [8.638s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.135s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [43.622s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [36.329s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [6.058s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [20.058s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [16.493s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [11.685s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.222s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.656s]
[INFO] hadoop-mapreduce .................................. SUCCESS [8.060s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [8.994s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.886s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [6.659s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [15.722s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [11.778s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [5.953s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [6.414s]
[INFO] Apache Hadoop Pipes ............................... FAILURE [3.746s]
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:43.155s
[INFO] Finished at: Wed Jun 04 17:40:17 CST 2014
[INFO] Final Memory: 79M/239M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:123 in /data/hadoop/hadoop-2.4.0-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
According to a tip found online (openssl-devel also needs to be installed, with yum install openssl-devel; if this step is skipped, the build fails as follows:
[exec] CMake Error at /usr/share/cmake/Modules/FindOpenSSL.cmake:66 (MESSAGE):
[exec] Could NOT find OpenSSL
[exec] Call Stack (most recent call first):
[exec] CMakeLists.txt:20 (find_package)
[exec]
[exec]
[exec] -- Configuring incomplete, errors occurred!
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-pipes
)
Link to the error discussion: http://f.dataguru.cn/thread-189176-1-1.html
Cause: one "l" was dropped from "openssl-devel" when installing it, so the right package never went in.
Solution: reinstall openssl-devel:
yum install openssl-devel
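Once openssl-devel is in place, the build does not have to start from scratch; as the log itself suggests, it can resume from the failed module:
mvn package -DskipTests -Pdist,native -Dtar -rf :hadoop-pipes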
5.3.3 Build Summary
1. The system build packages must be installed (yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel).
2. The protobuf and CMake build tools must be installed.
3. Ant, Maven, and FindBugs must be configured.
4. Point the Maven repository at the OSChina mirror to speed up the build by accelerating dependency jar downloads; a configuration sketch follows this list.
5. When a build fails, read the error log carefully, work out the cause from it, and then turn to Baidu and Google to resolve the error.
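A sketch of the mirror setup from point 4, placed in ~/.m2/settings.xml; the id and name are arbitrary, and the OSChina URL shown was the mirror commonly used at the time (verify it is still reachable before relying on it):
<settings>
  <mirrors>
    <mirror>
      <id>oschina</id>
      <name>OSChina Maven Mirror</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>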