Compiling spark-project's Hive

Blog category: hive
 
mvn clean compile package install -Phadoop-2 -DskipTests
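The one-liner above can be wrapped in a small script for repeatability. This is only a sketch: it assumes the source tree is already unpacked at `/usr/local/src/spark_hive/hive-release-1.2.1-spark` (the path that appears in the build log below); `SRC` is an assumption, so adjust it to your own checkout.

```shell
# Sketch of the build, assuming the spark-project Hive source is already
# unpacked at the path seen in the build log; SRC is an assumption.
SRC=/usr/local/src/spark_hive/hive-release-1.2.1-spark

if command -v mvn >/dev/null 2>&1 && [ -d "$SRC" ]; then
    cd "$SRC"
    # -Phadoop-2  : build against the Hadoop 2.x shims profile
    # -DskipTests : compile the tests but do not run them
    mvn clean compile package install -Phadoop-2 -DskipTests
else
    echo "mvn or $SRC not available; set SRC to your own checkout"
fi
```

Note that `-DskipTests` still compiles the test sources; it only skips executing them, which is what keeps the build under a few minutes here.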

 

main:
   [delete] Deleting directory /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp
   [delete] Deleting directory /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/warehouse
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/warehouse
    [mkdir] Created dir: /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp/conf
     [copy] Copying 11 files to /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-packaging ---
[INFO] 
[INFO] --- maven-gpg-plugin:1.4:sign (sign-artifacts) @ hive-packaging ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-packaging ---
[INFO] Installing /usr/local/src/spark_hive/hive-release-1.2.1-spark/packaging/pom.xml to /root/.m2/repository/org/spark-project/hive/hive-packaging/1.2.1.spark/hive-packaging-1.2.1.spark.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive ............................................... SUCCESS [  2.563 s]
[INFO] Hive Shims Common .................................. SUCCESS [  3.779 s]
[INFO] Hive Shims 0.20S ................................... SUCCESS [  1.568 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [  5.433 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [  2.011 s]
[INFO] Hive Shims ......................................... SUCCESS [  1.557 s]
[INFO] Hive Common ........................................ SUCCESS [  5.571 s]
[INFO] Hive Serde ......................................... SUCCESS [  5.134 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 15.928 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [  0.552 s]
[INFO] Spark Remote Client ................................ SUCCESS [  6.468 s]
[INFO] Hive Query Language ................................ SUCCESS [ 48.084 s]
[INFO] Hive Service ....................................... SUCCESS [  5.605 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [  4.734 s]
[INFO] Hive JDBC .......................................... SUCCESS [ 13.971 s]
[INFO] Hive Beeline ....................................... SUCCESS [  3.101 s]
[INFO] Hive CLI ........................................... SUCCESS [  2.993 s]
[INFO] Hive Contrib ....................................... SUCCESS [  2.797 s]
[INFO] Hive HBase Handler ................................. SUCCESS [  5.414 s]
[INFO] Hive HCatalog ...................................... SUCCESS [  0.950 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [  4.163 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [  3.001 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [  3.124 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [  3.362 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 12.030 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [  3.114 s]
[INFO] Hive HWI ........................................... SUCCESS [  3.020 s]
[INFO] Hive ODBC .......................................... SUCCESS [  2.443 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [  0.211 s]
[INFO] Hive TestUtils ..................................... SUCCESS [  0.227 s]
[INFO] Hive Packaging ..................................... SUCCESS [  3.342 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:57 min
[INFO] Finished at: 2015-12-17T17:02:52+08:00
[INFO] Final Memory: 393M/11096M
[INFO] ------------------------------------------------------------------------
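After `BUILD SUCCESS`, the artifacts sit in the local Maven repository under groupId `org.spark-project.hive`, version `1.2.1.spark`, as the install line in the log shows. A quick way to confirm what was installed (assuming the default `~/.m2` location; a `localRepository` override in `settings.xml` would change the path):

```shell
# List the installed spark-project Hive artifacts in the local Maven repo.
# The group/version path is taken from the install log above; ~/.m2 is the
# Maven default and may differ if settings.xml overrides localRepository.
REPO="${HOME}/.m2/repository/org/spark-project/hive"

if [ -d "$REPO" ]; then
    find "$REPO" -name '*1.2.1.spark*' | sort
else
    echo "no artifacts under $REPO yet; run the mvn install first"
fi
```

These `org.spark-project.hive` artifacts are the Hive fork that Spark 1.x's own `-Phive` build resolves as dependencies, which is the point of building this repository rather than stock Apache Hive 1.2.1.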

 


