
Three ways to start Hive and what they are for, with a focus on starting Hive for JDBC connections

 

 1. Hive command-line (CLI) mode: run the /hive/bin/hive executable directly, or run hive --service cli

       Used for interactive queries from the Linux command line; the query syntax is broadly similar to MySQL's.

 2. Hive web UI mode: hive --service hwi

       Lets you access Hive through a browser; in practice it is of limited use.

 3. Hive remote service mode (default port 10000): ./hive --service hiveserver >/dev/null 2>/dev/null &

       This is the mode you need when Java or other programs access Hive through a JDBC driver, and it is the one programmers care about most; a minimal client sketch follows.
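
       A minimal JDBC client sketch, assuming the classic HiveServer started above (Thrift service on port 10000), the org.apache.hadoop.hive.jdbc.HiveDriver class shipped with Hive 0.x, and the example server address 111.121.21.23 used later in this post; newer HiveServer2 deployments would use the jdbc:hive2:// URL scheme and org.apache.hive.jdbc.HiveDriver instead:

       import java.sql.Connection;
       import java.sql.DriverManager;
       import java.sql.ResultSet;
       import java.sql.Statement;

       public class HiveJdbcClient {
           public static void main(String[] args) throws Exception {
               // JDBC driver for the classic HiveServer (HiveServer2 uses org.apache.hive.jdbc.HiveDriver)
               Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
               // 10000 is the default HiveServer port; user name and password are ignored by the classic HiveServer
               Connection conn = DriverManager.getConnection("jdbc:hive://111.121.21.23:10000/default", "", "");
               Statement stmt = conn.createStatement();
               // Run a simple statement and print each table name it returns
               ResultSet rs = stmt.executeQuery("show tables");
               while (rs.next()) {
                   System.out.println(rs.getString(1));
               }
               rs.close();
               stmt.close();
               conn.close();
           }
       }

       To compile and run it, the Hive JDBC jar and its dependencies (hive-jdbc, hive-exec, hive-metastore, hive-service, libthrift, hadoop-core, etc.) need to be on the classpath.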

 

 

Configuring a remote metastore for Hive

 

How to configure a remote Hive metastore:

    1) First, configure Hive on server A (111.121.21.23) to store the metastore in a local MySQL database (a remote MySQL instance also works); a sample configuration follows.
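
        For reference, a minimal hive-site.xml sketch for step 1; the connection URL, driver class, and user name below mirror what appears in the server log further down, while the password value is an assumption to adapt for your own MySQL instance (the MySQL connector JAR also needs to be in Hive's lib directory):

        <property>
          <name>javax.jdo.option.ConnectionURL</name>
          <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
        </property>

        <property>
          <name>javax.jdo.option.ConnectionDriverName</name>
          <value>com.mysql.jdbc.Driver</value>
        </property>

        <property>
          <name>javax.jdo.option.ConnectionUserName</name>
          <value>hive</value>
        </property>

        <property>
          <name>javax.jdo.option.ConnectionPassword</name>
          <value>hive</value>
        </property>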

    2) With that done, start the metastore service on server A: bin/hive --service metastore  (default listening port: 9083)

    3) Configure the Hive client on server B, which needs a Hadoop environment, by editing hive-site.xml:

        <property>

          <name>hive.metastore.local</name>

          <value>false</value>

          <description>controls whether to connect to a remote metastore server or open a new metastore server in the Hive Client JVM</description>

        </property>

        

        <property>

          <name>hive.metastore.uris</name>

          <value>thrift://111.121.21.23:9083</value>

          <description></description>

        </property>

     4) Run bin/hive and execute a test HQL query, e.g. the quick check shown below.
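
        For example, a quick smoke test against the database and table that show up in the server log below (substitute your own database and table names):

        hive> show databases;
        hive> use vv;
        hive> select * from vv_20111031 limit 10;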

 

     5) After the Hive client connects successfully, the Hive metastore server prints output like the following:

Starting Hive Metastore Server

11/10/31 18:07:27 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore

11/10/31 18:07:27 INFO metastore.ObjectStore: ObjectStore, initialize called

11/10/31 18:07:27 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.

11/10/31 18:07:27 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.

11/10/31 18:07:27 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.

11/10/31 18:07:27 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored

11/10/31 18:07:27 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored

11/10/31 18:07:27 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============

11/10/31 18:07:27 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus"  Version: "2.0.3"

11/10/31 18:07:27 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true" driver="com.mysql.jdbc.Driver" userName="hive"

11/10/31 18:07:27 INFO DataNucleus.Persistence: ===========================================================

11/10/31 18:07:28 INFO Datastore.Schema: Creating table `DELETEME1320055648261`

11/10/31 18:07:28 INFO Datastore.Schema: Schema Name could not be determined for this datastore

11/10/31 18:07:28 INFO Datastore.Schema: Dropping table `DELETEME1320055648261`

11/10/31 18:07:28 INFO Datastore.Schema: Initialising Catalog "hive", Schema "" using "None" auto-start option

11/10/31 18:07:28 INFO Datastore.Schema: Catalog "hive", Schema "" initialised - managing 0 classes

11/10/31 18:07:28 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"

11/10/31 18:07:28 INFO DataNucleus.MetaData: Registering listener for metadata initialisation

11/10/31 18:07:28 INFO metastore.ObjectStore: Initialized ObjectStore

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 312, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 359, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 381, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 416, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 453, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 494, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 535, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 576, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 621, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:28 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/modules/hive/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo" at line 666, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.

11/10/31 18:07:29 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : `DBS`, InheritanceStrategy : new-table]

11/10/31 18:07:29 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : `DATABASE_PARAMS`]

11/10/31 18:07:29 INFO Datastore.Schema: Validating 2 index(es) for table `DBS`

11/10/31 18:07:29 INFO Datastore.Schema: Validating 0 foreign key(s) for table `DBS`

11/10/31 18:07:29 INFO Datastore.Schema: Validating 2 unique key(s) for table `DBS`

11/10/31 18:07:29 INFO Datastore.Schema: Validating 2 index(es) for table `DATABASE_PARAMS`

11/10/31 18:07:29 INFO Datastore.Schema: Validating 1 foreign key(s) for table `DATABASE_PARAMS`

11/10/31 18:07:29 INFO Datastore.Schema: Validating 1 unique key(s) for table `DATABASE_PARAMS`

11/10/31 18:07:29 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase

11/10/31 18:07:29 INFO metastore.HiveMetaStore: Started the new metaserver on port [9083]...

11/10/31 18:07:29 INFO metastore.HiveMetaStore: Options.minWorkerThreads = 200

11/10/31 18:07:29 INFO metastore.HiveMetaStore: Options.maxWorkerThreads = 100000

11/10/31 18:07:29 INFO metastore.HiveMetaStore: TCP keepalive = true

11/10/31 18:09:10 INFO metastore.HiveMetaStore: 1: get_all_databases

11/10/31 18:09:10 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21      cmd=get_all_databases

11/10/31 18:09:10 INFO metastore.HiveMetaStore: 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore

11/10/31 18:09:10 INFO metastore.ObjectStore: ObjectStore, initialize called

11/10/31 18:09:10 INFO metastore.ObjectStore: Initialized ObjectStore

11/10/31 18:09:14 INFO metastore.HiveMetaStore: 1: get_database: vv

11/10/31 18:09:14 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21     cmd=get_database: vv

11/10/31 18:09:14 INFO metastore.HiveMetaStore: 1: get_database: vv

11/10/31 18:09:14 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21      cmd=get_database: vv

11/10/31 18:09:24 INFO metastore.HiveMetaStore: 1: get_table : db=vv tbl=vv_20111031

11/10/31 18:09:24 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21      cmd=get_table : db=vv tbl=vv_20111031

11/10/31 18:09:24 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : `SERDES`, InheritanceStrategy : new-table]

11/10/31 18:09:24 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : `SDS`, InheritanceStrategy : new-table]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : `TBLS`, InheritanceStrategy : new-table]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : `SERDE_PARAMS`]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : `TABLE_PARAMS`]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : `PARTITION_KEYS`]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : `BUCKETING_COLS`]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.cols [Table : `COLUMNS`]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : `SD_PARAMS`]

11/10/31 18:09:24 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : `SORT_COLS`]

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 index(es) for table `SERDES`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 0 foreign key(s) for table `SERDES`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDES`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 4 index(es) for table `TBLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 foreign key(s) for table `TBLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 unique key(s) for table `TBLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `SDS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SDS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `SDS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `COLUMNS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `COLUMNS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `COLUMNS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `SERDE_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SERDE_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDE_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `SD_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SD_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `SD_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `BUCKETING_COLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `BUCKETING_COLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `BUCKETING_COLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `PARTITION_KEYS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `PARTITION_KEYS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `PARTITION_KEYS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `TABLE_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `TABLE_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `TABLE_PARAMS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 2 index(es) for table `SORT_COLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SORT_COLS`

11/10/31 18:09:24 INFO Datastore.Schema: Validating 1 unique key(s) for table `SORT_COLS`

11/10/31 18:09:24 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo

11/10/31 18:09:24 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor

11/10/31 18:09:24 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable

11/10/31 18:09:24 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema

11/10/31 18:10:10 INFO metastore.HiveMetaStore: 2: get_database: vv

11/10/31 18:10:10 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21     cmd=get_database: vv

11/10/31 18:10:10 INFO metastore.HiveMetaStore: 2: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore

11/10/31 18:10:10 INFO metastore.ObjectStore: ObjectStore, initialize called

11/10/31 18:10:10 INFO metastore.ObjectStore: Initialized ObjectStore

11/10/31 18:10:10 INFO metastore.HiveMetaStore: 2: get_database: vv

11/10/31 18:10:10 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21      cmd=get_database: vv

11/10/31 18:10:16 INFO metastore.HiveMetaStore: 2: get_table : db=vv tbl=vv_20111031

 

11/10/31 18:10:16 INFO HiveMetaStore.audit: ugi=rsync   ip=/111.121.23.21      cmd=get_table : db=vv tbl=vv_20111031
