1. When I alter a table in a database, I can see the altered table information via the Hive CLI and HWI, but an error occurs when clicking the table in the Hue Metastore Manager.
A: No root cause found, but fixed by restarting HiveServer2.
2. When I run a Pig script in Hue using HCatLoader, an error is thrown:
ERROR 1070: Could not resolve HCatLoader using imports: [, java.lang., org.apache.pig.builtin., org.apache.pig.impl.builtin.]
The Pig script looks like this:
--extract user basic info from inok_raw.inok_user table
user = load 'inok_raw.inok_user' using HCatLoader();
user1 = foreach user generate user_id, user_name, user_gender, live_city, birth_city;
store user1 into 'inok_datamine.inok_user' using HCatStorer();
See: http://mail-archives.apache.org/mod_mbox/incubator-hcatalog-user/201208.mbox/%3CCAP0y+ToQrexQTd8q7tYSdEJoceE5u-9x60ptR9Z0DRiL9ZVUVw@mail.gmail.com%3E
A:
a. Copy all related jars from hive-0.12.0-bin/hcatalog/share/hcatalog to oozie-4.0.1/share/lib/pig and update the sharelib in HDFS, as sketched below.
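A minimal sketch of this step, assuming the Oozie sharelib lives at the default /user/oozie/share/lib path in HDFS:
cp hive-0.12.0-bin/hcatalog/share/hcatalog/*.jar oozie-4.0.1/share/lib/pig/
hdfs dfs -rm -r /user/oozie/share/lib/pig
hdfs dfs -put oozie-4.0.1/share/lib/pig /user/oozie/share/lib/pig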
This solves the first error, but produces another one:
------------------------------
To run Pig with HCatalog in Hue, follow the steps below to prepare the environment (condensed as shell commands after the list):
a. Back up all jars in share/lib/pig.
b. Delete share/lib/pig in HDFS.
c. Compile pig-0.12.0 to support hadoop 2.3.0.
d. Copy all jars except hadoop*.jar from pig-0.12.0/build/ivy/lib/Pig to oozie-4.0.1/share/lib/pig.
e. Copy pig-0.12.0.jar and pig-0.12.1-withouthadoop.jar to share/lib/pig.
f. Restore oozie-sharelib-pig-4.0.1.jar from the jars backed up in step a.
g. Copy all jars in hive-0.12.0-bin/hcatalog/share/hcatalog to share/lib/pig.
h. Copy all jars in hive-0.12.0-bin/lib to share/lib/pig.
i. Copy the MySQL JDBC driver to share/lib/pig.
j. Update the sharelib in HDFS.
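The same steps condensed as shell commands (a minimal sketch; the /user/oozie/share/lib HDFS path is an assumption, and the MySQL driver jar name is a placeholder):
cp -r oozie-4.0.1/share/lib/pig pig-lib-backup                                   # a. back up
hdfs dfs -rm -r /user/oozie/share/lib/pig                                        # b. delete in HDFS
# c. compile pig-0.12.0 for hadoop 2.3.0 (see the compile notes below)
# d. copy the pig build deps -- later found to cause jar conflicts, so skip:
#    cp pig-0.12.0/build/ivy/lib/Pig/*.jar oozie-4.0.1/share/lib/pig/   (minus hadoop*.jar)
cp pig-0.12.0/pig-0.12.0.jar pig-0.12.0/pig-0.12.1-withouthadoop.jar oozie-4.0.1/share/lib/pig/  # e
cp pig-lib-backup/oozie-sharelib-pig-4.0.1.jar oozie-4.0.1/share/lib/pig/        # f
cp hive-0.12.0-bin/hcatalog/share/hcatalog/*.jar oozie-4.0.1/share/lib/pig/      # g
cp hive-0.12.0-bin/lib/*.jar oozie-4.0.1/share/lib/pig/                          # h
cp mysql-connector-java-*.jar oozie-4.0.1/share/lib/pig/                         # i
hdfs dfs -put oozie-4.0.1/share/lib/pig /user/oozie/share/lib/pig                # j. update sharelib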
But an error comes. Inspecting the offending jar:
# jar tf share/lib/pig/hcatalog-core-0.12.0.jar
This jar comes from hive-0.12.0-bin/hcatalog/share/hcatalog/hcatalog-core-0.12.0.jar, so my guess is that the prebuilt hive-0.12.0-bin does not target hadoop 2.3.0 and has to be recompiled.
See:
http://community.cloudera.com/t5/Cloudera-Manager-Installation/Upgraded-the-Hive-to-0-12-manually-and-I-can-t-run-the-sample/td-p/7055
http://stackoverflow.com/questions/22630323/hadoop-java-lang-incompatibleclasschangeerror-found-interface-org-apache-hadoo
https://issues.apache.org/jira/browse/HIVE-6729
The relevant code is org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation:
public void setStoreLocation(String location, Job job) throws IOException {
    ....
    Job clone = new Job(job.getConfiguration());
    ....
    HCatSchema hcatTblSchema = HCatOutputFormat.getTableSchema(job);
    ....
}
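Under Hadoop 2, org.apache.hadoop.mapreduce.JobContext became an interface (it was a class in Hadoop 1), so jars compiled against Hadoop 1 break on exactly such code paths with "Found interface ... but class was expected". A quick way to confirm which form your Hadoop ships (the jar path assumes the stock hadoop-2.3.0 layout):
javap -classpath hadoop-2.3.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.3.0.jar org.apache.hadoop.mapreduce.JobContext
# on Hadoop 2 this prints: public interface org.apache.hadoop.mapreduce.JobContext ...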
https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-CompileHivePriorto0.13onHadoop23
https://cwiki.apache.org/confluence/display/Hive/AdminManual+Installation
https://cwiki.apache.org/confluence/display/Hive/HowToContribute
1. Download the source code:
svn co http://svn.apache.org/repos/asf/hive/branches/branch-0.12 hive-0.12
svn co http://svn.apache.org/repos/asf/hive/branches/branch-0.13 hive-0.13
2. Compile.
Hive 0.12 (the first invocation is the wiki's example, the second targets hadoop 2.3.0):
ant clean package -Dhadoop.version=2.0.0-alpha -Dhadoop-0.23.version=2.0.0-alpha -Dhadoop.mr.rev=23
ant clean package -Dhadoop.version=2.3.0 -Dhadoop-0.23.version=2.3.0 -Dhadoop.mr.rev=23
Both failed for reasons I did not pin down.
Hive 0.13:
mvn clean install -DskipTests -Phadoop-2,dist
Success.
The dist tarball is in packaging/target, but there is no hwi.war in it, and it does not work with Hue 3.5.0 when creating a database in the Hive metastore.
I recompiled hive-0.12.0 to support hadoop 2.3.0, but the error was still there. Using the libs from the compiled hive-0.13.0 made that error disappear, but another one appeared:
---------------
ERROR 2998: Unhandled internal error. java.lang.NoSuchFieldError: METASTORETHRIFTCONNECTIONRETRIES
com.google.common.util.concurrent.ExecutionError: java.lang.NoSuchFieldError: METASTORETHRIFTCONNECTIONRETRIES
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2232)
at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:216)
at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:192)
at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:569)
at org.apache.hive.hcatalog.pig.PigHCatUtil.getHiveMetaClient(PigHCatUtil.java:159)
at org.apache.hive.hcatalog.pig.PigHCatUtil.getTable(PigHCatUtil.java:195)
at org.apache.hive.hcatalog.pig.HCatLoader.getSchema(HCatLoader.java:210)
We then recreated the Hive metastore at schema version 0.13.0; the NoSuchFieldError above presumably comes from mixing Hive 0.13 HCatalog classes with Hive 0.12 jars, since HiveConf in 0.12 has no METASTORETHRIFTCONNECTIONRETRIES field. But Hue can't work with Hive 0.13.0 (it could not create or delete databases in the Hive metastore), so we reverted to Hive 0.12.0 while keeping metastore schema version 0.13.0, configured this property in hive-site.xml:
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
and used the related jars from hive-0.13.0 in oozie/share/lib/pig.
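A sketch of that setup, assuming the compiled Hive 0.13 dist is unpacked as hive-0.13.0-bin and a MySQL-backed metastore is configured in hive-site.xml (schematool ships with Hive and initializes the schema):
hive-0.13.0-bin/bin/schematool -dbType mysql -initSchema         # create the 0.13.0 metastore schema
cp hive-0.13.0-bin/hcatalog/share/hcatalog/*.jar oozie-4.0.1/share/lib/pig/
cp hive-0.13.0-bin/lib/*.jar oozie-4.0.1/share/lib/pig/
hdfs dfs -rm -r /user/oozie/share/lib/pig
hdfs dfs -put oozie-4.0.1/share/lib/pig /user/oozie/share/lib/pig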
The main cause was jar conflicts introduced by copying the jars from the pig-0.12.0 build libs, so the correct procedure is to skip step d above.
*********************************************
In order to integrate the Hive metastore/HCatalog with Pig, do the following (a verification sketch follows the list):
1. Compile pig-0.12.0 to support hadoop 2.3.0.
2. Compile hive-0.12.0 and hive-0.13.0 to support hadoop 2.3.0.
3. Use Hive metastore schema version 0.13.0.
4. Use hive-0.12.0 with Hue.
5. Use the hive-0.13.0 jars in oozie/share/lib/pig/.
*********************************************
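To sanity-check the result (the sharelib path is an assumption):
hdfs dfs -ls /user/oozie/share/lib/pig | grep -E 'hive|hcatalog'
# expect hive/hcatalog jars at 0.13.0 here, while the Hive service used by Hue stays at 0.12.0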