HBase(2) Java Client
First of all, here is how to build a runnable jar with the maven-jar-plugin.
pom.xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-jar-plugin</artifactId>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>com.sillycat.easyhbase.ExecutorApp</mainClass>
                    </manifest>
                </archive>
            </configuration>
        </plugin>
    </plugins>
</build>
And ExecutorApp is just the simplest possible Java application.
package com.sillycat.easyhbase;

public class ExecutorApp {
    public static void main(String[] args) {
        System.out.println("Nice try.");
    }
}
Build the jar
>mvn clean install
Run the jar
>java -jar target/easyhbase-1.0.jar
Nice try.
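By the way, maven-jar-plugin only writes the Main-Class entry into the manifest; it does not bundle the dependencies into the jar. This trivial app has none, but to run the HBase client sample below the same way, one option is to copy the dependencies next to the jar and put them on the classpath:
>mvn dependency:copy-dependencies -DoutputDirectory=target/lib
>java -cp "target/easyhbase-1.0.jar:target/lib/*" com.sillycat.easyhbase.HBaseMain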
Error Message
08-05 17:15:46 [WARN] org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:106) - Failed to identify the fs of dir hdfs://ubuntu-master:9000/hbase/lib, ignored
java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:202)
Solution:
After adding this dependency, the problem was solved.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
</dependency>
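The root cause is that the FileSystem implementation for the hdfs:// scheme, org.apache.hadoop.hdfs.DistributedFileSystem, lives in the hadoop-hdfs jar. If the same error ever comes back after packaging everything into one fat jar (that did not happen here, just an assumption worth noting), the ServiceLoader metadata under META-INF/services may have been overwritten while merging jars; a common workaround is to pin the implementation in the configuration:
// make the hdfs scheme resolvable even if the service metadata was lost during shading
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());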
Error Message
An internal error occurred during: "Updating Maven Project".
Unsupported IClasspathEntry kind=4
Solution:
Because I am using Eclipse, I used the command mvn eclipse:eclipse to generate the jar dependencies.
Right click on the project [Properties] —> [Java Build Path] —> [Library].
Remove all the blue entries starting with "M2_REPO"; after that Maven —> Update Project works again, and I can even download the source code.
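For reference, the exact commands (eclipse:clean is just my habit, to regenerate the Eclipse files cleanly):
>mvn eclipse:clean eclipse:eclipse -DdownloadSources=true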
All the code is in the project easyhbase.
The dependencies are as follows:
<properties>
    <hadoop.version>2.4.1</hadoop.version>
    <hbase.version>0.98.4-hadoop2</hbase.version>
</properties>
<dependencies>
    <dependency>
        <groupId>commons-logging</groupId>
        <artifactId>commons-logging</artifactId>
        <version>1.1.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-hadoop2-compat</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-common</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
    </dependency>
</dependencies>
The Java Client Sample Code
package com.sillycat.easyhbase;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.MasterNotRunningException;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.ZooKeeperConnectionException;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseMain {

    private static Configuration conf = null;

    // load the client configuration once for the whole class
    static {
        conf = HBaseConfiguration.create();
    }
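    // Note (my assumption, not part of the original setup steps above):
    // HBaseConfiguration.create() reads hbase-site.xml from the classpath.
    // Without that file, the ZooKeeper quorum can be set programmatically instead:
    //   conf.set("hbase.zookeeper.quorum", "ubuntu-master");
    //   conf.set("hbase.zookeeper.property.clientPort", "2181");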
    // create a table with the given column families
    public static void createTable(String tableName, String[] familys) throws Exception {
        HBaseAdmin admin = new HBaseAdmin(conf);
        if (admin.tableExists(tableName)) {
            System.out.println("table already exists!");
        } else {
            TableName tableNameObject = TableName.valueOf(tableName);
            HTableDescriptor tableDesc = new HTableDescriptor(tableNameObject);
            for (int i = 0; i < familys.length; i++) {
                tableDesc.addFamily(new HColumnDescriptor(familys[i]));
            }
            admin.createTable(tableDesc);
            System.out.println("create table " + tableName + " ok.");
        }
        admin.close();
    }
    // disable and then delete a table
    public static void deleteTable(String tableName) throws Exception {
        try {
            HBaseAdmin admin = new HBaseAdmin(conf);
            admin.disableTable(tableName);
            admin.deleteTable(tableName);
            admin.close();
            System.out.println("delete table " + tableName + " ok.");
        } catch (MasterNotRunningException e) {
            e.printStackTrace();
        } catch (ZooKeeperConnectionException e) {
            e.printStackTrace();
        }
    }
    // insert one cell
    public static void addRecord(String tableName, String rowKey, String family, String qualifier, String value) throws Exception {
        try {
            HTable table = new HTable(conf, tableName);
            Put put = new Put(Bytes.toBytes(rowKey));
            put.add(Bytes.toBytes(family), Bytes.toBytes(qualifier), Bytes.toBytes(value));
            table.put(put);
            table.close();
            System.out.println("insert record " + rowKey + " to table " + tableName + " ok.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    // delete one row
    public static void delRecord(String tableName, String rowKey) throws IOException {
        HTable table = new HTable(conf, tableName);
        List<Delete> list = new ArrayList<Delete>();
        Delete del = new Delete(Bytes.toBytes(rowKey));
        list.add(del);
        table.delete(list);
        table.close();
        System.out.println("del record " + rowKey + " ok.");
    }
    // query for one row
    public static void getOneRecord(String tableName, String rowKey) throws IOException {
        HTable table = new HTable(conf, tableName);
        Get get = new Get(Bytes.toBytes(rowKey));
        Result rs = table.get(get);
        for (Cell cell : rs.rawCells()) {
            System.out.print(Bytes.toString(CellUtil.cloneRow(cell)) + " ");
            System.out.print(Bytes.toString(CellUtil.cloneFamily(cell)) + ":");
            System.out.print(Bytes.toString(CellUtil.cloneQualifier(cell)) + " ");
            System.out.print(cell.getTimestamp() + " ");
            System.out.println(Bytes.toString(CellUtil.cloneValue(cell)));
        }
        table.close();
    }
    // list all rows
    public static void getAllRecord(String tableName) {
        try {
            HTable table = new HTable(conf, tableName);
            Scan s = new Scan();
            ResultScanner ss = table.getScanner(s);
            for (Result r : ss) {
                for (Cell cell : r.rawCells()) {
                    System.out.print(Bytes.toString(CellUtil.cloneRow(cell)) + " ");
                    System.out.print(Bytes.toString(CellUtil.cloneFamily(cell)) + ":");
                    System.out.print(Bytes.toString(CellUtil.cloneQualifier(cell)) + " ");
                    System.out.print(cell.getTimestamp() + " ");
                    System.out.println(Bytes.toString(CellUtil.cloneValue(cell)));
                }
            }
            ss.close();
            table.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    // list rows in [startRowKey, endRowKey); the stop row is exclusive
    public static void getRangeRecord(String tableName, String startRowKey, String endRowKey) {
        try {
            HTable table = new HTable(conf, tableName);
            Scan s = new Scan(Bytes.toBytes(startRowKey), Bytes.toBytes(endRowKey));
            ResultScanner ss = table.getScanner(s);
            for (Result r : ss) {
                for (Cell cell : r.rawCells()) {
                    System.out.print(Bytes.toString(CellUtil.cloneRow(cell)) + " ");
                    System.out.print(Bytes.toString(CellUtil.cloneFamily(cell)) + ":");
                    System.out.print(Bytes.toString(CellUtil.cloneQualifier(cell)) + " ");
                    System.out.print(cell.getTimestamp() + " ");
                    System.out.println(Bytes.toString(CellUtil.cloneValue(cell)));
                }
            }
            ss.close();
            table.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    public static void main(String[] args) {
        try {
            String tablename = "scores";
            String[] familys = { "grade", "course" };
            HBaseMain.createTable(tablename, familys);
            // add records for sillycat
            HBaseMain.addRecord(tablename, "sillycat-20140723", "grade", "", "5");
            HBaseMain.addRecord(tablename, "sillycat-20140723", "course", "math", "97");
            HBaseMain.addRecord(tablename, "sillycat-20140723", "course", "art", "87");
            HBaseMain.addRecord(tablename, "sillycat-20130723", "grade", "", "5");
            HBaseMain.addRecord(tablename, "sillycat-20130723", "course", "math", "97");
            HBaseMain.addRecord(tablename, "sillycat-20130723", "course", "art", "87");
            HBaseMain.addRecord(tablename, "sillycat-20120723", "grade", "", "5");
            HBaseMain.addRecord(tablename, "sillycat-20120723", "course", "math", "97");
            HBaseMain.addRecord(tablename, "sillycat-20120723", "course", "art", "87");
            // add records for kiko
            HBaseMain.addRecord(tablename, "kiko-20140723", "grade", "", "4");
            HBaseMain.addRecord(tablename, "kiko-20140723", "course", "math", "89");
            System.out.println("===========get one record========");
            HBaseMain.getOneRecord(tablename, "sillycat-20140723");
            System.out.println("===========show all record========");
            HBaseMain.getAllRecord(tablename);
            System.out.println("===========del one record========");
            HBaseMain.delRecord(tablename, "kiko-20140723");
            System.out.println("===========show all record========");
            HBaseMain.getAllRecord(tablename);
            System.out.println("=============show range record=======");
            HBaseMain.getRangeRecord(tablename, "sillycat-20130101", "sillycat-20141231");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
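One more note that goes beyond the 0.98-era API used above: HBaseAdmin and the HTable constructor are deprecated since HBase 1.0 in favor of the Connection API. A minimal sketch of the same insert, assuming an HBase 1.0+ client on the classpath:
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// the same insert against the scores table, HBase 1.0+ style
Connection connection = ConnectionFactory.createConnection(conf);
Table table = connection.getTable(TableName.valueOf("scores"));
Put put = new Put(Bytes.toBytes("sillycat-20140723"));
put.addColumn(Bytes.toBytes("course"), Bytes.toBytes("math"), Bytes.toBytes("97"));
table.put(put);
table.close();
connection.close();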
References:
http://www.cnblogs.com/panfeng412/archive/2011/08/14/hbase-java-client-programming.html
http://lirenjuan.iteye.com/blog/1470645
http://hbase.apache.org/book/hbase_apis.html
http://blog.linezing.com/?tag=hbase
http://f.dataguru.cn/thread-226503-1-1.html
http://www.cnblogs.com/ggjucheng/p/3379459.html
http://blog.sina.com.cn/s/blog_ae33b83901016azb.html
http://blog.nosqlfan.com/html/3694.html
http://www.infoq.com/cn/news/2011/07/taobao-linhao-hbase
http://blog.csdn.net/xunianchong/article/details/8995019
client performance
http://blog.linezing.com/?p=1378
design row key
http://san-yun.iteye.com/blog/1995829