Article List
SQL context available as sqlContext.
scala> var myVar : String = "Foo"
myVar: String = Foo
scala> val myVal : String = "Foo"
myVal: String = Foo
scala> var myVar : String = "Foo1"
myVar: String = Foo1
scala> myVal="aa"
<co ...
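Not part of the original post: a minimal self-contained sketch of the var/val distinction the transcript demonstrates. A var can be reassigned; reassigning a val is a compile-time error, which is presumably what the truncated <co ... line reports.

object VarValDemo {
  def main(args: Array[String]): Unit = {
    var myVar: String = "Foo"   // var: reassignment is allowed
    myVar = "Foo1"
    println(myVar)              // Foo1

    val myVal: String = "Foo"   // val: the reference is immutable
    // myVal = "aa"             // does not compile: reassignment to val
    println(myVal)              // Foo
  }
}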
kerberos 5
- Blog category:
- hadoop
kadmin.local: listprincs
K/M@JACK
kadmin/admin@JACK
kadmin/changepw@JACK
kadmin/hadoopmaster.jack@JACK
krbtgt/JACK@JACK
kadmin.local: addprinc admin/admin@JACK
WARNING: no policy specified for admin/admin@JACK; defaulting to no policy
Enter password for principal "admin/admin@JACK":
Re-e ...
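Not from the original post: once the principals exist, a Hadoop client can authenticate programmatically with a keytab. A minimal Scala sketch, assuming Kerberos security is enabled on the cluster and a keytab has been exported; the principal name and keytab path below are placeholders for illustration only.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

object KerberosLogin {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Tell the Hadoop client libraries that the cluster uses Kerberos.
    conf.set("hadoop.security.authentication", "kerberos")
    UserGroupInformation.setConfiguration(conf)
    // Principal and keytab path are assumptions, not values from the post.
    UserGroupInformation.loginUserFromKeytab(
      "hadoop/hadoopmaster.jack@JACK",
      "/etc/security/keytabs/hadoop.keytab")
    println("Logged in as " + UserGroupInformation.getCurrentUser)
  }
}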
hadoop network
- Blog category:
- hadoop
[root@hadoopmaster ~]# cat /etc/hosts
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.1.50 hadoopmaster.jack hadoopmaster
192.168.1.51 hadoopslave1.jack hadoopslave1
192.168.1.52 ...
[hadoop@hadoopmaster test]$ hadoop distcp hdfs://hadoopmaster:9000/user/hive/warehouse/jacktest.db hdfs://hadoopmaster:9000/jacktest/todir
15/11/18 05:39:30 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, maxMaps=20, ss ...
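A side note not in the original post: for a single directory, the same copy can also be done in-process with the Hadoop FileSystem API, sketched below. Unlike distcp it runs no MapReduce job, so it only suits small copies; the paths mirror the command above.

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, FileUtil, Path}

object HdfsCopy {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    val fs = FileSystem.get(new URI("hdfs://hadoopmaster:9000"), conf)
    val src = new Path("/user/hive/warehouse/jacktest.db")
    val dst = new Path("/jacktest/todir")
    // Single-process copy; no MapReduce parallelism as with distcp.
    FileUtil.copy(fs, src, fs, dst, /* deleteSource = */ false, conf)
    fs.close()
  }
}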
http://www.tuicool.com/articles/ZfeQrq7
http://spark.apache.org/docs/1.3.1/programming-guide.html
http://www.imobilebbs.com/wordpress/archives/4984?variant=zh-sg
Common Hadoop commands
- Blog category:
- hadoop
The namenode (HDFS) and jobtracker (MapReduce) can sit on one machine, and a datanode and tasktracker can share a machine; the secondary namenode should get a machine of its own. The jobtracker is usually partitioned the same way as the datanodes (the directories are best spread across different disks, one directory per disk). The namenode stores ...
awk -F ',' '{for(i=1;i<=NF;i++){gsub(/^"/,"",$i);gsub(/"$/,"",$i);}{ print $0}}' t.csv
awk -F ',' '{gsub(/^"/,"",$1);gsub(/"$/,"",$1);gsub(/^"/,"",$2);gsub(/"$/,"",$2); { print $0} }' t.csv
gawk -v FIELDWI ...
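For comparison, and not part of the original post, a rough Scala equivalent of the awk one-liners above: strip one leading and one trailing double quote from every comma-separated field of t.csv. The naive split does not handle commas embedded inside quoted fields.

import scala.io.Source

object StripCsvQuotes {
  def main(args: Array[String]): Unit = {
    // t.csv is the same input file used by the awk commands above.
    for (line <- Source.fromFile("t.csv").getLines()) {
      val cleaned = line.split(",", -1).map { field =>
        // Remove one leading and one trailing double quote, if present.
        field.stripPrefix("\"").stripSuffix("\"")
      }
      println(cleaned.mkString(","))
    }
  }
}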
log4j-1.2.16
mybatis-3.2.3
mybatis-generator-core-1.3.2
mysql-connector-java-5.1.28-bin
ojdbc14
<bean
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="systemPropertiesModeName"
value="SYSTEM_PROPERTIES_MODE_OVERR ...
public class Singleton {
    private static Singleton instance = null;

    private Singleton() {}

    // Lazy initialization; synchronized keeps it thread-safe,
    // at the cost of taking the lock on every call.
    public static synchronized Singleton getInstance() {
        if (instance == null) {
            instance = new Singleton();
        }
        return instance;
    }
}
ps ...
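An aside not in the original post: in Scala the same lazily initialized, thread-safe singleton comes for free, because an object is created once on first access by the runtime.

object Singleton {
  // Created once, on first access, by the Scala runtime (thread-safe).
  def log(msg: String): Unit = println("singleton says: " + msg)
}

// Usage: Singleton.log("hello") always goes through the single instance.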
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
public class TestHive {
// hive --service hiveserver2 --hiveconf hive.server2.thrift.port=10001
// if you want to change a port to start the hive2, then ad ...
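Not from the original post: a minimal Scala sketch of the kind of JDBC client the truncated class appears to be, connecting to HiveServer2 on the non-default port 10001 mentioned in the comment. It assumes the hive-jdbc driver is on the classpath; host, credentials, and query are placeholders.

import java.sql.DriverManager

object TestHiveScala {
  def main(args: Array[String]): Unit = {
    // HiveServer2 JDBC driver; requires hive-jdbc and its dependencies.
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    // Port 10001 matches the --hiveconf hive.server2.thrift.port=10001 comment above.
    val conn = DriverManager.getConnection(
      "jdbc:hive2://hadoopmaster:10001/default", "hadoop", "")
    val stmt = conn.createStatement()
    val rs = stmt.executeQuery("show tables")
    while (rs.next()) {
      println(rs.getString(1))
    }
    rs.close(); stmt.close(); conn.close()
  }
}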
stackoverflow
The following are three commands which appear the same but have minute differences
hadoop fs {args}
hadoop dfs {args}
hdfs dfs {args}
hadoop fs <args>
FS relates to a generic file system which can point to any file systems like local, HDFS etc. So this can be used when you are dea ...
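As an illustration that is not part of the quoted answer: the "generic file system" behind hadoop fs is the Hadoop FileSystem abstraction, which resolves to a local or HDFS implementation based on the URI scheme. A rough sketch:

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object FsVsHdfs {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Same API, different backing file systems chosen by the URI scheme.
    val localFs = FileSystem.get(new URI("file:///"), conf)
    val hdfs    = FileSystem.get(new URI("hdfs://hadoopmaster:9000"), conf)
    println(localFs.getClass.getName)  // e.g. LocalFileSystem
    println(hdfs.getClass.getName)     // e.g. DistributedFileSystem
    localFs.listStatus(new Path("/tmp")).foreach(s => println(s.getPath))
  }
}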
continuous integration tools
TeamCity
http://blog.csdn.net/huwei2003/article/details/42875815
Read file names and count the number of files
- Blog category:
- linux
[root@linuxstudy sh]# cat ./s1.sh
#!/bin/bash
filelist=$(ls ~/)
echo "-----------begin------------"
for file in $filelist
do
echo $file
if [ -f ~/$file ]
then echo "this is a file"
else echo "this is a dir"
fi
done
echo "-----------en ...
pig call hcatalog
- Blog category:
- hadoop
[hadoop@hadoopmaster ~]$ pig pig3.pig
15/08/30 01:34:26 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
15/08/30 01:34:26 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
15/08/30 01:34:26 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType
2015-08-30 01:34:26,086 [main] INFO org.a ...
hadoop fsck
- Blog category:
- hadoop
hadoop fsck
Usage: DFSck <path> [-move | -delete | -openforwrite] [-files [-blocks [-locations | -racks]]]
<path> check whether the files under this path are intact
-move move corrupted files to the /lost+found directory
-delete delete corrupted files
-openforwrite print files that are currently open for writing
-files ...