In the previous article we finished installing Scribe. Now let's have a Java client write logs into the Scribe logging system through log4j.
1. Generate the Scribe client
A. Modify the scribe.thrift definition file
cd /usr/local/scribeInstall/scribe/if
vi scribe.thrift
Edit scribe.thrift: change the line include "fb303/if/fb303.thrift" to
include "[path where thrift was unpacked]/thrift-0.5.0/contrib/fb303/if/fb303.thrift"
B. Generate the Java client API
Run: thrift --gen java scribe.thrift
This creates a gen-java folder containing three Java classes that wrap all the API a Java client needs in order to send logs.
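For the stock scribe.thrift definition, the three generated types are the LogEntry struct, the ResultCode enum, and the scribe service client; the package (net.scribe in the imports used below) comes from the java namespace declared in the .thrift file:
gen-java/LogEntry.java
gen-java/ResultCode.java
gen-java/scribe.java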
2. Build the Thrift jars
A. If ANT_HOME and PATH are not set yet, set these two environment variables first:
export ANT_HOME=/usr/local/apache-ant-1.8.0
export PATH=$PATH:$ANT_HOME/bin
B. Build libthrift.jar (if you unpacked a different Thrift version, such as the thrift-0.5.0 referenced above, adjust the thrift-0.2.0 directory in the paths below accordingly)
cd /usr/local/scribeInstall/thrift-0.2.0/lib/java
ant
(If there are no errors, libthrift.jar is generated in this directory.)
C. Build libfb303.jar
cd /usr/local/scribeInstall/thrift-0.2.0/contrib/fb303/java
ant
Once ant succeeds, libfb303.jar appears under /usr/local/scribeInstall/thrift-0.2.0/contrib/fb303/java/build/lib.
3. Create a project and run a test
A. Create a plain Java project in Eclipse.
B. Add the required jars to the project: the libthrift.jar and libfb303.jar built above, plus the log4j and commons-logging jars (and, depending on your Thrift version, its slf4j dependencies).
C. Add the three Java classes from the gen-java folder to the project.
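Before wiring up log4j, it can help to check the jars and the server with a tiny hand-written client. This sketch is not part of the original setup; the host, port, and category are assumptions that mirror the log4j.properties further below.

package com.logtest;

import java.net.Socket;
import java.util.Arrays;
import net.scribe.LogEntry;
import net.scribe.scribe;
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.TFramedTransport;
import org.apache.thrift.transport.TSocket;

public class ScribeSmokeTest {
    public static void main(String[] args) throws Exception {
        // The java.net.Socket constructor connects immediately, so the Thrift
        // transport is already open and transport.open() must not be called.
        TFramedTransport transport =
                new TFramedTransport(new TSocket(new Socket("192.168.2.221", 1463)));
        scribe.Client client =
                new scribe.Client(new TBinaryProtocol(transport, false, false));
        // Category "scribe" matches the scribeCategory configured later.
        client.Log(Arrays.asList(new LogEntry("scribe", "smoke test message")));
        transport.close();
    }
}

If the message shows up on the Scribe server, the generated classes and both jars are on the classpath correctly.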
Now write the log4j Scribe appender. AsyncScribeAppender extends log4j's AsyncAppender, so log events are handed off to Scribe on a background thread; all it does is configure and wrap the ScribeAppender class shown afterwards.
AsyncScribeAppender.java:
package com.logtest;
import org.apache.log4j.AsyncAppender;
/**
 * The Scribe appender for log4j.
 * Uses the ScribeAppender class to connect to the Scribe server and write the logs to it.
 * @author ninja
 */
public class AsyncScribeAppender extends AsyncAppender {

    private String hostname;
    private String scribeHost;
    private int scribePort;
    private String scribeCategory;
    private String encoding;

    public String getHostname() {
        return hostname;
    }

    public void setHostname(String hostname) {
        this.hostname = hostname;
    }

    public String getScribeHost() {
        return scribeHost;
    }

    public void setScribeHost(String scribeHost) {
        this.scribeHost = scribeHost;
    }

    public int getScribePort() {
        return scribePort;
    }

    public void setScribePort(int scribePort) {
        this.scribePort = scribePort;
    }

    public String getScribeCategory() {
        return scribeCategory;
    }

    public void setScribeCategory(String scribeCategory) {
        this.scribeCategory = scribeCategory;
    }

    public String getEncoding() {
        return encoding;
    }

    public void setEncoding(String encoding) {
        this.encoding = encoding;
    }

    @Override
    public void activateOptions() {
        super.activateOptions();
        synchronized (this) {
            // Build the actual ScribeAppender, copy our configuration onto it,
            // and register it so that AsyncAppender dispatches events to it.
            ScribeAppender scribeAppender = new ScribeAppender();
            scribeAppender.setLayout(getLayout());
            scribeAppender.setHostname(getHostname());
            scribeAppender.setScribeHost(getScribeHost());
            scribeAppender.setScribePort(getScribePort());
            scribeAppender.setScribeCategory(getScribeCategory());
            scribeAppender.setEncoding(getEncoding());
            scribeAppender.activateOptions();
            addAppender(scribeAppender);
        }
    }

    @Override
    public boolean requiresLayout() {
        return true;
    }
}
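If you prefer to attach the appender in code instead of via log4j.properties, a minimal sketch using only the setters defined above (the values mirror the properties file shown later) could look like this:

package com.logtest;

import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class ProgrammaticSetup {
    public static void main(String[] args) {
        AsyncScribeAppender appender = new AsyncScribeAppender();
        appender.setLayout(new PatternLayout("%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n"));
        appender.setHostname("ninja");
        appender.setScribeHost("192.168.2.221");
        appender.setScribePort(1463);
        appender.setScribeCategory("scribe");
        appender.setEncoding("utf-8");
        appender.activateOptions();
        // Attach to the root logger so every logger inherits it.
        Logger.getRootLogger().addAppender(appender);
        Logger.getRootLogger().info("scribe appender configured in code");
    }
}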
ScribeAppender.java:
package com.logtest;
import net.scribe.LogEntry;
import net.scribe.scribe;
import org.apache.log4j.WriterAppender;
import org.apache.log4j.spi.LoggingEvent;
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.TFramedTransport;
import org.apache.thrift.transport.TSocket;
import org.apache.thrift.transport.TTransportException;
import java.util.List;
import java.util.ArrayList;
import java.net.Socket;
import java.net.UnknownHostException;
import java.net.InetAddress;
import java.io.IOException;
/**
 * Extends WriterAppender to implement the connection to the Scribe server
 * and the sending of log messages.
 * @author ninja
 */
public class ScribeAppender extends WriterAppender {

    private String hostname;
    private String scribeHost;
    private int scribePort;
    private String scribeCategory;
    private String encoding;

    private List<LogEntry> logEntries;
    private scribe.Client client;
    private TFramedTransport transport;

    public String getHostname() {
        return hostname;
    }

    public void setHostname(String hostname) {
        this.hostname = hostname;
    }

    public String getScribeHost() {
        return scribeHost;
    }

    public void setScribeHost(String scribeHost) {
        this.scribeHost = scribeHost;
    }

    public int getScribePort() {
        return scribePort;
    }

    public void setScribePort(int scribePort) {
        this.scribePort = scribePort;
    }

    public String getScribeCategory() {
        return scribeCategory;
    }

    public void setScribeCategory(String scribeCategory) {
        this.scribeCategory = scribeCategory;
    }

    public String getEncoding() {
        return encoding;
    }

    public void setEncoding(String encoding) {
        this.encoding = encoding;
    }
    /*
     * Activates this appender by opening a framed transport to the Scribe server.
     */
    @Override
    public void activateOptions() {
        try {
            synchronized (this) {
                if (hostname == null) {
                    try {
                        hostname = InetAddress.getLocalHost()
                                .getCanonicalHostName();
                    } catch (UnknownHostException e) {
                        // can't resolve the local hostname; leave it null
                    }
                }
                System.out.println("Connecting to Scribe: " + scribeHost + ":"
                        + scribePort + " category=" + scribeCategory
                        + " encoding=" + encoding);
                // Thrift boilerplate: framed transport + binary protocol,
                // as the Scribe server expects.
                logEntries = new ArrayList<LogEntry>(1);
                TSocket sock = new TSocket(new Socket(scribeHost, scribePort));
                transport = new TFramedTransport(sock);
                TBinaryProtocol protocol = new TBinaryProtocol(transport,
                        false, false);
                client = new scribe.Client(protocol, protocol);
                // No transport.open() here: the TSocket wraps a Socket that is
                // already connected, so the transport is already open and
                // calling open() would throw an exception.
            }
        } catch (TTransportException e) {
            e.printStackTrace();
        } catch (UnknownHostException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    /*
     * Appends a log message to Scribe.
     */
    @Override
    public void append(LoggingEvent event) {
        synchronized (this) {
            try {
                // Prefix each message with the hostname, then send it as a
                // single-entry batch under the configured category.
                String message = String.format("%s %s", hostname,
                        layout.format(event));
                LogEntry entry = new LogEntry(scribeCategory, message);
                logEntries.add(entry);
                client.Log(logEntries);
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                logEntries.clear();
            }
        }
    }

    @Override
    public void close() {
        if (transport != null) {
            transport.close();
        }
    }

    @Override
    public boolean requiresLayout() {
        return true;
    }
}
log4j.properties (each scribe.* key is matched to a setter on AsyncScribeAppender by reflection, so the key names must match the bean properties exactly):
# Two appenders are defined: console and scribe
log4j.rootLogger = DEBUG,CONSOLE,scribe
log4j.additivity.org.apache=true
log4j.appender.CONSOLE = org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout = org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern = %-4r [%t] %-5p %c - %m%n
log4j.logger.com.logtest = DEBUG, scribe
log4j.appender.scribe=com.logtest.AsyncScribeAppender
log4j.appender.scribe.encoding=utf-8
log4j.appender.scribe.scribeHost=192.168.2.221
log4j.appender.scribe.scribePort=1463
log4j.appender.scribe.hostname=ninja
log4j.appender.scribe.scribeCategory=scribe
log4j.appender.scribe.layout=org.apache.log4j.PatternLayout
log4j.appender.scribe.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
Test client:
package com.logtest;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
public class LogTest {

    private static Log log = LogFactory.getLog(LogTest.class);

    public static void main(String[] args) {
        log.error("this is a character test");
        // Chinese text, to verify that non-ASCII messages survive the trip to Scribe.
        log.debug("这是中文测试");
        log.fatal("fatal error 致命错误!!");
    }
}
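If everything is wired correctly, running LogTest prints the three messages on the console, and the Scribe server writes them under the configured category. Assuming the example file-store configuration from the previous article (an assumption; adjust the path to whatever your scribe.conf actually specifies), you can check the result on the server with:
cat /tmp/scribetest/scribe/scribe_current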