Abstract: Spark can be compiled with Maven, SBT, or IntelliJ IDEA.
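For the SBT route, a minimal sketch (the launcher script is sbt/sbt in older source trees and build/sbt in newer ones; the profiles mirror the Maven ones used later in this post):
./sbt/sbt -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.2 assembly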
Also, if you want to load the Spark project into Eclipse, you first need to generate an Eclipse project with one of the solutions below (see the plugin note after this list):
1. mvn eclipse:eclipse [optional]
2. ./sbt/sbt clean compile package
   or run sbt/sbt and then type 'eclipse' at the sbt prompt
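The interactive 'eclipse' command in option 2 assumes the sbteclipse plugin is on the build's plugin path; a minimal sketch of enabling it in project/plugins.sbt (the version number is an assumption, check the sbteclipse project page for the current one):
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")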
For example, if you want to build a jar against Hadoop 2.5.2, you can run:
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.2 -DskipTests clean package
YARN/Hadoop profile: 2.4
YARN/Hadoop version: 2.5.2
Hive: enabled by default
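If your checkout does not bundle Hive support by default, the usual approach is to request it explicitly with the Hive profiles; a hedged variant of the command above:
mvn -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver -Dhadoop.version=2.5.2 -DskipTests clean package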
[INFO] Building jar: /home/user/build-spark/spark-1.6.1/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.6.1-test-sources.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [01:11 min]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 33.845 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 33.728 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 13.160 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.966 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 16.310 s]
[INFO] Spark Project Core ................................. SUCCESS [04:29 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 6.629 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 23.603 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 49.411 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [01:15 min]
[INFO] Spark Project SQL .................................. SUCCESS [01:53 min]
[INFO] Spark Project ML Library ........................... SUCCESS [02:02 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 2.954 s]
[INFO] Spark Project Hive ................................. SUCCESS [01:39 min]
[INFO] Spark Project Docker Integration Tests ............. SUCCESS [ 23.671 s]
[INFO] Spark Project REPL ................................. SUCCESS [ 12.683 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 7.344 s]
[INFO] Spark Project YARN ................................. SUCCESS [ 21.928 s]
[INFO] Spark Project Assembly ............................. SUCCESS [02:05 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 12.928 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 20.712 s]
[INFO] Spark Project External Flume ....................... SUCCESS [ 10.053 s]
[INFO] Spark Project External Flume Assembly .............. SUCCESS [ 3.930 s]
[INFO] Spark Project External MQTT ........................ SUCCESS [01:49 min]
[INFO] Spark Project External MQTT Assembly ............... SUCCESS [ 8.549 s]
[INFO] Spark Project External ZeroMQ ...................... SUCCESS [ 11.317 s]
[INFO] Spark Project External Kafka ....................... SUCCESS [ 24.706 s]
[INFO] Spark Project Examples ............................. SUCCESS [03:14 min]
[INFO] Spark Project External Kafka Assembly .............. SUCCESS [ 9.263 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:38 min
[INFO] Finished at: 2016-03-24T15:58:43+08:00
[INFO] Final Memory: 207M/6072M
To run one of the bundled examples against the fresh build:
./bin/spark-submit --master local --class org.apache.spark.examples.JavaWordCount examples/target/spark-examples_2.10-1.6.1.jar CHANGES.txt
Note:
- spark.ui.port is the port of the Spark UI started by the driver (whether launched via spark-shell or spark-submit); it shows information about jobs, stages, executors, the environment, etc. (the same pages are also available for completed applications). Related log lines are printed at startup:
15/11/11 15:12:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/11/11 15:12:45 INFO SparkUI: Started SparkUI at http://192.168.1.138:4040
15/11/11 15:12:45 INFO Executor: Starting executor ID driver on host localhost
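If the default port 4040 is already in use, Spark normally retries on the next ports (4041, 4042, ...); you can also pin the UI port explicitly. A minimal sketch using the standard --conf flag and the spark.ui.port property:
./bin/spark-shell --conf spark.ui.port=4050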
Also, if you want to make a distribution tarball, below is a working example (see the --tgz note after the command):
./make-distribution.sh -Phadoop-2.4 -Dhadoop.version=2.5.2 -DskipTests
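Depending on the Spark version, make-distribution.sh may only populate a dist/ directory unless you also pass --tgz; a hedged variant (the --tgz and --name flags are taken from the script's usage text, so check your copy of the script):
./make-distribution.sh --tgz --name hadoop-2.5.2 -Phadoop-2.4 -Dhadoop.version=2.5.2 -DskipTests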
If you want to set up a Spark build or import the project into an IDE, some steps are listed here [1].
References:
http://blog.csdn.net/yunlong34574/article/details/39213503
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup
http://blog.csdn.net/chenxingzhen001/article/details/25901237
[1] IDE setup: see the Useful Developer Tools link above.