
Packaging an sbt Project in IntelliJ IDEA


Reposted from: https://blog.csdn.net/coder__cs/article/details/79344839

 

Prerequisite: you have already created the wordcount project. For reference, see the official Scala IDE tutorials and the Spark documentation:

 

Getting Started with Scala in IntelliJ

Building a Scala Project with IntelliJ and sbt

Spark Quick Start https://spark.apache.org/docs/latest/quick-start.html#self-contained-applications
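For context, the application being packaged is assumed to look roughly like the sketch below. This is a minimal sketch, not the author's original source: the object name WordCount is taken from the --class WordCount argument in the spark-submit log further down, and the README.md input path is inferred from the word counts in the sample output.

import org.apache.spark.{SparkConf, SparkContext}

// Minimal WordCount sketch. The master is supplied externally via
// spark-submit --master, so it is not hard-coded here.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    // Assumed input: Spark's README.md (matches the sample output below).
    val wordCounts = sc.textFile("README.md")
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    println("wordCounts: ")
    wordCounts.collect().foreach(println)

    sc.stop()
  }
}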

 

What we need to do is package the project into a jar and deploy it to the cluster to run. The packaging steps are explained one by one below.

Step 1: Open the Project Structure dialog, using the shortcut Ctrl+Alt+Shift+S.

 

Step 2: Add a jar artifact configuration (in the Artifacts section, click +, then JAR → From modules with dependencies...).

 

Step 3: Remove the extra library dependencies so they are not bundled into the jar file; keep only the compiled class files and the META-INF folder.
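Stripping the dependencies works because the cluster's Spark installation already provides them at runtime; the jar only needs your own compiled classes. The same intent can be expressed in the build definition by marking Spark as a provided dependency. A minimal build.sbt sketch, assuming spark-core 2.2.1 (the version shown in the log below); the Scala version 2.11 is an assumption:

name := "wordcount"

version := "1.0"

scalaVersion := "2.11.12"

// "provided": compile against spark-core but leave it out of the
// packaged jar, since the cluster supplies it at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1" % "provided"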

 

 

Step 4: Build the artifact to generate the jar (Build → Build Artifacts... → Build).
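If you prefer the command line over the IDE, running sbt package from the project root produces an equivalent jar (by default under target/scala-2.11/, given the Scala version assumed above).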


● Internal structure of the unpacked jar
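You can check the contents without unpacking by listing them with jar tf wordcount.jar; after Step 3 it should show only the compiled WordCount classes and the META-INF directory.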

 

Step 5: Finally, upload the resulting jar to the cluster and run it:

 

sftp> put wordcount.jar

Uploading wordcount.jar to /home/elon/workspace/wordcount/wordcount.jar

  100% 5KB      5KB/s 00:00:00     

C:/Users/yilon/Documents/wordcount.jar: 5452 bytes transferred in 0 seconds (5452 bytes/s)


 

[elon@hadoop spark]$ ./bin/spark-submit --class WordCount --master local ~/workspace/wordcount/wordcount.jar 

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

18/02/22 00:38:59 INFO SparkContext: Running Spark version 2.2.1

18/02/22 00:39:01 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

18/02/22 00:39:01 INFO SparkContext: Submitted application: WordCount

18/02/22 00:39:01 INFO SecurityManager: Changing view acls to: elon

18/02/22 00:39:01 INFO SecurityManager: Changing modify acls to: elon

18/02/22 00:39:01 INFO SecurityManager: Changing view acls groups to: 

18/02/22 00:39:01 INFO SecurityManager: Changing modify acls groups to: 

18/02/22 00:39:01 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(elon); groups with view permissions: Set(); users  with modify permissions: Set(elon); groups with modify permissions: Set()

18/02/22 00:39:02 INFO Utils: Successfully started service 'sparkDriver' on port 44048.

18/02/22 00:39:02 INFO SparkEnv: Registering MapOutputTracker

18/02/22 00:39:02 INFO SparkEnv: Registering BlockManagerMaster

18/02/22 00:39:02 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information

18/02/22 00:39:02 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up

18/02/22 00:39:02 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-df8c9e80-53ba-42e5-98f9-6010211bac1c

18/02/22 00:39:02 INFO MemoryStore: MemoryStore started with capacity 413.9 MB

18/02/22 00:39:03 INFO SparkEnv: Registering OutputCommitCoordinator

18/02/22 00:39:03 INFO Utils: Successfully started service 'SparkUI' on port 4040.

18/02/22 00:39:03 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.115:4040

18/02/22 00:39:04 INFO SparkContext: Added JAR file:/home/elon/workspace/wordcount/wordcount.jar at spark://192.168.1.115:44048/jars/wordcount.jar with timestamp 1519231144007

18/02/22 00:39:04 INFO Executor: Starting executor ID driver on host localhost

18/02/22 00:39:04 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 32785.

18/02/22 00:39:04 INFO NettyBlockTransferService: Server created on 192.168.1.115:32785

18/02/22 00:39:04 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy

18/02/22 00:39:04 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.115, 32785, None)

18/02/22 00:39:04 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.115:32785 with 413.9 MB RAM, BlockManagerId(driver, 192.168.1.115, 32785, None)

18/02/22 00:39:04 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.115, 32785, None)

18/02/22 00:39:04 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.115, 32785, None)

wordCounts: 

(package,1)

(For,3)

(Programs,1)

(processing.,1)

(Because,1)

(The,1)

(page](http://spark.apache.org/documentation.html).,1)

(cluster.,1)

(its,1)

([run,1)

(than,1)

(APIs,1)

(have,1)

(Try,1)

(computation,1)

(through,1)

(several,1)

(This,2)

(graph,1)

(Hive,2)

(storage,1)

(["Specifying,1)

(To,2)

("yarn",1)

(Once,1)

(["Useful,1)

(prefer,1)

(SparkPi,2)

(engine,1)

(version,1)

(file,1)

(documentation,,1)

(processing,,1)

(the,24)

(are,1)

(systems.,1)

(params,1)

(not,1)

(different,1)

(refer,2)

(Interactive,2)

(R,,1)

(given.,1)

(if,4)

(build,4)

(when,1)

(be,2)

(Tests,1)

(Apache,1)

(thread,1)

(programs,,1)

(including,4)

(./bin/run-example,2)

(Spark.,1)

(package.,1)

(1000).count(),1)

(Versions,1)

(HDFS,1)

(Data.,1)

(>>>,1)

(Maven,1)

(programming,1)

(Testing,1)

(module,,1)

(Streaming,1)

(environment,1)

(run:,1)

(Developer,1)

(clean,1)

(1000:,2)

(rich,1)

(GraphX,1)

(Please,4)

(is,6)

(guide](http://spark.apache.org/contributing.html),1)

(run,7)

(URL,,1)

(threads.,1)

(same,1)

(MASTER=spark://host:7077,1)

(on,7)

(built,1)

(against,1)

([Apache,1)

(tests,2)

(examples,2)

(at,2)

(optimized,1)

(3"](https://cwiki.apache.org/confluence/display/MAVEN/Parallel+builds+in+Maven+3).,1)

(usage,1)

(development,1)

(Maven,,1)

(graphs,1)

(talk,1)

(Shell,2)

(class,2)

(abbreviated,1)

(using,5)

(directory.,1)

(README,1)

(computing,1)

(overview,1)

(`examples`,2)

(example:,1)

(##,9)

(N,1)

(set,2)

(use,3)

(Hadoop-supported,1)

(running,1)

(find,1)

(contains,1)

(project,1)

(Pi,1)

(need,1)

(or,3)

(Big,1)

(high-level,1)

(Java,,1)

(uses,1)

(<class>,1)

(Hadoop,,2)

(available,1)

(requires,1)

((You,1)

(more,1)

(see,3)

(Documentation,1)

(of,5)

(tools,1)

(using:,1)

(cluster,2)

(must,1)

(supports,2)

(built,,1)

(tests](http://spark.apache.org/developer-tools.html#individual-tests).,1)

(system,1)

(build/mvn,1)

(Hadoop,3)

(this,1)

(Version"](http://spark.apache.org/docs/latest/building-spark.html#specifying-the-hadoop-version),1)

(particular,2)

(Python,2)

(Spark,16)

(general,3)

(YARN,,1)

(pre-built,1)

([Configuration,1)

(locally,2)

(library,1)

(A,1)

(locally.,1)

(sc.parallelize(1,1)

(only,1)

(Configuration,1)

(following,2)

(basic,1)

(#,1)

(changed,1)

(More,1)

(which,2)

(learning,,1)

(first,1)

(./bin/pyspark,1)

(also,4)

(info,1)

(should,2)

(for,12)

([params]`.,1)

(documentation,3)

([project,1)

(mesos://,1)

(Maven](http://maven.apache.org/).,1)

(setup,1)

(<http://spark.apache.org/>,1)

(latest,1)

(your,1)

(MASTER,1)

(example,3)

(["Parallel,1)

(scala>,1)

(DataFrames,,1)

(provides,1)

(configure,1)

(distributions.,1)

(can,7)

(About,1)

(instructions.,1)

(do,2)

(easiest,1)

(no,1)

(project.,1)

(how,3)

(`./bin/run-example,1)

(started,1)

(Note,1)

(by,1)

(individual,1)

(spark://,1)

(It,2)

(tips,,1)

(Scala,2)

(Alternatively,,1)

(an,4)

(variable,1)

(submit,1)

(-T,1)

(machine,1)

(thread,,1)

(them,,1)

(detailed,2)

(stream,1)

(And,1)

(distribution,1)

(review,1)

(return,2)

(Thriftserver,1)

(developing,1)

(./bin/spark-shell,1)

("local",1)

(start,1)

(You,4)

(Spark](#building-spark).,1)

(one,3)

(help,1)

(with,4)

(print,1)

(Spark"](http://spark.apache.org/docs/latest/building-spark.html).,1)

(data,1)

(Contributing,1)

(in,6)

(-DskipTests,1)

(downloaded,1)

(versions,1)

(online,1)

(Guide](http://spark.apache.org/docs/latest/configuration.html),1)

(builds,1)

(comes,1)

(Tools"](http://spark.apache.org/developer-tools.html).,1)

([building,1)

(Python,,2)

(Many,1)

(building,2)

(Running,1)

(from,1)

(way,1)

(Online,1)

(site,,1)

(other,1)

(Example,1)

([Contribution,1)

(analysis.,1)

(sc.parallelize(range(1000)).count(),1)

(you,4)

(runs.,1)

(Building,1)

(higher-level,1)

(protocols,1)

(guidance,2)

(a,8)

(guide,,1)

(name,1)

(fast,1)

(SQL,2)

(that,2)

(will,1)

(IDE,,1)

(to,17)

(get,1)

(,71)

(information,1)

(core,1)

(web,1)

("local[N]",1)

(programs,2)

(option,1)

(MLlib,1)

(["Building,1)

(contributing,1)

(shell:,2)

(instance:,1)

(Scala,,1)

(and,9)

(command,,2)

(package.),1)

(./dev/run-tests,1)

(sample,1)

---------------------

Author: Coder__CS

Source: CSDN

Original: https://blog.csdn.net/coder__cs/article/details/79344839

Copyright notice: This is the blogger's original article; please include a link to the original post when reposting!
