A Complete Guide to Compiling and Building Spark from Source

Source of this article: http://bookbookpicture.spaces.live.com/blog/cns!68F3076C3C3DA5EB!671.entry

This topic is actually covered in great detail on Spark's official site, www.igniterealtime.org, so most of this article is adapted from the English documentation there, supplemented with some personal experience.

Spark source code: download link

  1. Install the JDK
    Not much to say here; just mind the version: at least 1.5 is required, and 1.6 is recommended.
  2. Install Eclipse 3.3
    a) Download Eclipse 3.3 (the edition for Java developers) from the official site.
    b) Assuming you installed Eclipse in C:/Program Files/Eclipse, open that folder and create a desktop shortcut for eclipse.exe. Right-click the shortcut, choose "Properties" to open the properties dialog, and in the "Target" box enter the following:
    "C:\Program Files\Eclipse\eclipse.exe" -vm "C:\Program Files\Java\jdk1.6.0\bin\javaw"
    Anyone familiar with Eclipse will recognize that this tells Eclipse which Java VM to use.
  3. Install the Subversive plugin for Eclipse
    a) Open Eclipse with the shortcut created above; the following steps install the Subversive plugin. Since I use the English edition of Eclipse, the menus below are given in English.
    b) Click Help::Software Updates::Find and Install...
    c) Select Search for new features to install and click Next.
    d) Click the New Remote Site... button.
    e) Enter Subversive in the Name box and http://www.polarion.org/projects/subversive/download/1.1/update-site in the URL box (check http://www.eclipse.org/subversive for the latest Subversive update-site URL).
    f) Click Finish to start installing Subversive. Eclipse will search the site and show the features you can install in the next window. Select the Subversive SVN Team Provider Plugin and everything under Subversive Client Libraries.
    g) Click Next; Eclipse runs the installation. Restart Eclipse when the installation finishes.
  4. Check out the Spark code via SVN
    a) Click Window::Open Perspective::Other...
    b) In the "Open Perspective" dialog that appears, select "SVN Repository Exploring" and click OK.
    c) The Eclipse layout changes. In the "SVN Repositories" panel on the left, right-click and choose New::Repository Location...
    d) In the "New Repository Location" dialog enter http://svn.igniterealtime.org/svn/repos and click "Finish".
    e) The SVN Repositories panel updates. Expand it, find the spark entry, right-click the trunk item under spark, and choose "Check Out" to download the Spark code.
    f) When the download finishes, select Window::Open Perspective::Java. In the Project Explorer panel you will see the Spark project; delete it, choosing "Do not delete contents" in the dialog that pops up. The spark folder in your workspace directory now holds the Spark source code.
    Note: you can skip all of this trouble; the link I provided above lets you download the Spark source code directly, making the steps above unnecessary.
  5. Create the Spark project
    1) Click the Window::Open Perspective::Java menu.
    2) In the Project Explorer window, if there is a spark project, delete it; when asked whether to delete the files, choose not to.
    3) Select File::New::Project..., then Java::Java Project. In the New Java Project window choose "Create project from existing source" and add the folder containing the spark source.
    4) Enter spark in the "Project name" box; it must match the folder name.
    5) Click Finish.
  6. Build Spark
    1) Click Window::Show View::Ant.
    2) Right-click the Ant panel and choose Add Buildfiles.
    3) Expand the spark::build folder, select build.xml, and click "OK".
    4) In the Ant panel, expand Spark and double-click "release". After a while it should report "Build Successful".
  7. Create Project Builder
    1) Click Run::Open Debug Dialog... to bring up the "Run" window.
    2) Select "Java Application" and click the "New" button.
    3) On the "Main" tab, rename New_configuration to Spark or anything else; the name does not matter.
    4) Click the Project::Browse button, select Spark, and click OK.
    5) Click the Main class::Search button, select the class containing main, Startup - org.jivesoftware.launcher, and click OK.
    6) Checking Stop in main is recommended.
    7) Open the Classpath tab and select User Entries so that the Advanced... button becomes enabled, then click Advanced...
    8) In the Advanced Options window that appears, select Add Folders and click OK; in the Folder Selection window select the spark::src::resources folder and click OK.
    9) On the Common tab, check the boxes in front of Debug and Run.
    10) Click Apply, then Close.
  8. Run/Debug
    Click Run::Open Run Dialog..., select Spark in the dialog that appears, and click Run. (A rough command-line equivalent of this launch configuration is sketched after this list.)
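
For reference, the launch configuration created in steps 7 and 8 effectively runs the launcher's main class with src/resources on the classpath. Below is a rough command-line sketch of the same launch; the locations of the compiled classes and of Spark's libraries are assumptions here and depend on where your build put them:

    java -cp "<compiled-classes>;src\resources;<spark-libraries>" org.jivesoftware.launcher.Startup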

Source of the English documentation: http://www.igniterealtime.org/community/docs/DOC-1040

The English documentation follows:

This guide assumes that you are installing everything from scratch. If you've already done some of these steps, the guide may still be useful. I compiled this guide to the best of my knowledge; I apologize if it doesn't work for you.

Notes:

  • This guide assumes that you want the latest updates of the source, i.e. from the project's trunk directory. If you only want a released/stable version, check out the desired release from under the tags directory.


Install JDK

  • Download the JDK and install it. The minimum version is 1.5; I use 1.6. Sorry, no instructions for this.
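
A quick way to confirm which JDK is active before going further (a simple sanity check, assuming the JDK's bin directory is on your PATH):

    java -version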

Install Eclipse 3.3

  • Download Eclipse 3.3 from www.eclipse.org. I use Eclipse IDE for Java EE Developers. You should at least use Eclipse IDE for Java Developers.
  • Extract the downloaded zip file into C:/Program Files/Eclipse.
  • Open the C:/Program Files/Eclipse folder.
  • Right-click and drag eclipse.exe onto your desktop (or Windows taskbar) to create a shortcut icon.
  • Right-click the shortcut icon and choose Properties. The Eclipse Properties window will appear.
  • The Target textbox should read something like this: "C:\Program Files\Eclipse\eclipse.exe" -vm "C:\Program Files\Java\jdk1.6.0\bin\javaw", depending on which JDK you use and where you installed it.
  • Close the Eclipse Properties window.
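
As an alternative to editing the shortcut, the -vm switch can also go in the eclipse.ini file next to eclipse.exe. A minimal sketch, assuming the same JDK path as above; note that -vm and its path must sit on separate lines and appear before any -vmargs entry:

    -vm
    C:\Program Files\Java\jdk1.6.0\bin\javaw.exe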

Install Subversive Plugin

  • Double-click the shortcut icon to start Eclipse.
  • Select/enter your preferred workspace and click OK to open Eclipse main IDE window.
  • Click on the Workbench icon to close the welcome screen.
  • Click Help::Software Updates::Find and Install... menu.
  • Click on Search for new features to install and click Next.
  • Click on the New Remote Site... button.
  • Enter Subversive in the Name box and http://www.polarion.org/projects/subversive/download/1.1/update-site in the URL box (check the latest URL at http://www.eclipse.org/subversive), then click OK.
  • Click Finish to install Subversive. Eclipse will search the update site and show the result in the next window, where you select the features to install. I choose everything under Subversive SVN Team Provider Plugin and Subversive Client Libraries.
  • Click Next to continue and so on until the installation ends. You normally want to restart Eclipse when the installation ends.

Check Out Spark SVN

  • Click the Window::Open Perspective::Other... menu.
  • Click on SVN Repository Exploring in the Open Perspective window and click OK.
  • Right-click on the SVN Repositories screen and choose New::Repository Location...
  • On New Repository Location enter http://svn.igniterealtime.org/svn/repos in the URL box and click Finish. You'll see the URL location in the SVN Repositories screen.
  • Expand the URL location.
  • Expand the spark tree.
  • Right-click on trunk and choose Check Out. Make yourself some mocha while waiting for the checkout to complete.
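
If you prefer a plain command-line Subversion client to the Subversive plugin, the same checkout can be done outside Eclipse. A sketch assuming the repository layout described above, with the Spark trunk under spark/trunk:

    svn checkout http://svn.igniterealtime.org/svn/repos/spark/trunk spark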

Create Spark Project

  • Click Window::Open Perspective::Java menu.
  • In the Project Explorer screen, if there is a spark project, delete it. This project was created during the Spark check-out process. Yes, you read that correctly: DELETE the project!!! Otherwise you'll have to set up your Spark development environment manually. On the Confirm Project Delete dialog choose Do not delete contents, then click Yes.
  • Click File::New::Project... Notice the ellipsis!!!
  • Select Java::Java Project and click Next.
  • On the New Java Project window choose Create project from existing source and browse to where the spark folder is located under your workspace.
  • In the Project name box enter exactly spark; otherwise the Next and Finish buttons remain disabled. Click on Next. Eclipse will read the directory structure to set up the environment (almost) automatically for you, and you can see what it does on the next screen. Then click on Finish.
  • If the Open Associated Perspective window opens, click Yes.

Build Spark

  • Click Window::Show View::Ant menu.
  • Right-click the Ant screen and choose Add Buildfiles...
  • Expand the spark::build folder and select build.xml, then click OK.
  • On the Ant screen, expand Spark and double-click the release Ant task. The build may fail because you're checking out the daily updates of the Spark sources, which may contain bugs. If so, wait another day and hope that the developers discover and fix the bug, or you might dare to fix it yourself. During this first-time setup, a successful build is necessary before you can proceed with the remaining tasks below.
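
The same release target can also be invoked outside Eclipse with a standalone Ant installation. A sketch assuming the build.xml location mentioned above:

    cd spark\build
    ant release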

Create Project Builder

  • Click Run::Open Run Dialog... or Run::Open Debug Dialog... menu. A Run window shows.
  • Select Java Application and click on the New button.
  • On the Main tab of the Run window, change the New_configuration name to Spark or anything you like.
  • Click on the Project::Browse button, select spark, and click OK.
  • Click on the Main class::Search button, select Startup - org.jivesoftware.launcher, and click OK.
  • I'd suggest that you select the Stop in main check box so that you can later verify that debugging works.
  • Click on Classpath tab.
  • Select User Entries so that the Advanced... button will be enabled.
  • Click on the Advanced... button.
  • On the Advanced Options window select Add Folders and click OK.
  • On the Folder Selection window select the spark::src::resources folder and click OK.
  • Click on Common tab.
  • Select the Debug and Run check boxes.
  • Click on Apply button.
  • Click on Close button.

Run/Debug

  • The setting is now complete for Spark.
  • You may test running and debugging by clicking on Run::Run History::Spark and Run::Debug History::Spark respectively. If you choose the latter and follow these instructions closely, execution will stop in the main method of Startup.java.