Spark(4)Deal with Mesos

5. Running Spark with Mesos
I am an old developer, so I already have Java, Scala, and Spark on my local machine.
>java -version
java version "1.6.0_51"
Java(TM) SE Runtime Environment (build 1.6.0_51-b11-456-11M4508)
Java HotSpot(TM) 64-Bit Server VM (build 20.51-b01-456, mixed mode)
 
>scala -version
Scala code runner version 2.10.1 -- Copyright 2002-2013, LAMP/EPFL

Spark is built from the scala-2.10 branch:
>git status
# On branch scala-2.10

>git remote -v
origin     https://github.com/mesos/spark.git (fetch)
origin     https://github.com/mesos/spark.git (push)

5.1 Mesos
The latest Mesos version right now is 0.12.0, but the Spark website documentation is based on 0.9.0. I decided to use the latest Mesos version while following the documentation written for 0.9.0. Haha.

Download mesos-0.12.0-incubating.tar.gz

Unzip this file, place it in my working directory, and follow the README file.
>./configure

Error Message:
conftest.cpp:7: warning: 'JNI_CreateJavaVM' is deprecated (declared at /System/Library/Frameworks/JavaVM.framework/Headers/jni.h:1937)
configure: error: failed to build with JNI

Solution:
I do not quite understand this error, so I used this command instead:
>./configure --disable-java

>make
>make check
>sudo make install

The make check step failed, but make install seemed to succeed. However, I saw nothing under /usr/local/mesos. Let me do it again:
>mkdir build
>cd build
>../configure --with-python-headers=/usr/include/python2.6 --with-java-home=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home --with-java-headers=/System/Library/Frameworks/JavaVM.framework/Headers --with-webui --with-included-zookeeper --prefix=/Users/carl/tool/mesos-0.12.0

The JNI headers are here:
/System/Library/Frameworks/JavaVM.framework/Headers
JAVA_HOME is here:
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home

>../configure --with-python-headers=/usr/include/python2.6 --with-java-home=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home --with-java-headers=/System/Library/Frameworks/JavaVM.framework/Headers --with-webui --with-included-zookeeper --prefix=/Users/carl/tool/mesos-0.12.0 --disable-java
>make
>make install

It seems the files are now installed under /Users/carl/tool/mesos-0.12.0.

Start Mesos to verify my installation.
Start the Master
>sbin/mesos-master.sh

Start the Slave
>sbin/mesos-slave.sh --master=127.0.0.1:5050

Watch it from the web UI:
http://localhost:5050/

5.2 Start Mesos in Cluster mode
On the master node
List all the masters and slaves in the configuration files
>cd /Users/carl/tool/mesos-0.12.0/var/mesos/deploy
>cat masters
localhost
>cat slaves
localhost

On the slave node
>cd /Users/carl/tool/mesos-0.12.0/var/mesos/conf
>cp mesos.conf.template mesos.conf
>cat mesos.conf
master=localhost:5050 

Start the server
>sbin/mesos-start-cluster.sh

It works well.

Stop the server
>sbin/mesos-stop-cluster.sh

5.3 Spark On Mesos
Changes in the Spark configuration (conf/spark-env.sh):
>cd /opt/spark/conf
MESOS_NATIVE_LIBRARY=/Users/carl/tool/mesos-0.12.0/lib/libmesos.dylib
SCALA_HOME=/opt/scala2.10.0

Change the Spark context master as follows:
var sparkMaster = "mesos://localhost:5050"
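
In the Spark build used here (the pre-1.0 mesos/spark branch, where the package is simply "spark"), the master URL is passed as the first constructor argument of the context. A minimal driver sketch pointed at Mesos might look like the following; the object name, job name, and the tiny counting job are placeholders of mine, not code from this post:

import spark.SparkContext   // pre-1.0 package name in this branch

// Minimal driver sketch (assumption: pre-1.0 constructor SparkContext(master, jobName, sparkHome, jars)).
object MesosHello {
  def main(args: Array[String]): Unit = {
    val sparkMaster = "mesos://localhost:5050"
    val sc = new SparkContext(sparkMaster, "MesosHello", System.getenv("SPARK_HOME"), Nil)
    // Run a trivial job to confirm tasks actually get scheduled on Mesos.
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    println("even numbers counted: " + evens)
    sc.stop()
  }
}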

Run the Spark job. But I get an error message as follows:

Error Message:
Failed to load native Mesos library from .:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
[error] (run-main) java.lang.UnsatisfiedLinkError: no mesos in java.library.path
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
     at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1758)
     at java.lang.Runtime.loadLibrary0(Runtime.java:823) 

Solution:
Haha, it is easy to understand: I am using Spark ---> Scala ---> Java, but I passed --disable-java during the configure step. That is why I am in pain here.
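
To make the failure easier to reason about, here is a minimal Scala sketch (my own illustration, not Spark's or Mesos' actual code) of how the native library gets picked up: either from an absolute path in MESOS_NATIVE_LIBRARY, or by System.loadLibrary searching java.library.path and the Java Extensions directories listed in the error message above.

// Sketch of the native-library loading mechanism (my own illustration).
object NativeMesosLoadCheck {
  def main(args: Array[String]): Unit = {
    println("java.library.path = " + System.getProperty("java.library.path"))
    try {
      sys.env.get("MESOS_NATIVE_LIBRARY") match {
        case Some(path) => System.load(path)           // absolute path to libmesos.dylib
        case None       => System.loadLibrary("mesos") // searches java.library.path / Extensions dirs
      }
      println("libmesos loaded")
    } catch {
      case e: UnsatisfiedLinkError => println("Failed to load libmesos: " + e.getMessage)
    }
  }
}

If this check fails, either the library was never built (the --disable-java case above) or it is not on any of the searched paths.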

Taking a look at these documents, I figured out a way forward: update the JDK to 1.7.
https://github.com/airbnb/chronos/blob/master/docs/FAQ.md
http://qnalist.com/questions/2020253/problem-building-on-mac-os
http://docs.oracle.com/javase/7/docs/webnotes/install/mac/mac-jdk.html

I downloaded the JDK from Oracle: jdk-7u25-macosx-x64.dmg
>java -version
java version "1.7.0_25"

>export JAVA_HOME=$(/usr/libexec/java_home -v 1.7)
And here is the JAVA_HOME
>echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home
 
Configure Mesos like this (the JDK 7 home is the one reported by /usr/libexec/java_home above):
>../configure --with-python-headers=/usr/include/python2.6 --with-java-home=/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home --with-java-headers=/System/Library/Frameworks/JavaVM.framework/Headers --with-webui --with-included-zookeeper --prefix=/Users/carl/tool/mesos-0.12.0

Good, it is working.

It seems that using the latest Mesos version is a bad idea. I will try 0.9.0-incubating instead.
Get the file from here: http://download.nextag.com/apache/incubator/mesos/mesos-0.9.0-incubating/
>../configure --with-python-headers=/usr/include/python2.6 --with-java-home=/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home --with-java-headers=/System/Library/Frameworks/JavaVM.framework/Headers --with-webui --with-included-zookeeper --prefix=/Users/carl/tool/mesos-0.9.0
>make
>make install 

>sbin/mesos-master

Then the web UI is at http://localhost:8080

>sbin/mesos-slave --master=127.0.0.1:5050

Then the web UI is at http://localhost:8081

I still got the error message:
Failed to load native Mesos library from /Users/carl/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
[error] (run-main) java.lang.UnsatisfiedLinkError: no mesos in java.library.path
java.lang.UnsatisfiedLinkError: no mesos in java.library.path

Solution:
The extensions directories on the Java library path are:
/Library/Java/Extensions
/System/Library/Java/Extensions
and the JDK home is:
/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home

Try to create a soft link to the Mesos native library:
>sudo ln -s /Users/carl/tool/mesos-0.9.0/lib/libmesos.dylib /Library/Java/Extensions/libmesos.dylib

The problem may not be caused by the Mesos version after all; I just need to soft link the dylib into the Java extensions directory:
>sudo ln -s /Users/carl/tool/mesos-0.12.0/lib/libmesos.dylib /Library/Java/Extensions/libmesos.dylib

It is working, and we can see the logs from the web UI. But I got some error messages like this:
Error Message:
13/07/19 16:38:43 ERROR executor.Executor: Exception in task ID 1
java.io.OptionalDataException
     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1368)
     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
     at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)

Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10 seconds]
     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:96)
     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:100)
     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
     at scala.concurrent.Await$.result(package.scala:107)
     at spark.storage.BlockManagerMaster.askDriverWithReply(BlockManagerMaster.scala:135)

Solution:
Regarding the Mesos version, the Mesos dependency is declared in:
SPARK_HOME/project/SparkBuild.scala
SPARK_HOME/pom.xml

But I found this works:
>./run spark.examples.SparkPi mesos://localhost:5050 

I did not rebuild Spark against Mesos 0.12.0. The root cause of the first error message is that the Scala version on my command line was 2.10.1; I switched to 2.10.0, and it is OK now.
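
As a quick sanity check for this kind of mismatch, a throwaway Scala snippet like the one below (my own, not from this post) prints the Scala library version actually on the classpath, which can be compared with what "scala -version" reports on the command line:

// Print the Scala library version the running code links against.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println("Scala library: " + scala.util.Properties.versionString)
  }
}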

But I notice that the second error message is still there:
13/07/19 17:21:58 WARN storage.BlockManagerMaster: Error sending message to BlockManagerMaster in 1 attempts
akka.pattern.AskTimeoutException: Timed out
     at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:312)
     at akka.actor.DefaultScheduler$$anon$8.run(Scheduler.scala:191)
     at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:137)
     at akka.dispatch.ForkJoinExecutorConfigurator$MailboxExecutionTask.exec(AbstractDispatcher.scala:506)
     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:262)
     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:975)
     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1478)
     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
13/07/19 17:22:11 WARN storage.BlockManagerMaster: Error sending message to BlockManagerMaster in 2 attempts
java.util.concurrent.TimeoutException: Futures timed out after [10 seconds]
     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:96)
     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:100)
     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)

Solution:
I just tried this: update the Mesos version in pom.xml and project/SparkBuild.scala, update the Spark source code, and build Spark again.
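
For reference, the change is roughly the dependency line below. This is only a sketch: the exact surrounding code in project/SparkBuild.scala differs, the same version has to be changed in pom.xml, and the bumped version string is an assumption that only works if that Mesos artifact is actually published to a reachable repository.

// In project/SparkBuild.scala (sbt build definition) -- sketch only:
libraryDependencies ++= Seq(
  // "org.apache.mesos" % "mesos" % "0.9.0-incubating"  // old value
  "org.apache.mesos" % "mesos" % "0.12.0-incubating"     // assumed new value
)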

Error Message:
Unexpected NOT to have spark.hostPort set

Solution:
I tried rolling back to Mesos 0.9.0 and recompiling Spark, but the error message still exists, so it should come from the Spark source code itself. I may rebuild and dig into it several days later.
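
One hedged idea to try when I come back to this is sketched below. The property name comes straight from the error message, but the explicit value is my own guess, and I have not verified that setting it manually is safe for this build.

// Hedged workaround sketch (my own idea, not verified against this build):
// spark.hostPort is normally set internally by Spark; forcing it as a Java
// system property before the SparkContext is created might get past the assertion.
object HostPortWorkaround {
  def main(args: Array[String]): Unit = {
    // hypothetical host:port value -- should match the driver's reachable address
    System.setProperty("spark.hostPort", "127.0.0.1:50030")
    // ... then create the SparkContext as usual ...
  }
}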

6. Configure Mesos to Work with Zookeeper
Coming soon...

References:
Running Configuration
http://spark-project.org/docs/latest/running-on-yarn.html
http://spark-project.org/docs/latest/running-on-mesos.html
http://spark-project.org/docs/latest/spark-standalone.html

spark
http://sillycat.iteye.com/blog/1871204

mesos
http://jm.taobao.org/2012/06/13/spark%E7%B3%BB%E5%88%97-%E5%AE%89%E8%A3%85/
http://mesos.apache.org/
https://github.com/apache/incubator-mesos/blob/trunk/docs/Home.md
https://github.com/apache/incubator-mesos/blob/79cbe52160cf58d58332e0e0ec609fd080bea151/docs/Configuration.textile
https://github.com/apache/incubator-mesos/blob/79cbe52160cf58d58332e0e0ec609fd080bea151/docs/Deploy-Scripts.textile
http://comments.gmane.org/gmane.comp.lang.scala.spark.user/1728

