With both the environment variable SPARK_PRINT_LAUNCH_COMMAND set and the --verbose flag passed, spark-submit.sh prints a much more detailed account of the Spark command it launches:
hadoop@GZsw04:~/spark/spark-1.4.1-bin-hadoop2.4$ spark-submit --master yarn --verbose --class org.apache.spark.examples.JavaWordCount lib/spark-examples-1.4.1-hadoop2.4.0-my.jar RELEASE
Spark Command: /usr/local/jdk/jdk1.6.0_31/bin/java -cp /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-1.4.1-hadoop2.4.0.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/usr/local/hadoop/hadoop-2.5.2/etc/hadoop/ -Xms6g -Xmx6g -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master yarn --class org.apache.spark.examples.JavaWordCount --verbose lib/spark-examples-1.4.1-hadoop2.4.0-my.jar RELEASE
========================================
Using properties file: /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/spark-defaults.conf
Adding default property: spark.executor.extraJavaOptions=-Xloggc:~/spark-executor.gc -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=2 -XX:CMSInitiatingOccupancyFraction=65 -XX:+UseCMSInitiatingOccupancyOnly -XX:PermSize=64m -XX:MaxPermSize=256m -XX:NewRatio=5 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:ParallelGCThreads=5
Adding default property: spark.eventLog.enabled=true
Adding default property: spark.ui.port=7106
Adding default property: spark.cores.max=50
Adding default property: spark.storage.memoryFraction=0.5
Adding default property: spark.driver.memory=6g
Adding default property: spark.worker.ui.port=7105
Adding default property: spark.master.ui.port=7102
Adding default property: spark.executor.memory=2g
Adding default property: spark.eventLog.dir=/home/hadoop/spark/spark-eventlog
Adding default property: spark.executor.cores=2
Adding default property: spark.driver.allowMultipleContexts=true
Parsed arguments:
  master                  yarn
  deployMode              null
  executorMemory          2g
  executorCores           2
  totalExecutorCores      50
  propertiesFile          /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/spark-defaults.conf
  driverMemory            6g
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               org.apache.spark.examples.JavaWordCount
  primaryResource         file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0-my.jar
  name                    org.apache.spark.examples.JavaWordCount
  childArgs               [RELEASE]
  jars                    null
  packages                null
  repositories            null
  verbose                 true
Spark properties used, including those specified through
--conf and those from the properties file /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/spark-defaults.conf:
  spark.driver.memory -> 6g
  spark.executor.memory -> 2g
  spark.eventLog.enabled -> true
  spark.driver.allowMultipleContexts -> true
  spark.cores.max -> 50
  spark.ui.port -> 7106
  spark.executor.extraJavaOptions -> -Xloggc:~/spark-executor.gc -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=2 -XX:CMSInitiatingOccupancyFraction=65 -XX:+UseCMSInitiatingOccupancyOnly -XX:PermSize=64m -XX:MaxPermSize=256m -XX:NewRatio=5 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:ParallelGCThreads=5
  spark.eventLog.dir -> /home/hadoop/spark/spark-eventlog
  spark.worker.ui.port -> 7105
  spark.storage.memoryFraction -> 0.5
  spark.master.ui.port -> 7102
  spark.executor.cores -> 2
Main class:
org.apache.spark.examples.JavaWordCount
Arguments:
RELEASE
System properties:
spark.driver.memory -> 6g
spark.executor.memory -> 2g
spark.eventLog.enabled -> true
spark.driver.allowMultipleContexts -> true
spark.cores.max -> 50
SPARK_SUBMIT -> true
spark.ui.port -> 7106
spark.executor.extraJavaOptions -> -Xloggc:~/spark-executor.gc -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=2 -XX:CMSInitiatingOccupancyFraction=65 -XX:+UseCMSInitiatingOccupancyOnly -XX:PermSize=64m -XX:MaxPermSize=256m -XX:NewRatio=5 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:ParallelGCThreads=5
spark.app.name -> org.apache.spark.examples.JavaWordCount
spark.jars -> file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0-my.jar
spark.eventLog.dir -> /home/hadoop/spark/spark-eventlog
spark.worker.ui.port -> 7105
spark.master -> yarn-client
spark.executor.cores -> 2
spark.master.ui.port -> 7102
spark.storage.memoryFraction -> 0.5
Classpath elements:
file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0-my.jar
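For reference, a minimal way to reproduce this kind of output (a sketch, assuming the same installation layout and example jar shown above, with spark-submit on the PATH) is to export the environment variable before running the job with --verbose:

# Make the launcher print the final "Spark Command: ..." java invocation
export SPARK_PRINT_LAUNCH_COMMAND=1

# --verbose makes SparkSubmit additionally dump the parsed arguments, the Spark
# properties it picked up, the system properties, and the classpath elements
spark-submit \
  --master yarn \
  --verbose \
  --class org.apache.spark.examples.JavaWordCount \
  lib/spark-examples-1.4.1-hadoop2.4.0-my.jar RELEASE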
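The "Adding default property" lines are read straight from conf/spark-defaults.conf, which holds whitespace-separated key/value pairs. A partial sketch of that file, reconstructed from the values reported above (the remaining keys follow the same pattern):

# conf/spark-defaults.conf (partial sketch, values taken from the verbose output above)
spark.driver.memory            6g
spark.executor.memory          2g
spark.executor.cores           2
spark.cores.max                50
spark.storage.memoryFraction   0.5
spark.eventLog.enabled         true
spark.eventLog.dir             /home/hadoop/spark/spark-eventlog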