Yarn - executor for Spark job
import org.apache.spark.launcher.SparkLauncher;

// launch() starts spark-submit as a child process
Process spark = new SparkLauncher()
        .setAppResource("myApp.jar")
        .setMainClass("com.aa.bb.app")
        .setMaster("yarn")
        .setDeployMode("cluster")
        .addAppArgs(data)
        .launch();
This is how I submit my Spark jar to the YARN cluster. Here are some questions.
- Does this run on a single executor? (Is it one spark-submit per YARN executor?)
- How should I execute multiple Spark jobs concurrently? (Where should I set dynamic allocation, i.e. spark.dynamicAllocation.enabled?)
- Where should I set the number of executors? In the Java code? In a YARN XML file?
- If I set the number of executors to 2 and run a single job, will one of the executors do nothing?
You don't need to do anything for this; executors are allocated automatically.
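To address the executor questions more concretely: each launch() is one spark-submit, i.e. one YARN application with its own set of executors, and executor settings are per-application Spark configs, so they go on the launcher (or in spark-defaults.conf), not in YARN's XML files. A minimal sketch, reusing the placeholder jar and class names from the question:

import org.apache.spark.launcher.SparkLauncher;

public class SubmitJob {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("myApp.jar")    // placeholder from the question
                .setMainClass("com.aa.bb.app")  // placeholder from the question
                .setMaster("yarn")
                .setDeployMode("cluster")
                // Option A: fix the executor count for this application.
                .setConf("spark.executor.instances", "2")
                // Option B (instead of A): let Spark scale executors with the
                // workload. On YARN this typically also needs the external
                // shuffle service:
                // .setConf("spark.dynamicAllocation.enabled", "true")
                // .setConf("spark.shuffle.service.enabled", "true")
                .launch();
        spark.waitFor(); // block until the spark-submit child process exits
    }
}

To run several jobs concurrently, invoke the launcher once per job (e.g. from separate threads); SparkLauncher also offers startApplication(...), which returns a SparkAppHandle for tracking the application's state instead of a raw Process.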