Yarn - executor for spark job

import org.apache.spark.launcher.SparkLauncher;

// Submit the application to YARN in cluster mode; launch() returns
// the spark-submit child process.
Process spark = new SparkLauncher()
        .setAppResource("myApp.jar")
        .setMainClass("com.aa.bb.app")
        .setMaster("yarn")
        .setDeployMode("cluster")
        .addAppArgs(data)
        .launch();

This is how I execute a Spark jar on a YARN cluster. Here are my questions:

  1. Does this run on a single executor? (Is it one spark-submit per YARN executor?)
  2. How should I execute multiple Spark jobs concurrently? (Where should I set dynamic allocation, i.e. spark.dynamicAllocation.enabled? See the sketch after this list.)
  3. Where should I set the number of executors? In the Java code? In a YARN XML file?
  4. If I set the number of executors to 2 and run a single job, will one of the executors sit idle?
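
For context on questions 2 and 3: a minimal sketch of how per-job Spark properties can be passed through SparkLauncher itself rather than through YARN XML. The keys spark.executor.instances, spark.dynamicAllocation.enabled, and spark.shuffle.service.enabled are standard Spark configuration properties; the jar, class, and values are placeholders carried over from the question, not a verified setup.

import org.apache.spark.launcher.SparkLauncher;

Process spark = new SparkLauncher()
        .setAppResource("myApp.jar")
        .setMainClass("com.aa.bb.app")
        .setMaster("yarn")
        .setDeployMode("cluster")
        // Static sizing: fix the executor count for this job only.
        .setConf("spark.executor.instances", "2")
        // Alternatively, let YARN scale executors up and down; dynamic
        // allocation on YARN also requires the external shuffle service.
        .setConf("spark.dynamicAllocation.enabled", "true")
        .setConf("spark.shuffle.service.enabled", "true")
        .launch();

Settings passed this way apply to that single submission, so different jobs launched from the same JVM can use different executor configurations.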

You don't need to do anything for this. It is allocated automatically.
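
To expand on running jobs concurrently (question 2): each launch is submitted as its own YARN application, and YARN allocates containers for each one independently. A sketch under that assumption, using the non-blocking startApplication() method from the same launcher API (the jar and class are again the question's placeholders):

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Each startApplication() submits an independent YARN application,
// so the two jobs below run concurrently and YARN schedules their
// executors separately.
SparkAppHandle jobA = new SparkLauncher()
        .setAppResource("myApp.jar")
        .setMainClass("com.aa.bb.app")
        .setMaster("yarn")
        .setDeployMode("cluster")
        .startApplication();

SparkAppHandle jobB = new SparkLauncher()
        .setAppResource("myApp.jar")
        .setMainClass("com.aa.bb.app")
        .setMaster("yarn")
        .setDeployMode("cluster")
        .startApplication();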