How to resolve "Error running job streaming job" in a Spark Streaming program?

I set up one master and one worker on Spark. I then created a Spark Streaming job and tried to submit it, but the master shows a long list of Java errors.

I started the master with one command and the worker with another, and then tried to submit the Spark Streaming job with different parameters.
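
For reference, a typical standalone master/worker start and streaming submit on Windows looks roughly like the sketch below; the host name, main class, and jar path are taken from the logs that follow, and everything else is an assumption rather than my exact command line:

    REM Hypothetical reconstruction of the setup -- not the exact commands used
    REM Start the standalone master (default port 7077)
    bin\spark-class org.apache.spark.deploy.master.Master

    REM Start a worker and register it with the master
    bin\spark-class org.apache.spark.deploy.worker.Worker spark://GGN1037742.bsci.bossci.com:7077

    REM Submit the streaming job (the main class is a guess based on SparkHome.java in the stack traces)
    bin\spark-submit --class SparkHome --master spark://GGN1037742.bsci.bossci.com:7077 C:\Data\SharkTank\com.rba.boston\target\RBA-jar-with-dependencies.jar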

The logs after submitting the job are:

    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Data/Softwares/spark-2.4.3-bin-hadoop2.6/jars/spark-unsafe_2.11-2.4.3.jar) to method java.nio.Bits.unaligned()
    WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    SparkHome
    19/08/01 15:16:06 INFO SparkContext: Running Spark version 2.4.3
    19/08/01 15:16:06 INFO SparkContext: Submitted application: NetworkWordCount
    19/08/01 15:16:06 INFO SecurityManager: Changing view acls to: SachdeJ
    19/08/01 15:16:06 INFO SecurityManager: Changing modify acls to: SachdeJ
    19/08/01 15:16:06 INFO SecurityManager: Changing view acls groups to:
    19/08/01 15:16:06 INFO SecurityManager: Changing modify acls groups to:
    19/08/01 15:16:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(SachdeJ); groups with view permissions: Set(); users  with modify permissions: Set(SachdeJ); groups with modify permissions: Set()
    19/08/01 15:16:07 INFO Utils: Successfully started service 'sparkDriver' on port 54035.
    19/08/01 15:16:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
    19/08/01 15:16:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
    19/08/01 15:16:07 INFO DiskBlockManager: Created local directory at C:\Users\sachdej\AppData\Local\Temp\blockmgr-cccaa7a1-cfdc-45fb-8ff5-9a800def11ff
    19/08/01 15:16:07 INFO MemoryStore: MemoryStore started with capacity 434.4 MB
    19/08/01 15:16:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    19/08/01 15:16:08 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://GGN1037742.bsci.bossci.com:4040
    19/08/01 15:16:08 INFO SparkContext: Added JAR file:/C:/Data/SharkTank/com.rba.boston/target/RBA-jar-with-dependencies.jar at spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar with timestamp 1564652768180
    19/08/01 15:16:08 INFO Executor: Starting executor ID driver on host localhost
    19/08/01 15:16:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54057.
    19/08/01 15:16:08 INFO NettyBlockTransferService: Server created on GGN1037742.bsci.bossci.com:54057
    19/08/01 15:16:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
    19/08/01 15:16:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO BlockManagerMasterEndpoint: Registering block manager GGN1037742.bsci.bossci.com:54057 with 434.4 MB RAM, BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO ReceiverTracker: Starting 1 receivers
    19/08/01 15:16:08 INFO ReceiverTracker: ReceiverTracker started
    19/08/01 15:16:08 INFO SocketInputDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO SocketInputDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO SocketInputDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO SocketInputDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO SocketInputDStream: Initialized and validated org.apache.spark.streaming.dstream.SocketInputDStream@628a467a
    19/08/01 15:16:08 INFO FlatMappedDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO FlatMappedDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO FlatMappedDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO FlatMappedDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO FlatMappedDStream: Initialized and validated org.apache.spark.streaming.dstream.FlatMappedDStream@4013fae0
    19/08/01 15:16:08 INFO MappedDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO MappedDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO MappedDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO MappedDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@2e5a119c
    19/08/01 15:16:08 INFO ShuffledDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO ShuffledDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO ShuffledDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO ShuffledDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO ShuffledDStream: Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@2141cddb
    19/08/01 15:16:08 INFO ForEachDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO ForEachDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO ForEachDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO ForEachDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@1148f61f
    19/08/01 15:16:08 INFO RecurringTimer: Started timer for JobGenerator at time 1564652769000
    19/08/01 15:16:08 INFO JobGenerator: Started JobGenerator at 1564652769000 ms
    19/08/01 15:16:08 INFO JobScheduler: Started JobScheduler
    19/08/01 15:16:08 INFO ReceiverTracker: Receiver 0 started
    19/08/01 15:16:08 INFO StreamingContext: StreamingContext started
    19/08/01 15:16:08 INFO DAGScheduler: Got job 0 (start at SparkHome.java:40) with 1 output partitions
    19/08/01 15:16:08 INFO DAGScheduler: Final stage: ResultStage 0 (start at SparkHome.java:40)
    19/08/01 15:16:08 INFO DAGScheduler: Parents of final stage: List()
    19/08/01 15:16:08 INFO DAGScheduler: Missing parents: List()
    19/08/01 15:16:09 INFO DAGScheduler: Submitting ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:614), which has no missing parents
    19/08/01 15:16:09 INFO JobScheduler: Added jobs for time 1564652769000 ms
    19/08/01 15:16:09 INFO JobScheduler: Starting job streaming job 1564652769000 ms.0 from job set of time 1564652769000 ms
    19/08/01 15:16:09 INFO JobScheduler: Finished job streaming job 1564652769000 ms.0 from job set of time 1564652769000 ms
    19/08/01 15:16:09 ERROR JobScheduler: Error running job streaming job 1564652769000 ms.0
    java.lang.IllegalArgumentException: Unsupported class file major version 56
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
            at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
            at org.apache.spark.util.FieldAccessFinder$$anon$$anonfun$visitMethodInsn.apply(ClosureCleaner.scala:517)
            at org.apache.spark.util.FieldAccessFinder$$anon$$anonfun$visitMethodInsn.apply(ClosureCleaner.scala:500)
            at scala.collection.TraversableLike$WithFilter$$anonfun$foreach.apply(TraversableLike.scala:733)
            at scala.collection.mutable.HashMap$$anon$$anonfun$foreach.apply(HashMap.scala:134)
            at scala.collection.mutable.HashMap$$anon$$anonfun$foreach.apply(HashMap.scala:134)
            at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
            at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
            at scala.collection.mutable.HashMap$$anon.foreach(HashMap.scala:134)
            at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
            at org.apache.spark.util.FieldAccessFinder$$anon.visitMethodInsn(ClosureCleaner.scala:500)
            at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
            at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean.apply(ClosureCleaner.scala:307)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean.apply(ClosureCleaner.scala:306)
            at scala.collection.immutable.List.foreach(List.scala:392)
            at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
            at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
            at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
            at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
            at org.apache.spark.rdd.RDD$$anonfun$take.apply(RDD.scala:1364)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
            at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
            at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$$anonfun$foreachFunc.apply(DStream.scala:735)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$$anonfun$foreachFunc.apply(DStream.scala:734)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$$anonfun$apply$mcV$sp.apply$mcV$sp(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$$anonfun$apply$mcV$sp.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$$anonfun$apply$mcV$sp.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun.apply$mcV$sp(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun.apply(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun.apply(ForEachDStream.scala:50)
            at scala.util.Try$.apply(Try.scala:192)
            at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run.apply$mcV$sp(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run.apply(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run.apply(JobScheduler.scala:257)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
            at java.base/java.lang.Thread.run(Thread.java:835)
    Exception in thread "main" java.lang.IllegalArgumentException: Unsupported class file major version 56
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
            at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
            at org.apache.spark.util.FieldAccessFinder$$anon$$anonfun$visitMethodInsn.apply(ClosureCleaner.scala:517)
            at org.apache.spark.util.FieldAccessFinder$$anon$$anonfun$visitMethodInsn.apply(ClosureCleaner.scala:500)
            at scala.collection.TraversableLike$WithFilter$$anonfun$foreach.apply(TraversableLike.scala:733)
            at scala.collection.mutable.HashMap$$anon$$anonfun$foreach.apply(HashMap.scala:134)
            at scala.collection.mutable.HashMap$$anon$$anonfun$foreach.apply(HashMap.scala:134)
            at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
            at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
            at scala.collection.mutable.HashMap$$anon.foreach(HashMap.scala:134)
            at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
            at org.apache.spark.util.FieldAccessFinder$$anon.visitMethodInsn(ClosureCleaner.scala:500)
            at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
            at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean.apply(ClosureCleaner.scala:307)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean.apply(ClosureCleaner.scala:306)
            at scala.collection.immutable.List.foreach(List.scala:392)
            at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
            at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
            at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
            at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
            at org.apache.spark.rdd.RDD$$anonfun$take.apply(RDD.scala:1364)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
            at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
            at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$$anonfun$foreachFunc.apply(DStream.scala:735)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$$anonfun$foreachFunc.apply(DStream.scala:734)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$$anonfun$apply$mcV$sp.apply$mcV$sp(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$$anonfun$apply$mcV$sp.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$$anonfun$apply$mcV$sp.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun.apply$mcV$sp(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun.apply(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun.apply(ForEachDStream.scala:50)
            at scala.util.Try$.apply(Try.scala:192)
            at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run.apply$mcV$sp(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run.apply(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run.apply(JobScheduler.scala:257)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
            at java.base/java.lang.Thread.run(Thread.java:835)
    19/08/01 15:16:09 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 46.6 KB, free 434.4 MB)
    19/08/01 15:16:09 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
    19/08/01 15:16:09 INFO ReceiverTracker: Sent stop signal to all 1 receivers
    19/08/01 15:16:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 16.0 KB, free 434.3 MB)
    19/08/01 15:16:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on GGN1037742.bsci.bossci.com:54057 (size: 16.0 KB, free: 434.4 MB)
    19/08/01 15:16:09 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
    19/08/01 15:16:09 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:614) (first 15 tasks are for partitions Vector(0))
    19/08/01 15:16:09 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
    19/08/01 15:16:09 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8459 bytes)
    19/08/01 15:16:09 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
    19/08/01 15:16:09 INFO Executor: Fetching spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar with timestamp 1564652768180
    19/08/01 15:16:09 INFO TransportClientFactory: Successfully created connection to GGN1037742.bsci.bossci.com/10.220.45.105:54035 after 31 ms (0 ms spent in bootstraps)
    19/08/01 15:16:09 INFO Utils: Fetching spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar to C:\Users\sachdej\AppData\Local\Temp\spark-5b831314-0918-401a-88d8-1c6a135d554c\userFiles-99183ed0-5ef0-4d7d-8ea1-8e928e026847\fetchFileTemp6554652685231905365.tmp
    19/08/01 15:16:09 INFO Executor: Adding file:/C:/Users/sachdej/AppData/Local/Temp/spark-5b831314-0918-401a-88d8-1c6a135d554c/userFiles-99183ed0-5ef0-4d7d-8ea1-8e928e026847/RBA-jar-with-dependencies.jar to class loader
    19/08/01 15:16:09 INFO RecurringTimer: Started timer for BlockGenerator at time 1564652770000
    19/08/01 15:16:09 INFO BlockGenerator: Started BlockGenerator
    19/08/01 15:16:09 INFO BlockGenerator: Started block pushing thread
    19/08/01 15:16:09 INFO ReceiverSupervisorImpl: Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0:
    19/08/01 15:16:09 WARN ReceiverSupervisorImpl: Skip stopping receiver because it has not yet stared
    19/08/01 15:16:09 INFO BlockGenerator: Stopping BlockGenerator
    19/08/01 15:16:10 INFO RecurringTimer: Stopped timer for BlockGenerator after time 1564652770200
    19/08/01 15:16:10 INFO BlockGenerator: Waiting for block pushing thread to terminate
    19/08/01 15:16:10 INFO BlockGenerator: Pushing out the last 0 blocks
    19/08/01 15:16:10 INFO BlockGenerator: Stopped block pushing thread
    19/08/01 15:16:10 INFO BlockGenerator: Stopped BlockGenerator
    19/08/01 15:16:10 INFO ReceiverSupervisorImpl: Waiting for receiver to be stopped
    19/08/01 15:16:10 INFO ReceiverSupervisorImpl: Stopped receiver without error
    19/08/01 15:16:10 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 794 bytes result sent to driver
    19/08/01 15:16:10 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 965 ms on localhost (executor driver) (1/1)
    19/08/01 15:16:10 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    19/08/01 15:16:10 INFO DAGScheduler: ResultStage 0 (start at SparkHome.java:40) finished in 1.229 s
    19/08/01 15:16:10 INFO ReceiverTracker: All of the receivers have deregistered successfully
    19/08/01 15:16:10 INFO ReceiverTracker: ReceiverTracker stopped
    19/08/01 15:16:10 INFO JobGenerator: Stopping JobGenerator immediately
    19/08/01 15:16:10 INFO RecurringTimer: Stopped timer for JobGenerator after time 1564652769000
    19/08/01 15:16:10 INFO JobGenerator: Stopped JobGenerator
    19/08/01 15:16:10 INFO JobScheduler: Stopped JobScheduler
    19/08/01 15:16:10 INFO StreamingContext: StreamingContext stopped successfully
    19/08/01 15:16:10 INFO SparkContext: Invoking stop() from shutdown hook
    19/08/01 15:16:10 INFO SparkUI: Stopped Spark web UI at http://GGN1037742.bsci.bossci.com:4040
    19/08/01 15:16:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    19/08/01 15:16:10 INFO MemoryStore: MemoryStore cleared
    19/08/01 15:16:10 INFO BlockManager: BlockManager stopped
    19/08/01 15:16:10 INFO BlockManagerMaster: BlockManagerMaster stopped
    19/08/01 15:16:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    19/08/01 15:16:10 INFO SparkContext: Successfully stopped SparkContext

The master logs after submitting the job are:

    19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
            at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
            at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
            at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
            at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
            at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
            at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$$anonfun$apply.apply(NettyRpcEnv.scala:271)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize.apply(NettyRpcEnv.scala:270)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
            at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
            at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
            at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
            at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
            at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
    19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
            at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
            at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
            at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
            at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
            at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
            at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$$anonfun$apply.apply(NettyRpcEnv.scala:271)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize.apply(NettyRpcEnv.scala:270)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
            at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
            at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
            at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
            at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
            at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
    19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
            at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
            at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
            at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)

An error like

    java.lang.IllegalArgumentException: Unsupported class file major version 56

means that the class was compiled with a newer version of Java than the one you are trying to run it on.

Version 56 means your code was compiled with Java 12, and I don't know which Java version your Spark installation is running on. According to this ticket, it looks like Spark isn't even compatible with Java 11 yet: https://issues.apache.org/jira/browse/SPARK-24417
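
You can confirm which Java version a class file targets with javap; for example (the class file path below is an assumption, adjust it to wherever your build output lives):

    REM "major version: 56" corresponds to Java 12; "major version: 52" corresponds to Java 8
    javap -verbose target\classes\SparkHome.class | findstr "major"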

The solution is to compile your code with an earlier JDK; Java 8 is the safe choice for Spark 2.4.x.
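
A minimal sketch of doing that on Windows, assuming the jar is built with Maven and a JDK 8 is installed at the path shown (both the Maven build and the install path are assumptions):

    REM Point the build at a JDK 8 installation -- the path below is an assumption
    set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_202"
    set "PATH=%JAVA_HOME%\bin;%PATH%"

    REM Rebuild the fat jar with the older compiler, then resubmit the job
    mvn clean package

Alternatively, keep the newer JDK installed and pin the Maven compiler plugin's source/target to 1.8. Either way, given SPARK-24417, it is probably also worth running spark-submit, the master, and the worker on Java 8 rather than Java 11 or 12.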