Could not find or load main class Spark Docker

I built 2 separate jar files with different main classes, KafkaCheckinsProducer and SparkConsumer; both are objects with a main method. A bash script launches one of the jar files depending on an argument, and a Dockerfile runs that bash script. I start the container with this command:

docker run -v myvolume:/workdir built-image-name 

and then I get this error message:

Error: Could not find or load main class consumer.SparkConsumer

What could cause this error, and how can I fix my Dockerfile or build.sbt?

Here is my Dockerfile:

FROM java:8
ARG ARG_CLASS
ENV MAIN_CLASS $ARG_CLASS
ENV SCALA_VERSION 2.11.8
ENV SBT_VERSION 1.1.1
ENV SPARK_VERSION 2.2.0
ENV SPARK_DIST spark-$SPARK_VERSION-bin-hadoop2.6
ENV SPARK_ARCH $SPARK_DIST.tgz

WORKDIR /opt

# Install Scala
RUN \
  cd /root && \
  curl -o scala-$SCALA_VERSION.tgz http://downloads.typesafe.com/scala/$SCALA_VERSION/scala-$SCALA_VERSION.tgz && \
  tar -xf scala-$SCALA_VERSION.tgz && \
  rm scala-$SCALA_VERSION.tgz && \
  echo >> /root/.bashrc && \
  echo 'export PATH=~/scala-$SCALA_VERSION/bin:$PATH' >> /root/.bashrc

# Install SBT
RUN \
  curl -L -o sbt-$SBT_VERSION.deb https://dl.bintray.com/sbt/debian/sbt-$SBT_VERSION.deb && \
  dpkg -i sbt-$SBT_VERSION.deb && \
  rm sbt-$SBT_VERSION.deb


# Install Spark
RUN \
    cd /opt && \
    curl -o $SPARK_ARCH http://d3kbcqa49mib13.cloudfront.net/$SPARK_ARCH && \
    tar xvfz $SPARK_ARCH && \
    rm $SPARK_ARCH && \
    echo 'export PATH=$SPARK_DIST/bin:$PATH' >> /root/.bashrc


EXPOSE 9851 9852 4040 9092 9200 9300 5601 7474 7687 7473

VOLUME /workdir

CMD /workdir/runDemo.sh "$MAIN_CLASS" 
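One detail worth noting in this Dockerfile: MAIN_CLASS comes from the build argument ARG_CLASS, so its value is baked in at `docker build` time, not at `docker run` time. A minimal sketch of what happens if the image is built without `--build-arg ARG_CLASS=...` (variable names are the Dockerfile's own; the error text mirrors the script below):

```shell
# Sketch of the failure mode: with no --build-arg, ENV MAIN_CLASS $ARG_CLASS
# leaves MAIN_CLASS empty, so the script's comparisons never match and it
# falls through to the error branch.
MAIN_CLASS=""
if [ "$MAIN_CLASS" = "consumer" ]; then
  msg="starting consumer"
elif [ "$MAIN_CLASS" = "producer" ]; then
  msg="starting producer"
else
  msg="Wrong parameter. It should be consumer or producer, but it is '$MAIN_CLASS'"
fi
echo "$msg"
```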

The bash script looks like this:

#!/usr/bin/env bash
if [ "" = "consumer" ]
then
    java -cp "target/scala-2.11/demo_consumer.jar" consumer.SparkConsumer   
elif [ "" = "producer" ]
then
    java -cp "target/scala-2.11/full_demo_producer.jar" producer.KafkaCheckinsProducer    
else
    echo "Wrong parameter. It should be consumer or producer, but it is "
fi
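The selector the script compares against arrives as its first positional parameter, `$1`, which the Dockerfile's CMD fills with `$MAIN_CLASS`. A runnable sketch of the same dispatch, with `java` replaced by `echo` so it works without the jars (jar paths copied from the script above):

```shell
#!/usr/bin/env bash
# Sketch of runDemo.sh's dispatch on "$1"; the java command is echoed
# instead of executed so this runs anywhere.
run_demo() {
  case "$1" in
    consumer) echo java -cp target/scala-2.11/demo_consumer.jar consumer.SparkConsumer ;;
    producer) echo java -cp target/scala-2.11/full_demo_producer.jar producer.KafkaCheckinsProducer ;;
    *)        echo "Wrong parameter. It should be consumer or producer, but it is '$1'" ;;
  esac
}
run_demo consumer
```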

Here is the build.sbt from which I build both jars by changing the main class name and the jar name:

name := "DemoBuildTest"
version := "0.1"
scalaVersion := "2.11.8"

assemblyJarName in assembly := "demo_producer.jar"
mainClass in assembly := Some("producer.KafkaCheckinsProducer")

val sparkVersion = "2.2.0"
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"


dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.9.5"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.5"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.5"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
  "com.typesafe" % "config" % "1.3.0",
  "org.neo4j.driver" % "neo4j-java-driver" % "1.5.1",
  "com.opencsv" % "opencsv" % "4.1",
  "com.databricks" %% "spark-csv" % "1.5.0",
  "com.github.tototoshi" %% "scala-csv" % "1.3.5",
  "org.elasticsearch" %% "elasticsearch-spark-20" % "6.2.4"
)

assemblyMergeStrategy in assembly := {
  case PathList("org","aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("org", "slf4j", xs @ _*) => MergeStrategy.last
  case PathList("org", "neo4j", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case PathList("net", "jpountz", xs @ _*) => MergeStrategy.last
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
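Since `mainClass in assembly` is hardcoded in this build.sbt and edited between builds, it is easy for a jar to end up assembled with the wrong (or no) entry point. One way to confirm what a given assembly actually contains is to inspect it after `sbt assembly` (jar path taken from the question; this is a sketch to run locally, not verified here):

```shell
# Does the manifest carry a Main-Class attribute? (what `java -jar` would use)
unzip -p target/scala-2.11/demo_consumer.jar META-INF/MANIFEST.MF | grep Main-Class
# Is the class file itself inside the jar? (what `java -cp ... consumer.SparkConsumer` needs)
unzip -l target/scala-2.11/demo_consumer.jar | grep 'consumer/SparkConsumer'
```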
  1. Check the main class of the jar.
  2. In the Dockerfile you declare MAIN_CLASS=consumer at build time. I think you need this environment variable to be dynamic at run time, so either remove it from the Dockerfile and pass the value when the container starts, or use --build-arg to build 2 Docker images: a consumer one and a producer one.
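The two options above can be sketched as follows (the image tags `demo-consumer` and `demo-producer` are made up for illustration):

```shell
# Option A: bake the class in at build time, one image per role
docker build --build-arg ARG_CLASS=consumer -t demo-consumer .
docker build --build-arg ARG_CLASS=producer -t demo-producer .
docker run -v myvolume:/workdir demo-consumer

# Option B: one image, class chosen at run time
# (drop ARG ARG_CLASS / ENV MAIN_CLASS from the Dockerfile and pass -e instead)
docker run -v myvolume:/workdir -e MAIN_CLASS=consumer built-image-name
```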