Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext
Hi everyone, the class StreamingContext does not seem to be found in the code below.
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.{SparkConf, SparkContext}

object Exemple {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("Exemple")
    val sc = new SparkContext(conf)
    val ssc = new StreamingContext(sc, Seconds(2)) // this line throws the error
  }
}
Here is the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext
at Exemple$.main(Exemple.scala:16)
at Exemple.main(Exemple.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StreamingContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
Process finished with exit code 1
I am using the following build.sbt file:
name := "exemple"
version := "1.0.0"
scalaVersion := "2.11.11"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
When I run the Exemple class with the IntelliJ Run button, I get the error. In the sbt shell it works fine. In my module's dependencies I can find the Spark dependencies, the code compiles in IntelliJ, and I can see the Spark dependencies under External Libraries (in the left project panel).
Do you have any idea? It does not look complicated.
Please remove provided from the spark-streaming dependency:
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0"
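With that change applied, the whole build.sbt would look roughly like this (a sketch based only on the versions already in the question; spark-sql and the Kafka connector are unchanged):

```scala
name := "exemple"

version := "1.0.0"

scalaVersion := "2.11.11"

// All three Spark artifacts in the default Compile scope, so they end up
// on the runtime classpath when IntelliJ launches the main class directly.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"                  % "2.2.0",
  "org.apache.spark" %% "spark-streaming"            % "2.2.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
)
```

The provided scope is meant for deployments where spark-submit supplies the Spark jars at run time; IntelliJ's Run button does not, which is why the class compiles but is missing at run time.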
If there is still a dependency problem after the change, exclude the duplicate jars:
"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0" excludeAll(
ExclusionRule(organization = "org.spark-project.spark", name = "unused"),
ExclusionRule(organization = "org.apache.spark", name = "spark-streaming"),
ExclusionRule(organization = "org.apache.hadoop")
),
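To confirm what the JVM actually sees at run time, a small stdlib-only check can try to load the class named in the stack trace and print where it was loaded from (the object name ClasspathCheck is just for illustration):

```scala
object ClasspathCheck {
  /** Returns Some(location) if `name` is loadable on the current
    * classpath, None if it is missing. Bootstrap/JDK classes have no
    * code source, so they report "(built-in)". */
  def locate(name: String): Option[String] =
    try {
      val cls = Class.forName(name)
      Some(Option(cls.getProtectionDomain.getCodeSource)
        .map(_.getLocation.toString)
        .getOrElse("(built-in)"))
    } catch {
      case _: ClassNotFoundException => None
    }

  def main(args: Array[String]): Unit = {
    val name = "org.apache.spark.streaming.StreamingContext"
    locate(name) match {
      case Some(loc) => println(s"$name loaded from $loc")
      case None      => println(s"$name is NOT on the runtime classpath")
    }
  }
}
```

Running this from the same IntelliJ run configuration as the failing program tells you immediately whether the fix reached the runtime classpath, and from which jar the class resolves.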
Hope this helps.
Thanks,
Ravi