ExceptionInInitializerError Spark Streaming Kafka

I am trying to connect Spark Streaming to Kafka in a simple application, which I built from the example in the Spark documentation. When I try to run it, I get the following exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:59)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
    at producer.KafkaProducer$.main(KafkaProducer.scala:36)
    at producer.KafkaProducer.main(KafkaProducer.scala)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)

Here is my code:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaProducer {

  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder()
      .appName("KafkaSparkStreaming")
      .master("local[*]")
      .getOrCreate()

    // Streaming context with a 3-second batch interval
    val ssc = new StreamingContext(spark.sparkContext, Seconds(3))
    val topics = Array("topic1", "topic2")

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "1",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val lines = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](topics, kafkaParams)
    )
    // A streaming job needs at least one output operation, so print the keys
    lines.map(_.key()).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

I am not sure whether the problem lies in the configuration or in the code itself. My build.sbt file looks like this:

scalaVersion := "2.11.4"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)

I would appreciate any help, as I have no idea what is wrong!

Following the stack trace of the exception you hit, we can identify the main problem:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4

In fact:

Spark contains com.fasterxml.jackson.core as a transitive dependency, so we do not need to include it in libraryDependencies.

Here the conflict comes in through the kafka 1.1.0 artifact, which transitively pulls in jackson-databind 2.9.4, while the jackson-module-scala that ships with Spark 2.3.0 expects Jackson 2.6.x and refuses to register against the newer version.
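One way to resolve the clash in sbt is to force every module in the build onto the Jackson version Spark expects. Below is a minimal sketch, assuming Spark 2.3.0 (which is built against Jackson 2.6.7); the exact version to pin should match your Spark release:

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.6.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7"
dependencyOverrides += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7"

sbt's built-in evicted task can be used to confirm which Jackson versions are competing in the build.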

A more detailed description of a similar problem and its solution can be found in a related question.
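As an alternative to pinning the version, a sketch of the same fix by exclusion: strip the conflicting Jackson modules from the kafka dependency so that only the Jackson brought in by Spark remains on the classpath (assuming nothing else in the build pulls in Jackson 2.9.x):

libraryDependencies += "org.apache.kafka" %% "kafka" % "1.1.0" excludeAll (
  ExclusionRule(organization = "com.fasterxml.jackson.core"),
  ExclusionRule(organization = "com.fasterxml.jackson.module")
)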