sbt unresolved dependency for Spark Streaming Kafka integration
I want to use the Kafka integration for Spark Streaming. I am using Spark version 2.0.0.
But I get an unresolved dependency error ("unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found").
How can I access this package? Or what am I doing wrong / missing?
My build.sbt file:
name := "Spark Streaming"

version := "0.1"

scalaVersion := "2.11.11"

val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)

libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview"
Thanks for your help.
The spark-sql-kafka-0-10 artifact is the Kafka source for Structured Streaming, and it was never published for Spark 2.0.0; the earliest version on Maven Central is 2.0.2, which is why sbt cannot resolve it. For Spark 2.0.0, use the DStream-based integration, spark-streaming-kafka-0-10, instead. Also drop the separate "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview" line: spark-streaming 2.0.0 is already declared in your Seq, and mixing it with a preview build invites version conflicts. If you specifically want Structured Streaming's spark-sql-kafka-0-10, bump sparkVersion to at least 2.0.2.

https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10_2.11
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.0.0"