SCALA and Elastic Search: Add symbol to classpath (Databricks)
I'm getting an error that I don't know how to fix, and I can't find any good documentation on this SchemaRDD type or how to use it.

My build.sbt contains:
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
libraryDependencies += "org.scalaj" %% "scalaj-http" % "2.4.1"
libraryDependencies += "io.spray" %% "spray-json" % "1.3.5"
libraryDependencies += "com.amazonaws" % "aws-java-sdk-core" % "1.11.534"
libraryDependencies += "com.amazonaws" % "aws-encryption-sdk-java" % "1.3.6"
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.11.550"
libraryDependencies += "com.typesafe" % "config" % "1.3.4"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark-1.2" % "2.4.4"
The error:
Symbol 'type org.apache.spark.sql.SchemaRDD' is missing from the classpath.
[error] This symbol is required by 'value org.elasticsearch.spark.sql.package.rdd'.
[error] Make sure that type SchemaRDD is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'package.class' was compiled against an incompatible version of org.apache.spark.sql.
Thanks a lot for your help! :)
The dependency elasticsearch-spark-1.2 is meant for Spark 1.x; switch to elasticsearch-spark-20 instead. The latest version is built for Spark 2.3:
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark-20" % "7.1.1"
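With the matching connector on the classpath, the `org.elasticsearch.spark.sql._` import adds a `saveToEs` method to DataFrames. A minimal sketch of writing to Elasticsearch (the node address, port, and index name `"demo"` are placeholder assumptions, not from the question):

```scala
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._ // provides the saveToEs implicit on DataFrames

object EsWriteExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("es-write-example")
      .master("local[*]")
      .config("es.nodes", "localhost") // placeholder Elasticsearch host
      .config("es.port", "9200")       // placeholder Elasticsearch port
      .getOrCreate()

    import spark.implicits._

    // Toy DataFrame to index
    val df = Seq(("1", "hello"), ("2", "world")).toDF("id", "text")

    // Write the DataFrame to the (hypothetical) index "demo"
    df.saveToEs("demo")

    spark.stop()
  }
}
```

Note that the `%%` in the dependency line appends your `scalaVersion` suffix, so with `scalaVersion := "2.11.12"` sbt resolves the `elasticsearch-spark-20_2.11` artifact, which matches Spark 2.4 built for Scala 2.11.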