sbt error: object spark is not a member of package org.apache
I installed sbt-1.3.4.msi, and when I try to build the sample SparkPi.scala application I get the following error:
C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error]                   ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error]     val conf = new SparkConf().setAppName("Spark Pi")
[error]                    ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error]     val spark = new SparkContext(conf)
[error]                     ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
The SparkPi.scala file is in C:\myapps\sbt\sparksample\project\src\main\scala (as the error messages above show). What am I missing here?
The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
The C:\myapps\sbt\sparksample\project\src\main\scala directory has the SparkPi.scala file
That's exactly the problem. Your Scala files are under the project directory, which belongs to sbt itself, not to your sbt-managed Scala project. Move SparkPi.scala and any other Scala files to C:\myapps\sbt\sparksample\src\main\scala.
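For reference, the standard sbt layout for this project would look like this (the project directory holds only sbt's own build definition, never application sources):

sparksample/
  sparksample.sbt
  project/            <- sbt metadata and plugins only
  src/
    main/
      scala/
        SparkPi.scala <- application sources go here

And for completeness, here is a minimal sketch of what SparkPi.scala might look like. The import and the two flagged lines come straight from the error output above; the Monte Carlo body is an assumption, modeled on Spark's standard Pi example:

import org.apache.spark._

import scala.math.random

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Pi")
    val spark = new SparkContext(conf)
    // Estimate Pi by sampling random points in the unit square
    // (assumed body; only the two lines above appear in the question).
    val n = 100000
    val count = spark.parallelize(1 to n).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / n}")
    spark.stop()
  }
}

Once the file is moved under src/main/scala, running sbt compile from C:\myapps\sbt\sparksample should pick it up and resolve the spark-core dependency declared in sparksample.sbt.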