overloaded method error using spark-csv

I'm using the Databricks spark-csv package (via its Scala API) and I'm having trouble defining a custom schema.

After launching the console with
spark-shell  --packages com.databricks:spark-csv_2.11:1.2.0

I import the types I need

import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

and then simply try to define this schema:

val customSchema = StructType(
    StructField("user_id", IntegerType, true),
    StructField("item_id", IntegerType, true),
    StructField("artist_id", IntegerType, true),
    StructField("scrobble_time", StringType, true))

But I get the following error:

<console>:26: error: overloaded method value apply with alternatives:
  (fields: Array[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType <and>
  (fields: java.util.List[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType <and>
  (fields: Seq[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType
 cannot be applied to (org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField)
       val customSchema = StructType(

I'm new to Scala, so I'm having trouble parsing this error. What am I doing wrong here? I was following the very simple example here.

You need to pass the StructFields as a Seq.

Something like any of the following works:

val customSchema = StructType(Seq(
  StructField("user_id", IntegerType, true),
  StructField("item_id", IntegerType, true),
  StructField("artist_id", IntegerType, true),
  StructField("scrobble_time", StringType, true)))

val customSchema = (new StructType)
  .add("user_id", IntegerType, true)
  .add("item_id", IntegerType, true)
  .add("artist_id", IntegerType, true)
  .add("scrobble_time", StringType, true)

val customSchema = StructType(
  StructField("user_id", IntegerType, true) ::
  StructField("item_id", IntegerType, true) ::
  StructField("artist_id", IntegerType, true) ::
  StructField("scrobble_time", StringType, true) :: Nil)

I'm not sure why the README doesn't show it that way, but if you look at the StructType documentation it becomes clear.
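
For completeness, here is a minimal sketch of how such a schema is typically passed to the spark-csv reader from the spark-shell (Spark 1.x style, so sqlContext is already available); the file path and the header option below are placeholder values, not taken from the question:

import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

// Build the schema as a Seq of StructFields (same as the first variant above)
val customSchema = StructType(Seq(
  StructField("user_id", IntegerType, true),
  StructField("item_id", IntegerType, true),
  StructField("artist_id", IntegerType, true),
  StructField("scrobble_time", StringType, true)))

// Apply the schema when loading a CSV through the spark-csv data source;
// "scrobbles.csv" is a placeholder path and the file is assumed to have no header row
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "false")
  .schema(customSchema)
  .load("scrobbles.csv")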