Jackson databind JSON serialization is writing Some(99): Option[Int] as {"empty":true,"defined":false}
I am using Jackson in my Scala/Spark program, and I have distilled my issue down to the simple example below. My problem is that when my case class's Option[Int] field (age) is set to None, I see reasonable-looking serialization output (i.e., the structure with empty=true). However, when age is defined, i.e., set to some Int like Some(99), I never see the Int value in the serialized output.
Given:
import com.fasterxml.jackson.databind.ObjectMapper
import java.io.ByteArrayOutputStream
import scala.beans.BeanProperty

case class Dog(@BeanProperty name: String, @BeanProperty age: Option[Integer])

object OtherTest extends App {

  jsonOut(Dog("rex", None))
  jsonOut(Dog("mex", Some(99)))

  private def jsonOut(dog: Dog) = {
    val mapper = new ObjectMapper()
    val stream = new ByteArrayOutputStream()
    mapper.writeValue(stream, dog)
    System.out.println("result:" + stream.toString())
  }
}
My output is shown below. Any hints/help greatly appreciated!
result:{"name":"rex","age":{"empty":true,"defined":false}}
result:{"name":"mex","age":{"empty":false,"defined":true}}
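What seems to be happening (my reading, not stated in the answer below): without the Scala module, Jackson introspects `Option` as an ordinary Java bean and serializes its public accessors, `isEmpty` and `isDefined`, which is exactly the `{"empty":...,"defined":...}` structure above. A quick sketch of those accessors:

```scala
// Option exposes bean-style accessors; plain Jackson bean
// introspection picks these up instead of the wrapped value.
object OptionAccessors extends App {
  val age: Option[Int] = Some(99)
  println(age.isEmpty)   // false
  println(age.isDefined) // true
}
```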
Update after the helpful answer
Here are the dependencies that worked for me:
implementation 'org.scala-lang:scala-library:2.12.2'
implementation "org.apache.spark:spark-sql_2.12:3.1.2"
implementation "org.apache.spark:spark-sql-kafka-0-10_2.12:3.1.2"
implementation "org.apache.spark:spark-avro_2.12:3.1.2"
implementation 'com.fasterxml.jackson.module:jackson-module-scala_2.12:2.10.0'
And here is the updated code (with a frequent-flyer bonus: a round-trip example):
private def jsonOut(dog: Dog) = {
  val mapper = new ObjectMapper()
  mapper.registerModule(DefaultScalaModule)
  val stream = new ByteArrayOutputStream()
  mapper.writeValue(stream, dog)
  val serialized = stream.toString()
  System.out.println("result:" + serialized)
  // verify we can read the serialized thing back to the case class:
  val recovered = mapper.readValue(serialized, classOf[Dog])
  System.out.println("here is what we read back:" + recovered)
}
Here is the resulting output (as now expected ;^) ->
> Task :OtherTest.main()
result:{"name":"rex","age":null}
here is what we read back:Dog(rex,None)
result:{"name":"mex","age":99}
here is what we read back:Dog(mex,Some(99))
You need to add the Jackson module for Scala to make Jackson work with standard Scala data types.
- Add this module as a dependency: https://github.com/FasterXML/jackson-module-scala
- Follow its README to see how to initialize your ObjectMapper with this module.
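As an alternative to calling `registerModule` after construction, the module can be registered up front via the builder API (a minimal sketch, assuming jackson-module-scala 2.10+ on the classpath, where `JsonMapper.builder()` is available):

```scala
import com.fasterxml.jackson.databind.json.JsonMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Build a mapper with the Scala module registered at construction time,
// so Option, Seq, Map, and case classes serialize naturally.
val mapper = JsonMapper.builder()
  .addModule(DefaultScalaModule)
  .build()
```

This keeps the mapper fully configured and effectively immutable from the moment it is created, which is handy when the mapper is shared across threads.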