Saving a POJO with a big Map inside via a Spring Data repository leads to StackOverflowError
In general: I'm reading serialized objects (as JSON) from a Kafka stream and trying to save them to Redis via a Spring Data repository.
After a couple of calls to repository.save() (the object has not been saved to Redis yet), I get a StackOverflowError:
Exception in thread "processOffers-applicationId-1c24ef63-baae-47b9-beb7-5e6517736bc4-StreamThread-1" java.lang.StackOverflowError
at org.springframework.data.util.Lazy.get(Lazy.java:94)
at org.springframework.data.mapping.model.AnnotationBasedPersistentProperty.usePropertyAccess(AnnotationBasedPersistentProperty.java:277)
at org.springframework.data.mapping.model.BeanWrapper.getProperty(BeanWrapper.java:134)
at org.springframework.data.mapping.model.BeanWrapper.getProperty(BeanWrapper.java:115)
at org.springframework.data.redis.core.convert.MappingRedisConverter.lambda$writeInternal(MappingRedisConverter.java:601)
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:353)
at org.springframework.data.redis.core.convert.MappingRedisConverter.writeInternal(MappingRedisConverter.java:597)
at org.springframework.data.redis.core.convert.MappingRedisConverter.lambda$writeInternal(MappingRedisConverter.java:639)
The serialized POJO looks like this:
@Data
@With
@NoArgsConstructor
@AllArgsConstructor
@RedisHash("students")
public class Student {

    @Id
    @JsonProperty("student_id")
    private long id;

    @JsonProperty("entities")
    private Map<String, Object> entities = new HashMap<>();
}
The entities map contains 100+ entries with nested maps (objects) inside.
The interesting part: if I make the map empty, everything works fine and the data is saved to Redis instantly.
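A plausible reading of the trace above: MappingRedisConverter flattens nested structures into hash fields recursively, with one writeInternal frame per nesting level, so a large, deeply nested map can exhaust the stack. A purely illustrative sketch of the flattened hash layout (the field naming here is an assumption for illustration, not actual Spring Data Redis output):

```text
students:1
  _class                  -> com.example.Student
  id                      -> 1
  entities.grades.math    -> "A"
  entities.grades.physics -> "B"
  ...
```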
The repository corresponding to the POJO:
@Repository
public interface StudentRepository extends CrudRepository<Student, Long> {
}
Also, I have defined RedisCustomConversions for the Long id field:
@Component
@ReadingConverter
public class BytesToLongConverter implements Converter<byte[], Long> {

    @Override
    public Long convert(final byte[] source) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.put(source);
        buffer.flip();
        return buffer.getLong();
    }
}

@Component
@WritingConverter
public class LongToBytesConverter implements Converter<Long, byte[]> {

    @Override
    public byte[] convert(final Long source) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.putLong(source);
        return buffer.array();
    }
}
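As a standalone sanity check (plain Java, outside Spring), the two conversions above round-trip a long through an 8-byte big-endian encoding. Note that the reading side assumes the input is exactly Long.BYTES long; a shorter array would cause a BufferUnderflowException on getLong():

```java
import java.nio.ByteBuffer;

public class LongBytesRoundTrip {

    // mirrors LongToBytesConverter: 8-byte big-endian encoding (ByteBuffer default)
    static byte[] longToBytes(long source) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.putLong(source);
        return buffer.array();
    }

    // mirrors BytesToLongConverter: flip() limits the buffer to the bytes written,
    // and getLong() then requires all 8 of them to be present
    static long bytesToLong(byte[] source) {
        ByteBuffer buffer = ByteBuffer.allocate(Long.BYTES);
        buffer.put(source);
        buffer.flip();
        return buffer.getLong();
    }

    public static void main(String[] args) {
        long id = 123456789L;
        System.out.println(bytesToLong(longToBytes(id)) == id); // round-trips
    }
}
```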
The Redis configuration class looks like this:
@Configuration
@EnableRedisRepositories
public class RedisConfiguration {

    @Bean
    @Primary
    public RedisProperties redisProperties() {
        return new RedisProperties();
    }

    @Bean
    public RedisConnectionFactory redisConnectionFactory() {
        var config = new RedisStandaloneConfiguration();
        var props = redisProperties();
        config.setHostName(props.getHost());
        config.setPort(props.getPort());
        return new JedisConnectionFactory(config);
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        var template = new RedisTemplate<String, Object>();
        template.setConnectionFactory(redisConnectionFactory());
        template.setDefaultSerializer(new GenericJackson2JsonRedisSerializer());
        return template;
    }

    @Bean
    public RedisCustomConversions redisCustomConversions(LongToBytesConverter longToBytes,
                                                         BytesToLongConverter bytesToLong) {
        return new RedisCustomConversions(Arrays.asList(longToBytes, bytesToLong));
    }
}
UPD:
I found this issue in the Spring Data Redis Jira, but its resolution is set to "Fixed", which seems strange to me.
I solved the problem by defining a custom WritingConverter and ReadingConverter for the inner map in the POJO using GenericJackson2JsonRedisSerializer, and everything works fine!
Code:
@Component
@WritingConverter
public class FieldsToBytesConverter implements Converter<Map<String, Object>, byte[]> {

    private final RedisSerializer<Object> serializer;

    public FieldsToBytesConverter() {
        serializer = new GenericJackson2JsonRedisSerializer();
    }

    @Override
    public byte[] convert(Map<String, Object> value) {
        return serializer.serialize(value);
    }
}

@Component
@ReadingConverter
public class BytesToFieldsConverter implements Converter<byte[], Map<String, Object>> {

    private final GenericJackson2JsonRedisSerializer serializer;

    public BytesToFieldsConverter() {
        serializer = new GenericJackson2JsonRedisSerializer();
    }

    @Override
    @SuppressWarnings("unchecked")
    public Map<String, Object> convert(byte[] value) {
        return (Map<String, Object>) serializer.deserialize(value);
    }
}
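Note that the @ReadingConverter/@WritingConverter annotations by themselves do not make Spring Data Redis pick these converters up; they also need to be registered in the RedisCustomConversions bean. A sketch extending the bean from the configuration above (assuming the converters are injected as beans, as in the original config):

```java
@Bean
public RedisCustomConversions redisCustomConversions(LongToBytesConverter longToBytes,
                                                     BytesToLongConverter bytesToLong,
                                                     FieldsToBytesConverter fieldsToBytes,
                                                     BytesToFieldsConverter bytesToFields) {
    // With the map converters registered, the mapping layer serializes the whole
    // entities map as one JSON blob instead of recursively flattening each entry.
    return new RedisCustomConversions(
            Arrays.asList(longToBytes, bytesToLong, fieldsToBytes, bytesToFields));
}
```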