Is there any better way other than iterating through cursor of MongoDB?
I execute the following business logic in my code:
List<Order> orders = new ArrayList<>();
Cursor cursor = db.getCollection(CollectionInfo.ORDER).aggregate(pipelines, options);
while (cursor.hasNext()) {
    orders.add(new Order(cursor.next().toMap()));
}
return orders;
So, to build the objects, I iterate over every document. When the aggregation returns many documents, this iteration takes a long time. Is there a better way to do this, with Spring Data MongoDB or anything else?
Use mongoTemplate.aggregate(...).getMappedResults():
@Autowired
protected MongoTemplate mongoTemplate;
...
List<AggregationOperation> pipeline = ...;
Aggregation agg = Aggregation.newAggregation(pipeline)
        .withOptions(...);
List<Order> orders = mongoTemplate.aggregate(agg, CollectionInfo.ORDER, Order.class)
        .getMappedResults();
Note: if your query returns more than 1K documents, consider paging with $skip / $limit.
Pseudocode:
skip = 0
limit = 1_000
size = 0
do {
    // Build each page from the base pipeline plus a fresh $skip / $limit pair
    // (do not keep appending these stages to the same pipeline on every iteration)
    page = basePipeline + [Aggregation.skip(skip), Aggregation.limit(limit)]
    result = mongoTemplate.aggregate(page, ...).getMappedResults()
    size = result.size()
    skip += limit
    // A full page of 1_000 results suggests there are more documents, keep iterating
} while (size == limit)
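The paging loop above can be sketched as plain Java. This is a minimal sketch, not Spring Data API: the hypothetical fetchPage function stands in for a call to mongoTemplate.aggregate(...) with Aggregation.skip(skip) and Aggregation.limit(limit) appended to a copy of the base pipeline, so the loop itself can be exercised without a database.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class PagedAggregation {

    // fetchPage(skip, limit) is assumed to run the aggregation for one page
    // and return its mapped results; here it is abstracted so the pagination
    // logic is self-contained.
    static <T> List<T> fetchAll(BiFunction<Long, Long, List<T>> fetchPage, long limit) {
        List<T> all = new ArrayList<>();
        long skip = 0;
        List<T> page;
        do {
            // One $skip / $limit page; a fresh skip value each iteration
            page = fetchPage.apply(skip, limit);
            all.addAll(page);
            skip += limit;
            // A full page suggests more documents may remain
        } while (page.size() == limit);
        return all;
    }
}
```

With a real MongoTemplate, fetchPage would rebuild the Aggregation from the base stages plus the new skip/limit pair on every call, matching the warning in the pseudocode about not appending stages repeatedly.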