Database Exception in Slick 3.0 while batch insert
While inserting thousands of records every five seconds via batch insert in Slick 3, I get
org.postgresql.util.PSQLException: FATAL: sorry, too many clients already
My data access layer looks like:
val db: CustomPostgresDriver.backend.DatabaseDef =
  Database.forURL(url, user = user, password = password, driver = jdbcDriver)

override def insertBatch(rowList: List[T#TableElementType]): Future[Long] = {
  val res = db.run(insertBatchQuery(rowList)).map(_.head.toLong).recover {
    case ex: Throwable => RelationalRepositoryUtility.handleBatchOperationErrors(ex)
  }
  //db.close()
  res
}

override def insertBatchQuery(rowList: List[T#TableElementType]): FixedSqlAction[Option[Int], NoStream, Write] = {
  query ++= rowList
}
Closing the connection in insertBatch has no effect... it still gives the same error.
I call insertBatch from my code like this:
val temp1 = list1.flatMap { li =>
  Future.sequence(li.map { trip =>
    val data = for {
      tripData <- TripDataRepository.insertQuery(trip.tripData)
      subTripData <- SubTripDataRepository.insertBatchQuery(getUpdatedSubTripDataList(trip.subTripData, tripData.id))
    } yield ((tripData, subTripData))
    val res = db.run(data.transactionally)
    res
    //db.close()
  })
}
If I close the connection after the work here, as you can see in the commented code, I get this error:
java.util.concurrent.RejectedExecutionException: Task slick.backend.DatabaseComponent$DatabaseDef$$anon@6c3ae2b6 rejected from java.util.concurrent.ThreadPoolExecutor@79d2d4eb[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 1]
After calling the method without Future.sequence, like this:
val temp1 = list.map { trip =>
  val data = for {
    tripData <- TripDataRepository.insertQuery(trip.tripData)
    subTripData <- SubTripDataRepository.insertBatchQuery(getUpdatedSubTripDataList(trip.subTripData, tripData.id))
  } yield ((tripData, subTripData))
  val res = db.run(data.transactionally)
  res
}
I still get the too many clients error...
The root of this problem is that you are kicking off an unbounded list of Futures at the same time, each of which connects to the database - one for every entry in list.
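To see why Future.sequence does not help: mapping over the list already starts every Future (and therefore every db.run) immediately; Future.sequence only waits for them, it does not throttle them. A minimal standalone sketch of that behaviour, with a hypothetical work function standing in for db.run (no database involved; the names inFlight, work, parallel and serial are made up for illustration):

import java.util.concurrent.atomic.AtomicInteger
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

val inFlight = new AtomicInteger(0)

// Pretend this is db.run: record how many "queries" are running at once.
def work(i: Int): Future[Int] = Future {
  val running = inFlight.incrementAndGet()
  Thread.sleep(50)
  inFlight.decrementAndGet()
  running
}

// All 100 Futures start as soon as map is evaluated; sequence just collects them.
val parallel = Future.sequence((1 to 100).toList.map(work))

// foldLeft + flatMap starts each one only after the previous has finished.
val serial = (1 to 100).foldLeft(Future.successful(Seq.empty[Int])) { (acc, i) =>
  acc.flatMap(done => work(i).map(done :+ _))
}

println(Await.result(parallel, 1.minute).max) // typically well above 1 on a multi-core machine
println(Await.result(serial, 1.minute).max)   // always 1: strictly one at a time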
This can be solved by running the inserts serially, forcing each insert batch to depend on the previous one:
// Empty Future for the results. Replace Unit with the correct type - whatever
// "res" yields below.
val emptyFuture = Future.successful(Seq.empty[Unit])

// This will only insert one at a time. You could use list.grouped to batch the
// inserts if that was important.
val temp1 = list.foldLeft(emptyFuture) { (previousFuture, trip) =>
  previousFuture.flatMap { previous =>
    // Inner code copied from your example.
    val data = for {
      tripData <- TripDataRepository.insertQuery(trip.tripData)
      subTripData <- SubTripDataRepository.insertBatchQuery(getUpdatedSubTripDataList(trip.subTripData, tripData.id))
    } yield ((tripData, subTripData))
    val res = db.run(data.transactionally)
    // flatMap must return a Future, so append this result once it completes.
    res.map(previous :+ _)
  }
}
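If one transaction per trip is too slow, the same chain can be built over chunks of the list instead; list.grouped gives non-overlapping chunks (sliding would insert rows more than once). A rough sketch under the same assumptions as above (your db, repositories and helpers; the chunk size of 500 and the names emptyBatches/batched are made up for illustration, and DBIO.sequence is assumed to be in scope via the driver's api import):

// As before, replace Any with whatever one chunk of results actually is.
val emptyBatches = Future.successful(Seq.empty[Any])

val batched = list.grouped(500).foldLeft(emptyBatches) { (previousFuture, chunk) =>
  previousFuture.flatMap { previous =>
    // One action per trip in this chunk, composed into a single transaction,
    // so each chunk uses one connection and runs only after the previous chunk.
    val actions = chunk.map { trip =>
      for {
        tripData <- TripDataRepository.insertQuery(trip.tripData)
        subTripData <- SubTripDataRepository.insertBatchQuery(getUpdatedSubTripDataList(trip.subTripData, tripData.id))
      } yield ((tripData, subTripData))
    }
    db.run(DBIO.sequence(actions).transactionally).map(previous :+ _)
  }
}

Either way, only one connection is in use at any moment, so the pool can no longer be exhausted no matter how long list is.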