Why doesn't a tensorflow model's memory clear from RAM?

I want to create a population of different models and update that population every generation, which means I have to build a new list of models each generation. But every time I clear the model list, the memory is not released: it lingers and accumulates generation after generation until all RAM is exhausted.

Here is my simplified code:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM

class population:
    def __init__(self):
        self.pop = []

    def createModel(self, inputShape):
        # input_shape is only needed on the first layer; later layers
        # infer their input shape from the previous layer's output.
        model = Sequential([
                LSTM(64, input_shape=inputShape, return_sequences=True),
                LSTM(64),
                Dense(32, activation="relu"),
                Dense(2, activation="softmax"),
        ])
        return model

    def createList(self, numModels):
        model_list = []
        for _ in range(numModels):
            model = self.createModel((5,5))
            model_list.append(model)
        
        return model_list

    def updateList(self, iterations):
        for i in range(iterations):
            print(f"Generation: {i+1}")
            self.pop.clear()
            self.pop = self.createList(50)

pop = population()
pop.updateList(10)

You may want to use tf.keras.backend.clear_session:

Keras manages a global state, which it uses to implement the Functional model-building API and to uniquify autogenerated layer names.

If you are creating many models in a loop, this global state will consume an increasing amount of memory over time, and you may want to clear it. Calling clear_session() releases the global state: this helps avoid clutter from old models and layers, especially when memory is limited.
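Applied to the loop in the question, that could look like the sketch below (assuming TensorFlow 2.x; the `run_generations` helper and the explicit `gc.collect()` call are illustrative additions, not part of the original code — `gc.collect()` just encourages Python to reclaim the reference cycles that Keras objects tend to form):

```python
import gc

from tensorflow.keras import backend as K
from tensorflow.keras.layers import Dense, LSTM
from tensorflow.keras.models import Sequential


def create_model(input_shape):
    # Same architecture as in the question.
    return Sequential([
        LSTM(64, input_shape=input_shape, return_sequences=True),
        LSTM(64),
        Dense(32, activation="relu"),
        Dense(2, activation="softmax"),
    ])


def run_generations(iterations, num_models, input_shape=(5, 5)):
    pop = []
    for i in range(iterations):
        print(f"Generation: {i + 1}")
        # Drop the Python references to last generation's models...
        pop.clear()
        # ...reset Keras' global graph state so old layers/models
        # are no longer kept alive by the backend...
        K.clear_session()
        # ...and nudge the garbage collector to free the cycles.
        gc.collect()
        pop = [create_model(input_shape) for _ in range(num_models)]
    return pop
```

The key point is that clearing your own list is not enough, because Keras' global state still holds on to the old models; calling clear_session() before building the next generation releases that state.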

You may also want to read this answer.