Is this a memory leak on iOS / Android?

https://i.stack.imgur.com/HBPG7.png

Every time I train the model, `numTensors` keeps increasing, and I'd like to know whether that's a problem. When I close the app, load the model again, and train a new one, the count keeps growing. Is it bad to have that many tensors, and how do I fix it?

I'm using knn-classifier to add examples:

    const collectData = async (className) => {
      console.log(`[+] Class ${className} selected`);
      setStatus(statusList[1]);
      setIsLoading(true);
      try {
        if (this.camera) {
          let photo = await this.camera.takePictureAsync({
            skipProcessing: true,
          });
          // 2. resize the image to width: 224, height: 224
          let image = await resizeImage(photo.uri, 224, 224);
          let imageTensor = base64ImageToTensor(image.base64);
          console.log(imageTensor + " imageTensor");
          // 3. get embeddings from mobilenet
          let embeddings = await mobilenetModel.infer(imageTensor, true);
          console.log(embeddings + " embeddings");
          // 4. train the knn classifier
          knnClassifierModel.addExample(embeddings, className);
          let tempCountExamples = countExamples + 1;
          let tempCountClassExamples = countClassExamples;
          tempCountClassExamples[className] = tempCountClassExamples[className] + 1;
          setCountExamples(tempCountExamples);
          setCountClassExamples(tempCountClassExamples);
          console.log("[+] Class Added");
        }
      } catch {
        console.log("[-] No Camera");
      }
      setIsLoading(false);
    };

So I updated the code with tf.tidy and tf.engine. The collectData function:

    collectData = async () => {
      if (this.camera && this.state.label != "" && this.state.label != null) {
        try {
          tf.engine().startScope();
          let photo = await this.camera.takePictureAsync({
            skipProcessing: true,
          });
          // 2. resize the image to width: 224, height: 224
          const image = await this.resizeImage(photo.uri, 224, 224);
          let imageTensor = this.base64ImageToTensor(image.base64);
          // 3. get embeddings from mobilenet
          console.log("=========== before dispose ===========\n" + JSON.stringify(tf.memory()));
          let embeddings = await this.model.infer(imageTensor, true);
          tf.dispose(imageTensor);
          // 4. train the knn classifier
          this.knnClass.addExample(embeddings, this.state.label);
          let dataset = this.knnClass.getClassifierDataset();
          let stringDataset = JSON.stringify(
            Object.entries(dataset).map(([label, data]) => [label, Array.from(data.dataSync()), data.shape])
          );
          tf.engine().endScope();
          console.log("=========== after dispose ===========\n" + JSON.stringify(tf.memory()));
        } catch (err) {
          console.log("error " + err);
        }
      } else {
        this.setState({ modalVisible: true });
      }
    };

And the function that is called when taking a picture / training the model:

    takePicture = async function () {
      console.log("snap hit!");
      await tf.tidy(() => { this.collectData(); return undefined; });
    };

Yes, you have to dispose of tensors after you are done making predictions with them; otherwise they accumulate and cause a memory leak. You can use tf.tidy() for that.

The tensors are being disposed, but the value produced by collectData() is being returned instead. You can pass the function as an argument to tf.tidy() without returning any tensors:

tf.tidy(() => { collectData(); return undefined; })

您还可以使用它来清理异步代码中任何未使用的张量:

tf.engine().startScope()
// do your thing
tf.engine().endScope()