Multiprocessing in Python for training neural networks simultaneously

I have a genetic algorithm that searches for hyperparameters of my neural networks. It runs for 10 generations, and each generation produces 20 neural networks. Right now I train the networks of each generation one at a time, so it takes a very long time. Instead, I tried to use multiprocessing to train all 20 neural networks of a generation in parallel. However, when I do that, the information in my network objects is not updated. This is what I do to train the 20 neural networks one by one:

from tqdm import tqdm


def train_networks(networks, dataset):
    """Train each network.

    Args:
        networks (list): Current population of networks
        dataset (str): Dataset to use for training/evaluating
    """
    print('training each network')
    pbar = tqdm(total=len(networks))
    for network in networks:
        print('training network - inside loop')
        network.train(dataset)  # networks are trained sequentially here
        pbar.update(1)
    pbar.close()
    print('done training')

I want to use multiprocessing here. For the multiprocessing version I did the following:

import multiprocessing


def train_networks(networks, dataset):
    """Train each network.

    Args:
        networks (list): Current population of networks
        dataset (str): Dataset to use for training/evaluating
    """
    for network in networks:
        p = multiprocessing.Process(target=network.train, args=(dataset,))
        p.start()
        p.join()

But this does not work. How can I modify my code so that all 20 networks are trained in parallel? Any help would be appreciated.

The p.join() method blocks any further execution of the script until process p has finished. Because you call it inside the loop, each process is started and then immediately waited on, so the networks are still trained one after another. One way to fix this is to start all the processes first and only then join them:

processes = []
for network in networks:
    p = multiprocessing.Process(target=network.train, args=(dataset,))
    p.start()
    processes.append(p)

# Now you can wait for the networks to finish training before executing the
# rest of the script
for process in processes:
    process.join()
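
Also note that each multiprocessing.Process runs in its own memory space: network.train updates a copy of the network inside the child process, and the parent's network objects stay unchanged, which is why your network information does not get updated. A minimal sketch of one way to get the results back, assuming your network objects are picklable and using a hypothetical wrapper train_one that simply returns the trained network, is to use a multiprocessing.Pool:

import multiprocessing

def train_one(args):
    """Hypothetical wrapper: train a single network in a worker and return it."""
    network, dataset = args
    network.train(dataset)   # runs in the worker process
    return network           # the trained copy is pickled back to the parent

def train_networks(networks, dataset):
    """Train all networks in parallel and keep the trained copies."""
    with multiprocessing.Pool(processes=len(networks)) as pool:
        trained = pool.map(train_one, [(n, dataset) for n in networks])
    # Replace the originals with the trained copies returned by the workers
    networks[:] = trained

If your models do not pickle cleanly (Keras/TensorFlow models often don't), a common workaround is to pass only the hyperparameters to the worker, build and train the model there, and return just the evaluation score instead of the whole network.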

Here's a nice resource on multiprocessing