
python multiprocessing; one processor for all values of each dictionary key

Generally, using a Pool with starmap, if we have:

if __name__ == '__main__':
    with multiprocessing.Pool() as p:
        temp_arr = p.starmap(process, tuple_list)

with tuple_list = [(1,2), (3, 4)], for example, this results in process(1,2) and process(3,4), each dispatched to a different processor.
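For reference, a minimal runnable version of this pattern, with a hypothetical process() that just adds its two arguments:

```python
import multiprocessing

def process(a, b):
    # placeholder worker: combine the two tuple elements
    return a + b

if __name__ == '__main__':
    tuple_list = [(1, 2), (3, 4)]
    with multiprocessing.Pool() as p:
        # starmap unpacks each tuple into process's positional arguments
        temp_arr = p.starmap(process, tuple_list)
    print(temp_arr)  # [3, 7]
```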

If I have:

dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}:

Is there a way to use Pool so that all the values for key '0' go to the first processor at once (as the list of tuples [(1,1), (2,3)], for example, so that I can later handle each tuple individually inside process()), the values for key '1' go to the second processor, and so on?

Thanks in advance.

Try this:

import multiprocessing as mp
import time

# note: naming this `dict` shadows the built-in; kept here to match the question
dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}

def process(tup):
    print(f"input tuple: {tup} -- worker_id: {mp.current_process()}\n")
    time.sleep(2)

def process_all(index):
    # handle every tuple stored under one key inside the same worker
    for tup in dict[index]:
        process(tup)

if __name__ == '__main__':
    with mp.Pool() as p:
        # map() passes each key as a single argument to process_all
        temp_arr = p.map(process_all, dict.keys())

# Result
#input tuple: (1, 1) -- worker_id: <ForkProcess(ForkPoolWorker-121, started daemon)>
#input tuple: (2, 4) -- worker_id: <ForkProcess(ForkPoolWorker-123, started daemon)>
#input tuple: (4, 4) -- worker_id: <ForkProcess(ForkPoolWorker-122, started daemon)>
#input tuple: (3, 5) -- worker_id: <ForkProcess(ForkPoolWorker-123, started daemon)>
#input tuple: (2, 3) -- worker_id: <ForkProcess(ForkPoolWorker-121, started daemon)>

Is this what you wanted?
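Note that the answer above has the workers read the tuples from the module-level dict, which relies on the fork start method inheriting globals (the default on Linux, but not on Windows or macOS). A more portable sketch passes each key's tuple list to the worker directly, with a hypothetical per-tuple process():

```python
import multiprocessing as mp

data = {'0': [(1, 1), (2, 3)], '1': [(4, 4)], '2': [(2, 4), (3, 5)]}

def process(tup):
    # placeholder per-tuple work
    return tup[0] + tup[1]

def process_group(tuples):
    # handle every tuple of one key inside a single worker
    return [process(t) for t in tuples]

if __name__ == '__main__':
    with mp.Pool() as p:
        # each worker receives one key's whole tuple list
        results = p.map(process_group, data.values())
    print(results)  # [[2, 5], [8], [6, 8]]
```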

You can use map() together with dict.values():
import multiprocessing as mp

dict = {
     '0': [(1,1), (2,3)], 
     '1': [(4,4)], 
     '2': [(2,4), (3,5)]
}

def process(data):
    print(f"process data: {data}")
    #return result

if __name__ == '__main__':
    with mp.Pool() as p:
        all_results = p.map(process, dict.values())

Result:

process data: [(1, 1), (2, 3)]
process data: [(4, 4)]
process data: [(2, 4), (3, 5)]
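If process() returns something, map() yields the results in the same order as dict.values(), so you can zip them back onto the keys afterwards. A sketch, assuming a hypothetical process() that sums each tuple in its list:

```python
import multiprocessing as mp

data = {'0': [(1, 1), (2, 3)], '1': [(4, 4)], '2': [(2, 4), (3, 5)]}

def process(tuples):
    # hypothetical worker: sum each tuple in one key's list
    return [a + b for a, b in tuples]

if __name__ == '__main__':
    with mp.Pool() as p:
        results = p.map(process, data.values())
    # dicts preserve insertion order, so results align with the keys
    by_key = dict(zip(data.keys(), results))
    print(by_key)  # {'0': [2, 5], '1': [8], '2': [6, 8]}
```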