Multiprocessing pool with "apply_async" does nothing if executed from inside a function

I am trying to use the multiprocessing module, and more particularly the Pool.apply_async() function.

This code works fine:

import multiprocessing

def do():
    print("Foobar", flush=True)

with multiprocessing.Pool(1) as pool:
    for i in range(2):
        pool.apply_async(do)

    pool.close()
    pool.join()

"Foobar" 字符串被打印了两次。

However, if I put this code inside a function and then call that function, nothing happens. There is no error and no "Foobar"; the program just ends silently.

import multiprocessing

def test():

    def do():
        print("Foobar", flush=True)

    with multiprocessing.Pool(1) as pool:
        for i in range(5):
            pool.apply_async(do)

        pool.close()
        pool.join()

test()

Why? I am using Python 3.7.3 on Linux.

To retrieve the result of your computation, make the following change to your code:

import multiprocessing

def test():

    def do():
        print("Foobar", flush=True)

    with multiprocessing.Pool(1) as pool:
        for i in range(5):
            result = pool.apply_async(do)

            result.get()

        pool.close()
        pool.join()

test()

You will then see why "nothing happens":

Traceback (most recent call last):
  File "/tmp/test.py", line 17, in <module>
    test()
  File "/tmp/test.py", line 12, in test
    result.get()
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
    raise self._value
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 385, in _handle_tasks
    put(task)
  File "/usr/lib/python3.5/multiprocessing/connection.py", line 206, in send
    self._send_bytes(ForkingPickler.dumps(obj))
  File "/usr/lib/python3.5/multiprocessing/reduction.py", line 50, in dumps
    cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'test.<locals>.do'    

Python's multiprocessing.Pool relies on the pickle protocol to serialize the data it sends to other processes. The pickle protocol can only serialize top-level (module-level) functions; it cannot serialize nested functions, because they cannot be looked up by name when the worker process deserializes the task.
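You can reproduce the same failure with pickle alone, without a pool. A minimal sketch (the outer/nested pair here is a made-up example, not from your code):

```python
import pickle

def outer():
    def nested():
        pass
    return nested

# A top-level function pickles fine: pickle records it by qualified name,
# and the receiving process re-imports it from the module.
pickle.dumps(outer)

# A nested function only exists inside a call to outer(), so it cannot be
# looked up by name at module level, and pickling it fails.
try:
    pickle.dumps(outer())
except AttributeError as err:
    print(err)  # Can't pickle local object 'outer.<locals>.nested'
```

This is the same AttributeError that apply_async swallowed until result.get() re-raised it.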

See the pickle documentation for what can and cannot be pickled.
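Accordingly, moving do to module level makes it picklable and the pool works as expected. A minimal sketch (the return value and the list of AsyncResult handles are additions for illustration, so errors can no longer be dropped silently):

```python
import multiprocessing

# Defined at module level, so pickle can serialize it by its qualified
# name when the Pool sends the task to a worker process.
def do():
    print("Foobar", flush=True)
    return "Foobar"

def test():
    with multiprocessing.Pool(1) as pool:
        # Keep the AsyncResult handles and call get() on each one:
        # get() blocks until the task finishes and re-raises any
        # exception that occurred in the worker.
        results = [pool.apply_async(do) for _ in range(5)]
        return [r.get() for r in results]

if __name__ == "__main__":
    print(test())
```

The `if __name__ == "__main__":` guard also keeps the code working under the "spawn" start method, where worker processes re-import the main module.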