Asyncio stream process data with pipe between two subprocesses

Hello, I could not find a solution to this among the examples here. I found one, but it does not read the output as a stream.

I want to run two subprocesses (running in the background) independently of the Python program. The first subprocess feeds data to the second one through a pipe, and I want to do some processing on the stdout lines as a stream.

The example below blocks, but I don't know why:

import asyncio
import os

async def foo():
    read, write = os.pipe()
    process_1 = await asyncio.create_subprocess_exec('ls', stdout=write)
    process_2 = await asyncio.create_subprocess_exec('wc', stdin=read, stdout=asyncio.subprocess.PIPE)
    
    async for l in process_2.stdout:
        # streaming process data
        print(l)
    
    os.close(write)
    os.close(read)

await foo() # jupyter call
# asyncio.run(foo()) # python call

The code works for me if I move the close() calls, e.g. to the places shown in your link, but that is probably not what you expect.

import asyncio
import os

async def foo():
    read, write = os.pipe()
    
    process_1 = await asyncio.create_subprocess_exec('ls', stdout=write)
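    # Close the parent's copy of the write end right away; 'wc' only sees EOF
    # on its stdin once every write end of the pipe has been closed.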
    os.close(write)

    process_2 = await asyncio.create_subprocess_exec('wc', stdin=read, stdout=asyncio.subprocess.PIPE)
    os.close(read)
    
    async for line in process_2.stdout:
        # streaming process data
        print(line.decode())

#await foo() # jupyter call
asyncio.run(foo()) # python call

In the end I can close read later, but I have to close write before the for loop: as long as the parent still holds an open write end of the pipe, wc never sees EOF on its stdin, so reading from process_2.stdout would block forever.

import asyncio
import os

async def foo():
    read, write = os.pipe()
    
    process_1 = await asyncio.create_subprocess_exec('ls', stdout=write)
    process_2 = await asyncio.create_subprocess_exec('wc', stdin=read, stdout=asyncio.subprocess.PIPE)
    
    os.close(write)
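    # write must be closed before iterating so 'wc' can finish; read may stay open until after the loop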

    async for line in process_2.stdout:
        # streaming process data
        print(line.decode())

    os.close(read)

#await foo() # jupyter call
asyncio.run(foo()) # python call
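
As a side note (not part of the answer above), if the pipe does not need to be managed by hand, a shell pipeline gives the same streaming behaviour without any os.pipe() bookkeeping. A minimal sketch, assuming the same 'ls | wc' pipeline:

import asyncio

async def foo():
    # The shell sets up the pipe between 'ls' and 'wc' for us,
    # so there are no raw file descriptors to close in the parent.
    process = await asyncio.create_subprocess_shell(
        'ls | wc',
        stdout=asyncio.subprocess.PIPE,
    )

    async for line in process.stdout:
        # streaming process data
        print(line.decode())

    await process.wait()

asyncio.run(foo())

The trade-off is that the two commands are no longer separate Process objects, so they cannot be awaited or signalled individually.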