asyncio tasks using aiohttp.ClientSession
I'm using Python 3.7 and trying to build a crawler that can hit multiple domains asynchronously. I'm using asyncio and aiohttp for this, but I'm running into a problem with aiohttp.ClientSession. Here is my simplified code:
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        print(await response.text())

async def main():
    loop = asyncio.get_event_loop()
    async with aiohttp.ClientSession(loop=loop) as session:
        cwlist = [loop.create_task(fetch(session, url)) for url in ['http://python.org', 'http://google.com']]
        asyncio.gather(*cwlist)

if __name__ == "__main__":
    asyncio.run(main())
The exception thrown looks like this:
_GatheringFuture exception was never retrieved
future: <_GatheringFuture finished exception=RuntimeError('Session is closed')>
What am I doing wrong?
You forgot to await the result of asyncio.gather:
    async with aiohttp.ClientSession(loop=loop) as session:
        cwlist = [loop.create_task(fetch(session, url)) for url in ['http://python.org', 'http://google.com']]
        await asyncio.gather(*cwlist)
Without the await, main() leaves the async with block immediately, the session is closed, and the still-pending tasks then fail with "Session is closed". In general, you should be very suspicious whenever an async with block does not contain any await expression.
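For reference, a minimal complete version of the corrected script might look like the sketch below. It is not the only way to write it: here the coroutines are passed to asyncio.gather directly instead of going through loop.create_task, and ClientSession is created without the loop= argument, since it can pick up the running event loop on its own. Printing the page length instead of the full body is just to keep the output short.

import asyncio
import aiohttp

async def fetch(session, url):
    # Reuse the shared session for each request and return the body text.
    async with session.get(url) as response:
        return await response.text()

async def main():
    # The session is only usable inside this block, so everything that
    # uses it must be awaited before the block exits.
    async with aiohttp.ClientSession() as session:
        urls = ['http://python.org', 'http://google.com']
        # gather schedules the fetches concurrently and waits for all of
        # them; without the await, main() would exit, close the session,
        # and the tasks would fail with "Session is closed".
        pages = await asyncio.gather(*(fetch(session, url) for url in urls))
        for url, page in zip(urls, pages):
            print(url, len(page))

if __name__ == "__main__":
    asyncio.run(main())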