Queue.join() doesn't unblock
I'm trying to write a Python script that crawls websites in parallel. I put together a prototype that lets me crawl to a certain depth, but join() doesn't seem to work and I can't figure out why.
Here is my code:
from threading import Thread
import Queue
import urllib2
import re
from BeautifulSoup import *
from urlparse import urljoin

def doWork():
    while True:
        try:
            myUrl = q_start.get(False)
        except:
            continue
        try:
            c = urllib2.urlopen(myUrl)
        except:
            continue
        soup = BeautifulSoup(c.read())
        links = soup('a')
        for link in links:
            if 'href' in dict(link.attrs):
                url = urljoin(myUrl, link['href'])
                if url.find("'") != -1: continue
                url = url.split('#')[0]
                if url[0:4] == 'http':
                    print url
                    q_new.put(url)

q_start = Queue.Queue()
q_new = Queue.Queue()

for i in range(20):
    t = Thread(target=doWork)
    t.daemon = True
    t.start()

q_start.put("http://google.com")
print "loading"
q_start.join()
print "end"
join() will block until task_done() has been called as many times as items have been enqueued. You never call task_done(), so join() blocks forever. In the code you've shown, the right place to call it is at the end of the doWork loop:
def doWork():
    while True:
        task = start_q.get(False)
        ...
        for subtask in processed(task):
            ...
        start_q.task_done()  # tell the producer we completed a task
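For reference, here is roughly how that fix could look when folded back into the doWork from the question (a sketch only, not tested; it keeps the Python 2 imports and the q_start/q_new names from your code). Putting task_done() in a finally block ensures that every successful get() is balanced even when the fetch or parse fails; otherwise the queue's unfinished-task counter never reaches zero and join() still blocks:

def doWork():
    while True:
        try:
            myUrl = q_start.get(False)   # non-blocking; raises Queue.Empty when nothing is queued
        except Queue.Empty:
            continue
        try:                             # everything after a successful get() must end in task_done()
            try:
                c = urllib2.urlopen(myUrl)
            except Exception:
                continue                 # the finally clause below still runs before the next iteration
            soup = BeautifulSoup(c.read())
            for link in soup('a'):
                if 'href' in dict(link.attrs):
                    url = urljoin(myUrl, link['href'])
                    if "'" in url:
                        continue
                    url = url.split('#')[0]
                    if url[0:4] == 'http':
                        print url
                        q_new.put(url)
        finally:
            q_start.task_done()          # balances the get() above, so q_start.join() can return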