Tasks are received but not executed
I'm learning Django and recently tried to add Celery to my project. The problem I'm facing is that tasks are received by the worker but never executed.
settings.py:
# Celery application definition
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Tehran'
celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myshop.settings')

app = Celery('myshop')
app.config_from_object('django.conf.settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py:

from celery import shared_task
from django.core.mail import send_mail
from .models import Order
import logging

# Get an instance of a logger
logger = logging.getLogger(__name__)

@shared_task()
def order_created(order_id):
    """
    Task to send an e-mail notification when an order is
    successfully created.
    """
    logger.error('msg received')
    order = Order.objects.get(id=order_id)
    subject = f'Order nr. {order.id}'
    message = f'Dear {order.first_name},\n\n' \
              f'You have successfully placed an order. ' \
              f'Your order ID is {order.id}.'
    mail_sent = send_mail(subject,
                          message,
                          'admin@myshop.com',
                          [order.email])
    logger.error('mail sent')
    return mail_sent
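A detail worth noting in the message built above: Python concatenates adjacent string literals with no separator, so any space between sentences has to live inside one of the fragments. A minimal standalone sketch of the same pattern (order_id and first_name are stand-in values, not real Order fields):

```python
# Adjacent f-string literals joined by backslash continuations are
# concatenated with no separator, so the trailing space in the second
# fragment is what keeps "order." and "Your" apart.
order_id = 42
first_name = "Ada"
message = f'Dear {first_name},\n\n' \
          f'You have successfully placed an order. ' \
          f'Your order ID is {order_id}.'
print(message)
```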
Running the task from views.py:
# launch asynchronous task
logger.error('before task')
order_created.delay(order.id)
logger.error('after task')
Console output (cmd) after starting the worker with celery -A myshop worker -l info:
(venv) E:\Workspace\django-shop\myshop>celery -A myshop worker -l info
-------------- celery@DESKTOP-F7E0RGJ v4.4.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2020-06-23 03:37:15
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: myshop:0x4b93df0
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. myshop.celery.debug_task
. orders.tasks.order_created
[2020-06-23 03:37:15,800: INFO/MainProcess] Connected to redis://localhost:6379//
[2020-06-23 03:37:15,823: INFO/MainProcess] mingle: searching for neighbors
[2020-06-23 03:37:16,320: INFO/SpawnPoolWorker-2] child process 15856 calling self.run()
[2020-06-23 03:37:16,328: INFO/SpawnPoolWorker-1] child process 13464 calling self.run()
[2020-06-23 03:37:16,332: INFO/SpawnPoolWorker-4] child process 13528 calling self.run()
[2020-06-23 03:37:16,337: INFO/SpawnPoolWorker-3] child process 4660 calling self.run()
[2020-06-23 03:37:16,344: INFO/SpawnPoolWorker-8] child process 9040 calling self.run()
[2020-06-23 03:37:16,346: INFO/SpawnPoolWorker-5] child process 17044 calling self.run()
[2020-06-23 03:37:16,358: INFO/SpawnPoolWorker-7] child process 16480 calling self.run()
[2020-06-23 03:37:16,366: INFO/SpawnPoolWorker-6] child process 16876 calling self.run()
[2020-06-23 03:37:16,885: INFO/MainProcess] mingle: all alone
[2020-06-23 03:37:16,913: WARNING/MainProcess] e:\workspace\django-shop\venv\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory
leak, never use this setting in production environments!
warnings.warn('''Using settings.DEBUG leads to a memory
[2020-06-23 03:37:16,914: INFO/MainProcess] celery@DESKTOP-F7E0RGJ ready.
[2020-06-23 03:37:16,918: INFO/MainProcess] Received task: orders.tasks.order_created[1d366fb3-262b-4d28-92ab-8a7b85f1deff]
[2020-06-23 03:37:16,921: INFO/MainProcess] Received task: orders.tasks.order_created[a1d79ddb-eb2f-4791-9e8a-f1a71bf5f44e]
[2020-06-23 03:37:16,924: INFO/MainProcess] Received task: orders.tasks.order_created[afb84e39-3d10-42d0-ab9e-333917ff66f9]
[2020-06-23 03:37:16,927: INFO/MainProcess] Received task: orders.tasks.order_created[86dc7ad7-66e6-4f3a-9db1-6f2a5cb5791f]
[2020-06-23 03:37:16,931: INFO/MainProcess] Received task: myshop.celery.debug_task[8169ff3e-ceae-478f-87ab-4020bedb8338]
[2020-06-23 03:37:16,935: INFO/MainProcess] Received task: orders.tasks.order_created[e689b21e-78b6-498b-adb4-f64dedbbbac8]
[2020-06-23 03:37:16,938: INFO/MainProcess] Received task: orders.tasks.order_created[92db7aa9-bf16-4be1-9769-a1986ffc6f6d]
[2020-06-23 03:37:16,940: INFO/MainProcess] Received task: orders.tasks.order_created[5c0e9873-2671-49a6-9c53-db6dfdd041ef]
[2020-06-23 03:37:16,943: INFO/MainProcess] Received task: orders.tasks.order_created[8d17faa5-19d7-49ed-8181-5a334423c1bb]
[2020-06-23 03:37:18,420: INFO/MainProcess] Received task: orders.tasks.order_created[4671d8c9-11dd-4c9a-a7b1-d0cfedab5e0a]
[2020-06-23 03:37:18,443: INFO/MainProcess] Received task: orders.tasks.order_created[c8f588c4-6e21-4fc8-b441-b65456207cd6]
[2020-06-23 03:37:18,453: INFO/MainProcess] Received task: orders.tasks.order_created[509fd1ad-eac9-4aab-b1f1-1c6278ebbffc]
[2020-06-23 03:37:19,080: INFO/SpawnPoolWorker-9] child process 10652 calling self.run()
[2020-06-23 03:37:19,095: INFO/SpawnPoolWorker-12] child process 13548 calling self.run()
[2020-06-23 03:37:19,097: INFO/SpawnPoolWorker-11] child process 252 calling self.run()
[2020-06-23 03:37:19,112: INFO/SpawnPoolWorker-13] child process 4604 calling self.run()
[2020-06-23 03:37:19,119: INFO/SpawnPoolWorker-14] child process 12696 calling self.run()
[2020-06-23 03:37:19,121: INFO/SpawnPoolWorker-16] child process 13080 calling self.run()
[2020-06-23 03:37:19,125: INFO/SpawnPoolWorker-10] child process 12420 calling self.run()
[2020-06-23 03:37:19,159: INFO/SpawnPoolWorker-15] child process 4468 calling self.run()
[2020-06-23 03:37:21,177: INFO/SpawnPoolWorker-20] child process 14032 calling self.run()
[2020-06-23 03:37:21,183: INFO/SpawnPoolWorker-19] child process 11220 calling self.run()
[2020-06-23 03:37:21,183: INFO/SpawnPoolWorker-18] child process 1084 calling self.run()
[2020-06-23 03:37:21,191: INFO/SpawnPoolWorker-17] child process 14020 calling self.run()
And the log after a new task is created:
[2020-06-23 03:41:32,735: INFO/MainProcess] Received task: orders.tasks.order_created[758b7636-314b-40df-8458-d03bb1ca2ac2]
[2020-06-23 03:41:33,536: INFO/SpawnPoolWorker-22] child process 16052 calling self.run()
[2020-06-23 03:41:33,537: INFO/SpawnPoolWorker-21] child process 9664 calling self.run()
[2020-06-23 03:41:33,538: INFO/SpawnPoolWorker-24] child process 18172 calling self.run()
[2020-06-23 03:41:33,546: INFO/SpawnPoolWorker-26] child process 9704 calling self.run()
[2020-06-23 03:41:33,547: INFO/SpawnPoolWorker-23] child process 10612 calling self.run()
[2020-06-23 03:41:33,547: INFO/SpawnPoolWorker-25] child process 11176 calling self.run()
Judging by the log, I assume you are on Windows. However, as of Celery 4.x, Windows is no longer officially supported. The problems you are encountering are also described in this Celery issue and this one. The general discussion about running Celery 4.x on Windows can be found here. The solution to your problem seems to be to use e.g. the gevent execution pool (more on execution pools can be found here). When using Django:
python manage.py celery worker -P gevent
Hope this helps anyone else who hits this problem; I ran into it too, using the latest versions of Django and Celery.

Indeed, Celery 4 does not support Windows, so to work around it you need another execution pool library such as gevent (more info).

Celery 4.0+ is not supported on Windows, so to fix this:

Install gevent with pip:

pip install gevent

Then run Celery with:

celery -A <project_name> worker -l info -P gevent
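Putting the fix together for the project in the question (myshop is the project name from the question; run this from the directory that contains manage.py):

```shell
# Install the gevent execution pool; the default prefork pool is what
# hangs on Windows with Celery 4.x
pip install gevent
# Start the worker with the gevent pool instead of prefork
celery -A myshop worker -l info -P gevent
```

After restarting the worker this way, the "Received task" log lines should be followed by "Task ... succeeded" entries.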