Unable to run background tasks with Celery + Django + Gunicorn
Stack:
Python v2.7
Django v1.11
Celery v4.3.0
Gunicorn v19.7.1
Nginx v1.10
When I run the Django server and Celery manually, the async tasks execute as expected. The problem comes when I deploy the Django project using Gunicorn plus Nginx. I tried running Celery under supervisor, but it didn't help.
views.py
from .tasks import _functionB  # assuming _functionB lives in the app's tasks.py

def _functionA():
    _functionB.delay()  # _functionB is a registered async task
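For reference, a minimal sketch of how such a task might be declared in the app's tasks.py (the queue name is an assumption; it must be one of the queues the worker consumes via -Q):
tasks.py
from __future__ import absolute_import
from celery import shared_task

@shared_task(queue='queue1')  # hypothetical queue; must match one of celery,queue1,queue2,queue3
def _functionB():
    # the actual background work goes here
    pass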
settings.py
# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
celery_init.py
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myproject')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
__init__.py
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from myproject.celery_init import app as celery_app
__all__ = ['celery_app']
gunicorn.service
[Unit]
Description=Gunicorn application server....
After=network.target
[Service]
User=root
Group=www-data
WorkingDirectory=<myprojectdir>
Environment=PYTHONPATH=<ENV>
ExecStart=<myprojectdir>/env/bin/gunicorn --workers 3 --access-logfile access_gunicorn.log --error-logfile error_gunicorn.log --capture-output --log-level debug --bind unix:<myprojectdir>/myproject.sock <myproject>.wsgi:application
[Install]
WantedBy=multi-user.target
myproject_nginx.conf
server {
listen 8001;
location / {
include proxy_params;
proxy_pass http://unix:<myprojectdir>/myproject.sock;
}
}
celery worker
celery worker -B -l info -A myproject -Q celery,queue1,queue2,queue3 -n beat.%h -c 1
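Since supervisor was mentioned above, a minimal supervisor program entry for this worker might look like the following (a sketch; the paths and user are assumptions mirroring gunicorn.service):
celery_supervisor.conf
[program:celery_worker]
command=<myprojectdir>/env/bin/celery worker -B -l info -A myproject -Q celery,queue1,queue2,queue3 -n beat.%h -c 1
directory=<myprojectdir>
user=www-data
autostart=true
autorestart=true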
Can anyone help me with the following question:
Why is it that when Django is deployed using Gunicorn and Nginx the Celery worker doesn't execute tasks, whereas when run manually, i.e. with python manage.py runserver ..., it is able to execute them?
Your concurrency level is equal to 1 (-c 1 in your worker command line). This basically means the worker is configured to run a single task at any point in time. If your tasks are long-running, then you may think Celery is not running anything...
You can easily test this: when you start some task, run the following:
celery -A myproject inspect active
This will list your running tasks (if there are any).
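If a long-running task is occupying the single slot, one quick experiment (a sketch; pick a concurrency that suits your workload) is to raise -c so other tasks can run in parallel:
celery worker -B -l info -A myproject -Q celery,queue1,queue2,queue3 -n beat.%h -c 4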
Another thing to fix is your configuration variables. Celery 4 now expects all configuration variables to be lowercase. Read the What’s new in Celery 4.0 (latentcall) document for more information, especially the Lowercase setting names section.
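As an illustration, the settings from the question translated to the Celery 4 lowercase names, set directly on the app object in celery_init.py (a sketch; if you keep app.config_from_object('django.conf:settings', namespace='CELERY'), the uppercase CELERY_-prefixed names in settings.py are the namespaced equivalents):
app.conf.update(
    broker_url='redis://localhost:6379',
    result_backend='redis://localhost:6379',
    accept_content=['application/json'],
    result_serializer='json',
    task_serializer='json',
)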