Flask Celery tasks not working.


I configured my project following this answer: How to use Flask-SQLAlchemy in a Celery task

My extension.py file:

import flask
from flask_sqlalchemy import SQLAlchemy
from config import BaseConfig
from celery import Celery
from flask_mail import Mail
from celery.schedules import crontab


class FlaskCelery(Celery):

    def __init__(self, *args, **kwargs):

        super(FlaskCelery, self).__init__(*args, **kwargs)
        self.patch_task()

        if 'app' in kwargs:
            self.init_app(kwargs['app'])

    def patch_task(self):
        TaskBase = self.Task
        _celery = self

        class ContextTask(TaskBase):
            abstract = True

            def __call__(self, *args, **kwargs):
                if flask.has_app_context():
                    return TaskBase.__call__(self, *args, **kwargs)
                else:
                    with _celery.app.app_context():
                        return TaskBase.__call__(self, *args, **kwargs)

        self.Task = ContextTask

    def init_app(self, app):
        self.app = app
        self.config_from_object(app.config)


mail = Mail()
db = SQLAlchemy()
settings = BaseConfig()
celery = FlaskCelery()
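The key idea in patch_task above — wrapping the base task's __call__ so the task body runs inside an app context when none is already active — can be illustrated without Flask or Celery installed. This is a plain-Python sketch of the pattern; FakeContext and BaseTask are stand-ins, not real Flask/Celery API:

```python
class FakeContext(object):
    """Stand-in for Flask's app_context(); tracks whether a context is active."""
    active = False

    def __enter__(self):
        FakeContext.active = True
        return self

    def __exit__(self, *exc):
        FakeContext.active = False


class BaseTask(object):
    """Stand-in for Celery's base Task class: just calls the function."""
    def __call__(self, fn):
        return fn()


class ContextTask(BaseTask):
    """Run the task inside a context unless one is already active."""
    def __call__(self, fn):
        if FakeContext.active:
            return super(ContextTask, self).__call__(fn)
        with FakeContext():  # push a context first, pop it afterwards
            return super(ContextTask, self).__call__(fn)


task = ContextTask()
print(task(lambda: FakeContext.active))  # True: a context was pushed for the call
```

This is exactly why tasks patched this way can use Flask-SQLAlchemy's `db` freely: by the time the task body runs, an application context is guaranteed to exist.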

Then in my app_settings.py I create the application:

app = Flask('app', instance_relative_config=True)

and configure Celery:

celery.init_app(app)
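init_app calls config_from_object(app.config), so the broker settings shared by the worker and the web process must already be in the Flask config at this point. The question does not show BaseConfig, but it would need entries along these lines (the key names are Celery 3.x style; the URLs are assumptions, not taken from the question):

```python
# Hypothetical broker settings that config_from_object(app.config) would
# pick up in init_app; adjust the URLs to whatever broker you actually run.
CELERY_BROKER_URL = 'redis://localhost:6379/0'      # where .delay() publishes
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'  # where results are stored
```

If the Flask process and the worker end up with different (or default) broker URLs, `.delay()` publishes to a queue no worker is consuming, which matches the "no response, no errors" symptom described below.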

I run the Flask project with the python manage.py run command:

app.run(
        debug=settings.get('DEBUG', False),
        host=settings.get('HOST', '127.0.0.1'),
        port=settings.get('PORT', 5000)
    )
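The settings.get('DEBUG', False) calls above imply that BaseConfig exposes a dict-style get with a fallback default. A minimal sketch of that accessor (the attribute names and values are illustrative, not from the question):

```python
class BaseConfig(object):
    DEBUG = True
    HOST = '0.0.0.0'

    def get(self, key, default=None):
        # return the config attribute if present, otherwise the caller's default
        return getattr(self, key, default)


settings = BaseConfig()
print(settings.get('DEBUG', False))  # True
print(settings.get('PORT', 5000))    # 5000 (no PORT attribute, falls back)
```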

and run Celery:

celery -A manage.celery worker --beat -l debug

The Celery log looks fine:

[tasks]
  . app.api.tasks.spin_file
  . app.main.tasks.send_async_email
  . celery.backend_cleanup
  . celery.chain
  ...

Then in views.py I call the task:

send_async_email.delay(*args, **kwargs)

But Celery ignores all of the tasks: no response, no errors, no warnings. What am I doing wrong?

Edit: when I start Celery with: celery -A manage.celery worker --beat -l debug

I get this warning:

[2015-09-21 10:04:32,220: WARNING/MainProcess] /home/.virtualenvs/myproject/local/lib/python2.7/site-packages/celery/app/control.py:36: DuplicateNodenameWarning: Received multiple replies from node name: 'name'.
Please make sure you give each node a unique nodename using the `-n` option.
  pluralize(len(dupes), 'name'), ', '.join(sorted(dupes)),
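The DuplicateNodenameWarning itself is usually harmless here: it means more than one worker answered to the same node name, which typically happens when a worker from an earlier session was left running. Giving each worker an explicit, unique name with `-n` silences it (the worker name below is an example):

```shell
# stop any stray workers first, then start each worker with a unique node name
celery -A manage.celery worker --beat -l debug -n worker1@%h
```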
1 Answer


I'm not sure this will help you, but this is the code I use in many projects whenever I need Celery:

from flask import Flask, request, jsonify as jsn
from celery import Celery
app = Flask(__name__)
app.config.update(dict(
    SECRET_KEY='blabla'
))
# Celery configuration
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'database'
app.config['CELERY_RESULT_DBURI'] = 'sqlite:///temp.db'
app.config['CELERY_TRACK_STARTED'] = True
app.config['CELERY_SEND_EVENTS'] = True

# Initialize Celery
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

@celery.task
def do_something(data):
    from celery import current_task
    import os
    import subprocess
    with app.app_context():
        # run some bash script with some params in my case, e.g.:
        # subprocess.call(['/path/to/script.sh', str(data)])
        pass
Then I run Celery via supervisor:

#!/bin/bash
cd /project/location && . venv/bin/activate && celery worker -A appname.celery --loglevel=info --purge #appname is my main flask file

And of course, in my routes I have something like this:

@app.route('/someroute', methods=["POST"])
def someroute():
    data = request.form  # whatever payload the client posts
    result = do_something.delay(data)
    print result.id
    return jsn({'task_id': result.id})

Mate, nothing works on my side even after following your approach. The Celery worker and heartbeat are both running, but nothing gets stored in the database. - Idris Stack
When I use the example above, it works fine. However, when I run it in my own code, it requires redis (ImportError: Missing redis library (pip install redis)). Why does this happen? - Sade

Content provided by Stack Overflow; translated from the original English post.