I. Installing and Using Celery
Typing python and pressing Tab shows only python, python2, and python2.7; there is no python3.
1. Install python3, pip3, and the latest Django with yum: https://yq.aliyun.com/articles/640213
2. Install Celery:
pip3.6 install celery
3. Install the redis client library:
pip3.6 install redis
4. Use Celery, following the official docs: http://docs.jinkan.org/docs/celery/getting-started/first-steps-with-celery.html
and this guide: https://zhuanlan.zhihu.com/p/43588348
II. Celery Usage Examples
1. Example 1
# 1. Create the async task (tasks.py):
from celery import Celery

app = Celery('tasks', broker='redis://127.0.0.1', backend='redis://127.0.0.1')

@app.task
def add(x, y):
    return x + y
# 2. Call the task (producer sends the message):
from tasks import add  # import our task function add
import time

result = add.delay(12, 12)  # async call; it does not block, and execution continues immediately
while not result.ready():   # poll until the task has finished
    print(time.strftime("%H:%M:%S"))
    time.sleep(1)
print(result.get())         # fetch the task's return value
print(result.successful())  # check whether the task ran successfully
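The delay/ready/get pattern above mirrors the standard library's Future API. As a local analogy that needs no broker or worker (pure stdlib; the thread pool stands in for the Celery worker, and the timings are made up for illustration):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    time.sleep(0.2)  # pretend this is the worker's slow job
    return x + y

with ThreadPoolExecutor(max_workers=1) as pool:
    result = pool.submit(add, 12, 12)  # like add.delay(12, 12): returns at once
    while not result.done():           # like result.ready(): poll for completion
        print(time.strftime("%H:%M:%S"))
        time.sleep(0.05)
    print(result.result())             # like result.get() -> 24
```

The analogy is only about the calling convention: Celery's AsyncResult fetches the value from the result backend over the network, while a Future holds it in process memory.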
# 3. Call the task remotely by name (producer sends the message):
from tasks import app

print(app.send_task("tasks.add", args=(11, 25)))
# 4. Look up a task's result by id (consumer side):
from tasks import add

taskid = 'd471f106-c8c9-4770-b0c7-64a1f3194e18'
add.AsyncResult(taskid).get()
# 5. Question: within one script, can we produce and then immediately consume?
from tasks import app
from tasks import add

result = app.send_task("tasks.add", args=(11, 25))  # send_task returns an AsyncResult, not a bare id
print(add.AsyncResult(result.id).get())
# Answer: no. Producer and consumer are separate roles: the producer gets a task id back
# immediately and returns, while a worker consumes the task asynchronously.
celery -A tasks worker -l info
Here -A names our application's module, worker starts a worker node, and -l is short for --loglevel, the log level to print.
To list the celery commands: celery --help
To see the options of a subcommand, e.g. the worker: celery worker --help
2. Example 2: a multi-task Celery project
Directory layout:
async_myCeleryProj_result.py
call_myCeleryProj_tasks.py
myCeleryProj/
    __init__.py
    app.py
    settings.py
    tasks.py
1. __init__.py is an empty file; it tells Python that myCeleryProj is an importable package.
2. app.py:
from celery import Celery

app = Celery("myCeleryProj", include=["myCeleryProj.tasks"])
app.config_from_object("myCeleryProj.settings")

if __name__ == "__main__":
    app.start()
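config_from_object imports the named module and reads its uppercase attributes as settings. A rough stdlib sketch of that idea (only an illustration, not Celery's real implementation; the in-memory module stands in for myCeleryProj.settings):

```python
import types

def config_from_module(mod):
    """Collect the uppercase attributes of a module into a settings dict."""
    return {name: getattr(mod, name) for name in dir(mod) if name.isupper()}

# Build a stand-in for myCeleryProj.settings in memory.
settings = types.ModuleType("settings")
settings.BROKER_URL = "redis://127.0.0.1:6379/0"
settings.CELERY_TASK_DEFAULT_QUEUE = "default"
settings.not_a_setting = 42  # lowercase name: ignored, just like in Celery

conf = config_from_module(settings)
print(conf)
```

This is why the settings file below is written as a flat module of UPPER_CASE names rather than a class or a dict.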
3. settings.py:
from kombu import Queue
import re
from datetime import timedelta
from celery.schedules import crontab

CELERY_QUEUES = (  # define the task queues
    Queue("default", routing_key="task.#"),  # routing keys starting with "task." go to the default queue
    Queue("tasks_A", routing_key="A.#"),     # routing keys starting with "A." go to the tasks_A queue
    Queue("tasks_B", routing_key="B.#"),     # routing keys starting with "B." go to the tasks_B queue
)
CELERY_TASK_DEFAULT_QUEUE = "default"  # set the default queue name to default
CELERY_TASK_DEFAULT_EXCHANGE = "tasks"
CELERY_TASK_DEFAULT_EXCHANGE_TYPE = "topic"
CELERY_TASK_DEFAULT_ROUTING_KEY = "task.default"
CELERY_ROUTES = (
    [
        (
            re.compile(r"myCeleryProj\.tasks\.(taskA|taskB)"),
            {"queue": "tasks_A", "routing_key": "A.import"},
        ),  # route taskA and taskB from the tasks module to queue tasks_A; regexes are supported
        (
            "myCeleryProj.tasks.add",
            {"queue": "default", "routing_key": "task.default"},
        ),  # route the add task from the tasks module to the default queue
    ],
)
BROKER_URL = "redis://127.0.0.1:6379/0"  # use redis as the message broker
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"  # store task results in Redis
CELERY_RESULT_SERIALIZER = "json"  # reading results is rarely performance-critical, so use the more readable JSON
CELERY_TASK_RESULT_EXPIRES = 60 * 60 * 24  # result expiry; clearer than the magic number 86400
CELERYBEAT_SCHEDULE = {
    "add": {
        "task": "myCeleryProj.tasks.add",
        "schedule": timedelta(seconds=10),
        "args": (10, 16),
    },
    "taskA": {
        "task": "myCeleryProj.tasks.taskA",
        "schedule": crontab(hour=21, minute=10),
    },
    "taskB": {
        "task": "myCeleryProj.tasks.taskB",
        "schedule": crontab(hour=21, minute=12),
    },
}
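The queue bindings above use AMQP-style topic patterns, where a trailing # matches any remaining dot-separated words. To see which queue a given routing key lands in, here is a small sketch of the matching (simplified: the single-word wildcard * is not handled; the bindings are copied from the settings above):

```python
# Queue bindings copied from the CELERY_QUEUES definition above.
QUEUE_BINDINGS = {
    "default": "task.#",
    "tasks_A": "A.#",
    "tasks_B": "B.#",
}

def matches(pattern, key):
    """Simplified topic match: a trailing '#' matches any remaining words."""
    pw, kw = pattern.split("."), key.split(".")
    if pw[-1] == "#":
        head = pw[:-1]
        return kw[:len(head)] == head
    return pw == kw  # exact match when there is no wildcard

def route(key):
    """Return the queues whose binding matches the routing key."""
    return [q for q, pat in QUEUE_BINDINGS.items() if matches(pat, key)]

print(route("task.default"))  # ['default'] -> the add task
print(route("A.import"))      # ['tasks_A'] -> taskA and taskB
```

This shows why CELERY_ROUTES assigns routing_key "A.import" to taskA/taskB and "task.default" to add: those keys fall under the "A.#" and "task.#" bindings respectively.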
4. tasks.py:
import os
from myCeleryProj.app import app
import time
import socket

def get_host_ip():
    """
    Look up the local host's IP address.
    :return: ip
    """
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.connect(("8.8.8.8", 80))
        ip = s.getsockname()[0]
    finally:
        s.close()
    return ip

@app.task
def add(x, y):
    s = x + y
    time.sleep(3)  # simulate a slow operation
    print("host IP {}: x + y = {}".format(get_host_ip(), s))
    return s

@app.task
def taskA():
    print("taskA begin...")
    print(f"host IP {get_host_ip()}")
    time.sleep(3)
    print("taskA done.")

@app.task
def taskB():
    print("taskB begin...")
    print(f"host IP {get_host_ip()}")
    time.sleep(3)
    print("taskB done.")
5. call_myCeleryProj_tasks.py:
from myCeleryProj.tasks import app

print(app.send_task("myCeleryProj.tasks.add", args=(4, 5)))
print(app.send_task("myCeleryProj.tasks.taskA"))
print(app.send_task("myCeleryProj.tasks.taskB"))
6. async_myCeleryProj_result.py:
from myCeleryProj.tasks import add

taskid_add = 'dc3622e6-89bf-48e1-8981-85dbe0bd83c5'
taskid_taskA = 'bf99ed11-8cba-4f46-a74e-bd2fc5902857'
taskid_taskB = '6d681b00-73bb-482a-94ad-40a18387d3ab'
print(add.AsyncResult(taskid_add).get())
print(add.AsyncResult(taskid_taskA).get())
print(add.AsyncResult(taskid_taskB).get())
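Calling .get() three times blocks on each result in turn. A hypothetical helper (wait_all is my own name, not a Celery API) can instead poll a batch of handles with a timeout; it works on anything exposing the AsyncResult-style .ready()/.get() interface, and the DummyResult class below only stands in for AsyncResult so the sketch runs without a broker:

```python
import time

def wait_all(results, poll_interval=0.05, timeout=5.0):
    """Poll result handles until all are ready, then collect their values."""
    deadline = time.monotonic() + timeout
    while not all(r.ready() for r in results):
        if time.monotonic() > deadline:
            raise TimeoutError("tasks did not finish in time")
        time.sleep(poll_interval)
    return [r.get() for r in results]

class DummyResult:  # stand-in for celery.result.AsyncResult
    def __init__(self, value, ready_at):
        self._value = value
        self._ready_at = ready_at
    def ready(self):
        return time.monotonic() >= self._ready_at
    def get(self):
        return self._value

now = time.monotonic()
results = [DummyResult(9, now + 0.1), DummyResult(None, now + 0.2)]
print(wait_all(results))  # [9, None]
```

With real Celery results, the same loop would be fed AsyncResult objects instead of dummies; note that taskA and taskB return None, so their fetched values would be None here too.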
III. Troubleshooting
Official Celery docs: http://docs.jinkan.org/docs/celery/getting-started/first-steps-with-celery.html
from celery import Celery

app = Celery('tasks', broker='redis://10.26.27.85')

@app.task
def add(x, y):
    return x + y
Running celery -A tasks worker --loglevel=info fails with:
consumer: Cannot connect to redis://10.26.27.85:6379//
(1) First connect directly: redis-cli -h 10.26.27.85 -p 6379
> info
The error says the server is in protected mode, so turn protection off in the redis config file.
cd /usr/local/matrix/etc/redis/
The directory holds three files: redis6380.conf, redis6381.conf, and redis.conf.
redis.conf is in fact the default config for the instance on 6379; vim redis.conf
Search for protected and find the line protected-mode yes; change yes to no.
Search for daemon and find the line daemonize yes; if it reads no, change it to yes (here it was already yes, so no change was needed).
Search for bind and find the line bind 127.0.0.1; change it to bind 0.0.0.0 so that clients can connect on any address.
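The three edits above can also be scripted instead of done in vim. A cautious sketch (back the file up first; the regexes only rewrite whole-line directives, and the file path from this section is assumed). Note that protected-mode no plus bind 0.0.0.0 exposes redis to the network, so only do this on a trusted network:

```python
import re

def apply_section_edits(conf_text):
    """Apply the three redis.conf edits described in this section."""
    conf_text = re.sub(r"(?m)^protected-mode\s+yes\s*$", "protected-mode no", conf_text)
    conf_text = re.sub(r"(?m)^daemonize\s+no\s*$", "daemonize yes", conf_text)
    conf_text = re.sub(r"(?m)^bind\s+127\.0\.0\.1\s*$", "bind 0.0.0.0", conf_text)
    return conf_text

# Demonstrated on an inline sample rather than the real file:
sample = "bind 127.0.0.1\nprotected-mode yes\ndaemonize yes\n"
print(apply_section_edits(sample))
```

To edit the real file, read /usr/local/matrix/etc/redis/redis.conf, pass the text through apply_section_edits, and write it back.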
(2) Restart redis-server:
redis-server /usr/local/matrix/etc/redis/redis.conf
(3) Run celery -A tasks worker --loglevel=info again.
The same error appears. Strange, isn't it?
(4) Check the redis processes:
ps -ef|grep -v grep|grep redis
[root@VM redis]# ps -ef|grep -v grep|grep redis
root 1414 1 0 2019 ? 08:09:54 /usr/local/matrix/bin/redis-server 10.26.27.85:6380
root 1419 1 0 2019 ? 07:43:09 /usr/local/matrix/bin/redis-server 10.26.27.85:6381
root 26229 25889 0 15:49 pts/2 00:00:00 redis-cli -h 10.26.27.85 -p 6379
root 27201 1 0 15:52 ? 00:00:00 redis-server 0.0.0.0:6379
The 6379 instance is probably hung, or the restart did not take effect. Kill both 6379 processes (26229 and 27201) with kill -9, then start it again: redis-server /usr/local/matrix/etc/redis/redis.conf
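Picking the stray PIDs out of the ps output by eye works, but it can also be done programmatically. A sketch that extracts the PIDs (the second ps -ef column) from lines mentioning the port, fed here with the listing above as a literal string:

```python
# The ps -ef output from above, as sample input.
ps_output = """\
root 1414 1 0 2019 ? 08:09:54 /usr/local/matrix/bin/redis-server 10.26.27.85:6380
root 1419 1 0 2019 ? 07:43:09 /usr/local/matrix/bin/redis-server 10.26.27.85:6381
root 26229 25889 0 15:49 pts/2 00:00:00 redis-cli -h 10.26.27.85 -p 6379
root 27201 1 0 15:52 ? 00:00:00 redis-server 0.0.0.0:6379"""

def pids_on_port(ps_text, port):
    """Return PIDs of ps -ef lines that mention the given port."""
    pids = []
    for line in ps_text.splitlines():
        if str(port) in line:
            pids.append(int(line.split()[1]))
    return pids

print(pids_on_port(ps_output, 6379))  # [26229, 27201]
```

Each returned PID could then be killed with os.kill(pid, signal.SIGKILL), which is what kill -9 does.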
After the fresh start, the listing looks right: [root@VM xiafen]# ps -ef|grep -v grep|grep redis
root 1414 1 0 2019 ? 08:10:00 /usr/local/matrix/bin/redis-server 10.26.27.85:6380
root 1419 1 0 2019 ? 07:43:15 /usr/local/matrix/bin/redis-server 10.26.27.85:6381
root 27201 1 0 15:52 ? 00:00:07 redis-server 0.0.0.0:6379
(5) Run celery -A tasks worker --loglevel=info again.
This time it works.