Deploying Airflow on Rancher

I. Push the Airflow image to Harbor

1. On the local VM, load the prepared image from the tar archive

 docker load -i airflow.tar

2. Log in to the private registry (in this case, Harbor)

docker login 10.1.119.12 -u admin -p Harbor12345

3. Tag the image with the registry address

docker tag airflow 10.1.119.12/gx/airflow:1

4. Push the image to Harbor

docker push 10.1.119.12/gx/airflow:1

For the detailed procedure, see http://172.16.110.7:10080/bigdata_product_system_group/WIKI/src/master/6.%e8%bf%90%e7%bb%b4/1.docker/5.docker%e8%bf%9e%e6%8e%a5%e7%a7%81%e6%9c%8d%ef%bc%8c%e5%b9%b6%e5%90%91%e7%a7%81%e6%9c%8d%e6%8e%a8%e9%80%81%e9%95%9c%e5%83%8f
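Taken together, steps 1-4 can be run as one small script. This is only a sketch using the registry address, project, and credentials shown above; the docker steps are guarded so the script is a no-op on a machine without docker or the tar archive.

```shell
#!/bin/sh
# Sketch of steps 1-4: load, login, tag, push.
# Registry, project, image, and tag are the values used in this guide.
REGISTRY=10.1.119.12
PROJECT=gx
IMAGE=airflow
TAG=1
TARGET="$REGISTRY/$PROJECT/$IMAGE:$TAG"
echo "target image: $TARGET"

# Only run the docker steps when the CLI and the archive are present.
if command -v docker >/dev/null 2>&1 && [ -f airflow.tar ]; then
    docker load -i airflow.tar
    docker login "$REGISTRY" -u admin -p Harbor12345
    docker tag "$IMAGE" "$TARGET"
    docker push "$TARGET"
fi
```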

II. Deploy Airflow on Rancher

1. Deploy a new workload.

2. Enter the workload name and the Docker image address.

3. Configure the port mappings and environment variables.

4. Under node scheduling, select the server to deploy to.

5. Configure the volumes (a mapping between a host directory and a container directory; when choosing the volume type, select "Bind-mount a host directory").

6. Click Launch.
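Behind the UI, Rancher creates a Kubernetes workload; the settings from steps 2-5 roughly correspond to a deployment spec like the one below. This is only an illustrative sketch: the node name, environment variable, and directory paths are placeholders, and only the default Airflow webserver port (8080) is shown.

```yaml
# Rough Kubernetes equivalent of the Rancher UI settings (illustrative).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: airflow
spec:
  replicas: 1
  selector:
    matchLabels: {app: airflow}
  template:
    metadata:
      labels: {app: airflow}
    spec:
      nodeSelector:
        kubernetes.io/hostname: your-node-name   # step 4: node scheduling
      containers:
      - name: airflow
        image: 10.1.119.12/gx/airflow:1          # step 2: image address
        ports:
        - containerPort: 8080                    # step 3: webserver port
        env:
        - name: AIRFLOW_HOME                     # step 3: env variables
          value: /usr/local/airflow
        volumeMounts:
        - name: airflow-data
          mountPath: /usr/local/airflow/dags     # step 5: container dir
      volumes:
      - name: airflow-data
        hostPath:
          path: /data/airflow/dags               # step 5: host dir
```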

III. Using the Airflow container deployed on Rancher

1. Log in to the server with a remote connection tool, then enter the container

docker exec -it -u root 3d3341a22687 bash
#3d3341a22687 is the container id
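If the container id is not at hand, it can be looked up from the image first. A sketch, assuming the image tag pushed earlier; the docker steps are skipped when the CLI is absent.

```shell
#!/bin/sh
# Find the running container by its image, then enter it as root.
IMAGE="10.1.119.12/gx/airflow:1"
if command -v docker >/dev/null 2>&1; then
    CID=$(docker ps --filter "ancestor=$IMAGE" -q | head -n 1)
    if [ -n "$CID" ]; then
        docker exec -it -u root "$CID" bash
    fi
fi
```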

2. Initialize Airflow

airflow initdb

If initialization fails with an error like the following:

Traceback (most recent call last):
  File "/opt/anaconda3/bin/airflow", line 32, in <module>
    args.func(args)
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/bin/cli.py", line 1102, in initdb
    db.initdb(settings.RBAC)
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/utils/db.py", line 106, in initdb
    upgradedb()
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/utils/db.py", line 377, in upgradedb
    command.upgrade(config, 'heads')
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/command.py", line 298, in upgrade
    script.run_env()
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/script/base.py", line 489, in run_env
    util.load_python_file(self.dir, "env.py")
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file
    module = load_module_py(module_id, path)
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/util/compat.py", line 173, in load_module_py
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/migrations/env.py", line 92, in <module>
    run_migrations_online()
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/migrations/env.py", line 86, in run_migrations_online
    context.run_migrations()
  File "<string>", line 8, in run_migrations
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/runtime/environment.py", line 846, in run_migrations
    self.get_context().run_migrations(**kw)
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/runtime/migration.py", line 518, in run_migrations
    step.migration_fn(**kw)
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/migrations/versions/0e2a74e0fc9f_add_time_zone_awareness.py", line 45, in upgrade
    "Global variable explicit_defaults_for_timestamp needs to be on (1) for mysql"
Exception: Global variable explicit_defaults_for_timestamp needs to be on (1) for mysql

Fix: log in to MySQL (the server hosting the airflow database) and turn on the global variable explicit_defaults_for_timestamp:

mysql>  show global variables like '%timestamp%';
+---------------------------------+-------+
| Variable_name                   | Value |
+---------------------------------+-------+
| explicit_defaults_for_timestamp | OFF   |
| log_timestamps                  | UTC   |
+---------------------------------+-------+
2 rows in set (0.00 sec)

mysql> set global explicit_defaults_for_timestamp =1;
Query OK, 0 rows affected (0.00 sec)

mysql> 

Then run airflow initdb again.
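Note that SET GLOBAL only lasts until the MySQL server restarts. To make the setting permanent, it can also be added to the server's option file; a config sketch, assuming a standard installation (the file path varies, /etc/my.cnf is a common default):

```ini
# /etc/my.cnf -- persists the setting across server restarts
[mysqld]
explicit_defaults_for_timestamp = 1
```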

3. Start the webserver in the background

nohup airflow webserver>>$AIRFLOW_HOME/airflow-webserver.log 2>&1 &

4. Start the scheduler in the background

nohup airflow scheduler>>$AIRFLOW_HOME/airflow-scheduler.log 2>&1 &
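The two startup commands can be combined into one script that also checks both daemons came up. A sketch: it assumes AIRFLOW_HOME is set inside the container (the fallback default is an assumption), and the airflow steps are skipped when the CLI is absent.

```shell
#!/bin/sh
# Start the webserver and scheduler in the background, then verify.
: "${AIRFLOW_HOME:=$HOME/airflow}"   # fallback if unset (assumption)

if command -v airflow >/dev/null 2>&1; then
    nohup airflow webserver >> "$AIRFLOW_HOME/airflow-webserver.log" 2>&1 &
    nohup airflow scheduler >> "$AIRFLOW_HOME/airflow-scheduler.log" 2>&1 &
    sleep 5
    # Each pgrep prints a pid only if the daemon is still running.
    pgrep -f "airflow webserver"
    pgrep -f "airflow scheduler"
fi
```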

5. Open http://10.1.119.32:30011/admin/ in a browser (the pattern is http://<server IP>:<mapped host port>/admin/).
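Whether the webserver is reachable can also be checked from the command line. A sketch using the address above; the probe is skipped when curl is absent, and a 2xx/3xx status code means the UI is up.

```shell
#!/bin/sh
# Probe the Airflow web UI and print the HTTP status code.
URL="http://10.1.119.32:30011/admin/"
if command -v curl >/dev/null 2>&1; then
    # || true: an unreachable host should not abort the script.
    curl -s -o /dev/null -w "%{http_code}\n" --max-time 5 "$URL" || true
fi
```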
