I. Upload the Airflow image to Harbor
1. On the local VM, load the prepared image archive into the local Docker image store
docker load -i airflow.tar
2. Log in to the private registry (that is, log in to Harbor; note that passing -p on the command line leaves the password in your shell history, so you may prefer to omit it and be prompted instead)
docker login 10.1.119.12 -u admin -p Harbor12345
3. Retag the image with the registry-qualified name
docker tag airflow 10.1.119.12/gx/airflow:1
4. Push the image to Harbor
docker push 10.1.119.12/gx/airflow:1
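The four steps above can be collected into one script. This is a sketch: the registry address, project, image name, and tag are the values used in this guide, so adjust them for your environment.

```shell
#!/bin/sh
# Steps I.1-I.4 as a single script (values taken from this guide).
REGISTRY=10.1.119.12            # Harbor address
PROJECT=gx                      # Harbor project
IMAGE=airflow                   # local image name
TAG=1                           # version tag to publish
TARGET="$REGISTRY/$PROJECT/$IMAGE:$TAG"

docker load -i airflow.tar      # import the saved image archive
docker login "$REGISTRY" -u admin   # prompts for the password interactively
docker tag "$IMAGE" "$TARGET"   # add the registry-qualified tag
docker push "$TARGET"           # upload to Harbor
echo "pushed $TARGET"
```

Logging in without -p keeps the password out of shell history; Docker prompts for it instead.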
II. Deploy Airflow in Rancher
1. Deploy a new container (workload)
2. Enter a name and the Docker image address (here, 10.1.119.12/gx/airflow:1)
3. Configure the port mapping and environment variables
4. Under node scheduling, pick the server the container should run on
5. Configure the volumes (mappings between server directories and container directories; when choosing the volume type, pick "map a host directory", i.e. a bind mount)
6. Click Launch
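The form fields above correspond roughly to the following docker run command. This is an assumption-heavy sketch: the container name, the AIRFLOW_HOME value, and the host/container paths are illustrative placeholders; 8080 is Airflow's default webserver port, and 30011 is the host port used in section III.

```shell
# Port mapping: host 30011 -> webserver 8080 (Airflow's default port).
# The env var value and the bind-mount paths below are assumed, not from the guide.
docker run -d --name airflow \
  -p 30011:8080 \
  -e AIRFLOW_HOME=/root/airflow \
  -v /data/airflow:/root/airflow \
  10.1.119.12/gx/airflow:1
```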
III. Using the Airflow container deployed in Rancher
1. Log in to the server with a remote connection tool, then enter the container
docker exec -it -u root 3d3341a22687 bash  # 3d3341a22687 is the container ID
2. Initialize Airflow (this is the Airflow 1.x command; in Airflow 2.x it was renamed to airflow db init)
airflow initdb
If the initialization fails with an exception like the following:
Traceback (most recent call last):
  File "/opt/anaconda3/bin/airflow", line 32, in <module>
    args.func(args)
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/bin/cli.py", line 1102, in initdb
    db.initdb(settings.RBAC)
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/utils/db.py", line 106, in initdb
    upgradedb()
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/utils/db.py", line 377, in upgradedb
    command.upgrade(config, 'heads')
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/command.py", line 298, in upgrade
    script.run_env()
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/script/base.py", line 489, in run_env
    util.load_python_file(self.dir, "env.py")
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file
    module = load_module_py(module_id, path)
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/util/compat.py", line 173, in load_module_py
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/migrations/env.py", line 92, in <module>
    run_migrations_online()
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/migrations/env.py", line 86, in run_migrations_online
    context.run_migrations()
  File "<string>", line 8, in run_migrations
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/runtime/environment.py", line 846, in run_migrations
    self.get_context().run_migrations(**kw)
  File "/opt/anaconda3/lib/python3.7/site-packages/alembic/runtime/migration.py", line 518, in run_migrations
    step.migration_fn(**kw)
  File "/opt/anaconda3/lib/python3.7/site-packages/airflow/migrations/versions/0e2a74e0fc9f_add_time_zone_awareness.py", line 45, in upgrade
    "Global variable explicit_defaults_for_timestamp needs to be on (1) for mysql"
Exception: Global variable explicit_defaults_for_timestamp needs to be on (1) for mysql
Fix: connect to the MySQL server that hosts the airflow database and enable explicit_defaults_for_timestamp globally:
mysql> show global variables like '%timestamp%';
+---------------------------------+-------+
| Variable_name                   | Value |
+---------------------------------+-------+
| explicit_defaults_for_timestamp | OFF   |
| log_timestamps                  | UTC   |
+---------------------------------+-------+
2 rows in set (0.00 sec)
mysql> set global explicit_defaults_for_timestamp =1;
Query OK, 0 rows affected (0.00 sec)
mysql>
Then run airflow initdb again.
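Note that SET GLOBAL only lasts until the MySQL server restarts. To make the fix permanent, the variable can also be set in the MySQL configuration file (the path varies by installation; /etc/my.cnf is a common default):

```ini
[mysqld]
explicit_defaults_for_timestamp = 1
```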
3. Start the webserver in the background
nohup airflow webserver >> $AIRFLOW_HOME/airflow-webserver.log 2>&1 &
4. Start the scheduler in the background
nohup airflow scheduler >> $AIRFLOW_HOME/airflow-scheduler.log 2>&1 &
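Steps 3 and 4 can be combined into a small script that starts both daemons and writes each one's output to its own log file. A sketch, assuming AIRFLOW_HOME is set inside the container (the /root/airflow fallback below is an illustrative assumption):

```shell
#!/bin/sh
# Start both Airflow daemons in the background, one log file per service.
AIRFLOW_HOME="${AIRFLOW_HOME:-/root/airflow}"   # fallback path is assumed

for svc in webserver scheduler; do
  LOG="$AIRFLOW_HOME/airflow-$svc.log"
  nohup airflow "$svc" >>"$LOG" 2>&1 &
  echo "started airflow $svc (pid $!) -> $LOG"
done
```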
5. Open http://10.1.119.32:30011/admin/ in a browser (i.e. http://<server IP>:<mapped host port>/admin/)