Install Flink 1.9 and start a local cluster with ./bin/start-cluster.sh
Write test-env.yaml (pay close attention to the YAML format and indentation! Compare against https://github.com/apache/flink/blob/release-1.9/flink-table/flink-sql-client/src/test/resources/test-sql-client-defaults.yaml and adjust your file accordingly, or simply copy it as a starting point)
Full list of configuration options: https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/table/sqlClient.html
tables:
  - name: MyTableSource
    type: source-table
    update-mode: append
    connector:
      type: filesystem
      path: "/tmp/input.csv"
    format:
      type: csv
      fields:
        - name: MyField1
          type: INT
        - name: MyField2
          type: VARCHAR
      line-delimiter: "\n"
      comment-prefix: "#"
    schema:
      - name: MyField1
        type: INT
      - name: MyField2
        type: VARCHAR
  - name: MyTableSink
    type: sink-table
    update-mode: append
    connector:
      type: filesystem
      path: "/tmp/output.csv"
    format:
      type: csv
      fields:
        - name: MyField1
          type: INT
        - name: MyField2
          type: VARCHAR
    schema:
      - name: MyField1
        type: INT
      - name: MyField2
        type: VARCHAR
  - name: MyCustomView
    type: view
    query: "SELECT MyField2 FROM MyTableSource"

execution:
  planner: old
  type: streaming
  time-characteristic: event-time
  periodic-watermarks-interval: 99
  parallelism: 1
  max-parallelism: 16
  min-idle-state-retention: 0
  max-idle-state-retention: 0
  result-mode: table
  max-table-result-rows: 100
  restart-strategy:
    type: failure-rate
    max-failures-per-interval: 10
    failure-rate-interval: 99000
    delay: 1000

configuration:
  table.optimizer.join-reorder-enabled: false

deployment:
  response-timeout: 5000
=============================================
Contents of /tmp/input.csv (cat /tmp/input.csv):
1,hello
2,world
3,hello world
1,ok
3,bye bye
4,yes
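The input file above can be created directly from the shell; a minimal sketch (the path matches the connector `path` in the YAML):

```shell
# Write the sample rows expected by MyTableSource.
# Using a quoted heredoc so the lines are written verbatim.
cat > /tmp/input.csv <<'EOF'
1,hello
2,world
3,hello world
1,ok
3,bye bye
4,yes
EOF

# Sanity check: the file should contain 6 rows.
wc -l < /tmp/input.csv   # should print 6
```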
Launch the client: ./bin/sql-client.sh embedded -e conf/test-env.yaml
The tables defined above should now be visible (e.g. via SHOW TABLES;).
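Once the client is up, the tables can be queried interactively. A minimal sketch, assuming only the table and view names defined in the YAML above:

```sql
-- Query the source table; results render in the client's table result mode.
SELECT * FROM MyTableSource;

-- The view defined in the YAML can be queried like any table.
SELECT * FROM MyCustomView;

-- Submit a continuous job that copies the source into the CSV sink.
INSERT INTO MyTableSink SELECT MyField1, MyField2 FROM MyTableSource;
```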