I recently needed to sync data between two MySQL databases. The approach: use Maxwell to ship the source database's binlog (log_bin) to Kafka, then consume from Kafka into the target database. Lately, though, the logs kept printing this warning:
[13:43:53:135] [WARN] - com.zaxxer.hikari.pool.PoolBase.isConnectionAlive(PoolBase.java:176) \
- Dataxxxx - Failed to validate connection com.mysql.cj.jdbc.ConnectionImpl@4fda9dd2 \
(No operations allowed after connection closed.). Possibly consider using a shorter maxLifetime value.
The xxxx in Dataxxxx is the database port number. Going by the hint in the log, maxLifetime was presumably too large; the project was using the default of 180000, while the database's interactive_timeout and wait_timeout were both 30. So, as a first step, I changed only maxLifetime:
spring.datasource.hikari.max-lifetime=20
After repackaging and rerunning, the same warning still appeared. Other posts suggested that maxLifetime should be smaller than the database's timeout settings, and that those timeouts should themselves be no less than 30 seconds; so I next changed both interactive_timeout and wait_timeout to 300. After the change:
mysql> show variables like "%timeout%";
+-----------------------------+----------+
| Variable_name               | Value    |
+-----------------------------+----------+
| connect_timeout             | 10       |
| delayed_insert_timeout      | 300      |
| have_statement_timeout      | YES      |
| innodb_flush_log_at_timeout | 1        |
| innodb_lock_wait_timeout    | 20       |
| innodb_rollback_on_timeout  | OFF      |
| interactive_timeout         | 300      |
| lock_wait_timeout           | 31536000 |
| net_read_timeout            | 30       |
| net_write_timeout           | 60       |
| rpl_stop_slave_timeout      | 31536000 |
| slave_net_timeout           | 3600     |
| wait_timeout                | 300      |
+-----------------------------+----------+
13 rows in set (0.00 sec)
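For reference, the change above can be made at runtime roughly like this (a sketch; note that SET GLOBAL only affects connections opened afterwards, and the setting must also go into my.cnf to survive a server restart):

```sql
-- Raise both timeouts to 300 seconds for new connections
SET GLOBAL interactive_timeout = 300;
SET GLOBAL wait_timeout = 300;

-- Verify the new values
SHOW GLOBAL VARIABLES LIKE '%timeout%';
```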
After that, running the program no longer produced the warning.
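One detail worth noting: spring.datasource.hikari.max-lifetime is specified in milliseconds, and HikariCP enforces a minimum of 30000 ms (smaller values are rejected and the default is used instead), so the max-lifetime=20 above would have meant 20 milliseconds and never taken effect. A working configuration keeps maxLifetime a bit below the database's wait_timeout; with wait_timeout at 300 seconds, something like this (the 280000 figure is my own illustrative choice, not from the original setup):

```properties
# maxLifetime is in milliseconds; keep it slightly below wait_timeout (300 s here)
# so Hikari retires connections before MySQL kills them server-side.
spring.datasource.hikari.max-lifetime=280000
```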