Scheduled export of data from a Hive table into MySQL with Sqoop

This post walks through a script that exports data from a Hive table into MySQL on a schedule using Sqoop.

Without further ado, here is the script:

#!/bin/sh
#********************************************************************************
#********************************************************************************
#reload env

# MySQL connection info
mysqldhome_hostname="172.28.65.133"
mysqldhome_username="root"
mysqldhome_password="xxxxx"
mysqldhome_port="3306"
mysqldhome_dbname="recom_video"

# Hive table name
hive_table1=t_dwa_hot_result_ad
# MySQL table name
mysql_table=t_st_video_hot_ad
echo "......mysql_table = "$mysql_table".......hive_table1 = "$hive_table1

# Get the current system time
sys_time=$(date "+%Y%m%d%H%M")
echo "...sys_time = "$sys_time

# Get the target date (yesterday)
#op_day=${last_day}
#echo "op_day="$op_day
#dt_month=`date | awk '{print $2}'`  # month of the current date
#op_day="20200611"
op_day=`date -d '-1 days' +%Y%m%d`

# Clear the MySQL target table before reloading
mysql -h$mysqldhome_hostname -P$mysqldhome_port -u$mysqldhome_username -p$mysqldhome_password -D$mysqldhome_dbname -e "
DELETE FROM $mysql_table;
"
echo "delete mysql data success...."

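# Dump the Hive query result to a local temp file; the grep filters Hive log lines (e.g. WARN messages) out of the output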
touch ${mysql_table}_temp_$op_day
/usr/bin/hive -e "
use recommend_video;
set hive.execution.engine=mr;
set hive.fetch.task.conversion=none;
select
  cast(video_id as BIGINT) as video_id,
  cast(watch_user as int) as watch_user,
  cast(max_watch_user as int) as max_watch_user,
  cast(coalesce(sum_finish_score ,'0.0') as decimal(15,4)) as sum_finish_score,
  cast(cnt_finish_score as int) as cnt_finish_score,
  cast(coalesce(favor_cnt,'0') as int) as favor_cnt,
  cast(coalesce(max_favor_cnt,'0') as int) as max_favor_cnt,
  cast(coalesce(hot_score,'0.0') as decimal(15,4)) as hot_score,
  create_time as shelf_time,
  from_unixtime(unix_timestamp()) as create_time,
  from_unixtime(unix_timestamp()) as update_time
from ${hive_table1} 
where dayid = $op_day;" | grep -vi '^warn\|data' >  ${mysql_table}_temp_$op_day

echo "...select hive data success.... save in file:"${mysql_table}_temp_$op_day
echo "...dayid = "$op_day" "

/usr/bin/hadoop fs -put ${mysql_table}_temp_$op_day /tmp/
rm -rf ${mysql_table}_temp_$op_day
echo "put file to hdfs, and del file in local path..."

hdfs_temp='/tmp/'$mysql_table'_temp_'$op_day
encoding="?useUnicode=true&characterEncoding=utf-8"

echo "...sqoop script start..."
/usr/bin/sqoop export --connect jdbc:mysql://$mysqldhome_hostname:$mysqldhome_port/$mysqldhome_dbname$encoding \
--username $mysqldhome_username \
--password $mysqldhome_password \
--table $mysql_table \
--fields-terminated-by '\t' \
--input-fields-terminated-by '\t' \
--export-dir $hdfs_temp \
--columns video_id,watch_user,max_watch_user,sum_finish_score,cnt_finish_score,favor_cnt,max_favor_cnt,hot_score,shelf_time,create_time,update_time
echo "...sqoop script end..."

/usr/bin/hadoop fs -rm -r $hdfs_temp
echo "rm hdfs tmp file..."
echo "......task end......"

exit 0

Notes:

1. Use the full path for every command in the script!

For example, /usr/bin/hive -e, /usr/bin/hadoop fs, /usr/bin/sqoop, and so on. A quick way to resolve these paths is shown below.
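If you are not sure where a tool lives on a given node, its absolute path can be checked from a shell on that node, for example:

which hive hadoop sqoop
# on the setup above these resolve to /usr/bin/hive, /usr/bin/hadoop and /usr/bin/sqoop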

PS: This script first dumps the Hive data to a local file, pushes that file to HDFS, and then inserts it into MySQL via Sqoop. Based on approaches described online, it should also be possible to read straight from the Hive table and export directly into MySQL; anyone interested can give it a try (a rough sketch follows).
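For reference, a minimal, untested sketch of such a direct export, assuming a Sqoop build with HCatalog export support; note that it bypasses the casting and column renaming done by the Hive query above, so the Hive table's columns would already have to line up with the MySQL table:

/usr/bin/sqoop export \
--connect jdbc:mysql://$mysqldhome_hostname:$mysqldhome_port/$mysqldhome_dbname$encoding \
--username $mysqldhome_username \
--password $mysqldhome_password \
--table $mysql_table \
--hcatalog-database recommend_video \
--hcatalog-table $hive_table1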

The MySQL table definition, for reference:

CREATE TABLE `t_st_video_hot_ad` (
  `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT 'auto-increment ID, primary key',
  `video_id` bigint(20) NOT NULL COMMENT 'video ID',
  `watch_user` bigint(15) NOT NULL COMMENT 'current view count of the video',
  `max_watch_user` bigint(15) NOT NULL COMMENT 'maximum view count',
  `sum_finish_score` decimal(10,4) NOT NULL COMMENT 'completion-rate numerator / sum',
  `cnt_finish_score` bigint(15) NOT NULL COMMENT 'completion-rate denominator / count',
  `favor_cnt` bigint(15) NOT NULL COMMENT 'current like count of the video',
  `max_favor_cnt` bigint(15) NOT NULL COMMENT 'maximum like count',
  `hot_score` decimal(8,4) DEFAULT NULL COMMENT 'hotness score',
  `shelf_time` varchar(50) NOT NULL COMMENT 'shelf (publish) time',
  `create_time` datetime NOT NULL COMMENT 'creation time',
  `update_time` datetime DEFAULT NULL COMMENT 'update time',
  PRIMARY KEY (`id`),
  KEY `idx_video_id` (`video_id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8;

Add a crontab entry to run the script on a schedule:

00 14 * * * sh /data/xh/test/t_st_video_hot_ad.sh >> /data/xh/test/video_hot_log.log 2>&1
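After saving the entry, the installed schedule can be confirmed with:

crontab -l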

 

Some errors encountered during debugging:

1. JAVA_HOME environment variable not found:

Fix: declare the variable again in /etc/hadoop/conf/hadoop-env.sh.

For example: export JAVA_HOME=/data/jdk1.8.0_171

 

2. Export job failed... this one took quite a while to track down.

In fact, checking Hadoop's YARN logs makes it easy to pinpoint the root cause; see the commands below.
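For example (the application ID below is a placeholder; take the real one from the Sqoop console output or the ResourceManager UI):

# list recently failed applications to find the Sqoop export job
yarn application -list -appStates FAILED
# dump its aggregated container logs (replace with the real application ID)
yarn logs -applicationId application_1591234567890_0042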

 

Inspecting the intermediate export file showed that an extra line of plain text had been mixed in with the data (a quick way to check for this is shown below).

Fix: use the full path when running hive, hdfs, and other commands, as already mentioned above!
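To look at the staged data directly, the same variables used in the script work here; rows produced by the query should have 11 tab-separated fields:

# peek at the first few lines of the staging file on HDFS
/usr/bin/hadoop fs -cat /tmp/${mysql_table}_temp_$op_day | head -n 5
# print any rows that do not have the expected 11 tab-separated fields
/usr/bin/hadoop fs -cat /tmp/${mysql_table}_temp_$op_day | awk -F'\t' 'NF != 11'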

 

Done!!!

 
