Migrating data between a HANA database and HDFS/Hive with Sqoop
Demo (the target Hive table must be created first)
Import into a Hive table:
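A sketch of the Hive DDL the import below assumes. The column list here is hypothetical (the real schema of HANA's t_agent is not shown in the post); the delimiter and partition key match the `--fields-terminated-by '\001'` and `--hive-partition-key 'dt'` options used in the import command.

```shell
# Create the partitioned target table in Hive before running sqoop import.
# Replace the hypothetical columns with t_agent's actual schema.
hive -e "
CREATE TABLE IF NOT EXISTS dw.agent (
  agent_id   STRING,
  agent_name STRING
)
PARTITIONED BY (dt STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\001'
STORED AS TEXTFILE;
"
```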
sqoop import \
-Dhadoop.security.credential.provider.path=jceks://hdfs/user/password/hanadb_password.jceks \
--connect "jdbc:sap://10.1.88.88:30088/" \
--driver com.sap.db.jdbc.Driver \
--username root \
--password-alias hanadb_password.alias \
--hive-overwrite \
--table t_agent \
--hive-import \
--hive-table dw.agent \
--num-mappers 1 \
--hive-delims-replacement '<br/>' \
--hive-partition-key 'dt' \
--hive-partition-value "2021-11-24" \
--null-string '\\N' --null-non-string '\\N' \
--fields-terminated-by '\001' \
--delete-target-dir \
--target-dir "/user/root/tmp/sqoop/dw/t_agent" \
--relaxed-isolation \
-- --schema SAPDEV ;
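The `--password-alias` / `-Dhadoop.security.credential.provider.path` pair above assumes the HANA password has already been stored in a JCEKS keystore on HDFS. A one-time setup for that, using the alias and provider path from the import command, might look like:

```shell
# Store the HANA password under the alias referenced by --password-alias;
# the command prompts interactively for the password value.
hadoop credential create hanadb_password.alias \
  -provider jceks://hdfs/user/password/hanadb_password.jceks

# Confirm the alias is present in the keystore.
hadoop credential list \
  -provider jceks://hdfs/user/password/hanadb_password.jceks
```

Keeping the password out of the command line this way also keeps it out of shell history and process listings.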
Export from HDFS back to HANA:
sqoop export \
--batch \
--connect "jdbc:sap://xx.xx.xx.xx:3xx15/" \
--driver com.sap.db.jdbc.Driver \
--username XXXX \
--password XXXX \
--table "SAPDEV.T_AGENT" \
--export-dir "/xxx/xxx" \
--input-fields-terminated-by "\001" \
--input-null-string NULL
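After the export, a quick hypothetical sanity check is to count rows on the HANA side with `sqoop eval` (host, port, and credentials are placeholders, as in the export command above):

```shell
# Run an ad-hoc query against HANA through the same JDBC driver
# to confirm the exported row count looks right.
sqoop eval \
  --connect "jdbc:sap://xx.xx.xx.xx:3xx15/" \
  --driver com.sap.db.jdbc.Driver \
  --username XXXX \
  --password XXXX \
  --query "SELECT COUNT(*) FROM SAPDEV.T_AGENT"
```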
Posted by SAP-Garson
Original post: https://blog.csdn.net/u013039401/article/details/121510144