Getting Started with Sqoop2: Importing Relational Database Data into HDFS (sqoop2-1.99.4)


sqoop2-1.99.4 and sqoop2-1.99.3 differ slightly in how they are operated: the new version uses link where the old version used connection; everything else works much the same.
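For reference, a rough mapping between the two shell styles (the connector id 1 below is purely illustrative, and the 1.99.3 spellings are quoted from memory, so check them against the docs for your installed version):

sqoop2-1.99.3:  show connection    create connection --cid 1
sqoop2-1.99.4:  show link          create link --cid 1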

For setting up the sqoop2-1.99.4 environment, see: Sqoop2 Environment Setup

For the sqoop2-1.99.3 version of this walkthrough, see: Getting Started with Sqoop2: Importing Relational Database Data into HDFS


Start the sqoop2-1.99.4 client, then point it at the server:

$SQOOP2_HOME/bin/sqoop.sh client
set server --host hadoop000 --port 12000 --webapp sqoop
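To confirm the client is talking to the intended server, the following client commands can serve as a quick sanity check (exact output varies by build):

show server --all
show version --all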


View all connectors:

show connector --all
2 connector(s) to show:
Connector with id 1:
  Name: hdfs-connector
  Class: org.apache.sqoop.connector.hdfs.HdfsConnector
  Version: 1.99.4-cdh5.3.0
Connector with id 2:
  Name: generic-jdbc-connector
  Class: org.apache.sqoop.connector.jdbc.GenericJdbcConnector
  Version: 1.99.4-cdh5.3.0


Query all links:

show link

Delete a specified link:

delete link --lid x


Query all jobs:

show job


Delete a specified job:

delete job --jid 1


Create a link of the generic-jdbc-connector type (connector id 2). At the JDBC Connection Properties prompt, each entry# line adds one key=value pair, and pressing Enter on an empty entry# prompt finishes the map:

create link --cid 2
Name: First Link
JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://hadoop000:3306/hive
Username: root
Password: ****
JDBC Connection Properties:
There are currently 0 values in the map:
entry# protocol=tcp
There are currently 1 values in the map:
protocol = tcp
entry#
New link was successfully created with validation status OK and persistent id 3


show link
+----+-------------+-----------+---------+
| Id | Name        | Connector | Enabled |
+----+-------------+-----------+---------+
| 3  | First Link  | 2         | true    |
+----+-------------+-----------+---------+


Create a link of the hdfs-connector type:

create link -cid 1
Name: Second Link
HDFS URI: hdfs://hadoop000:8020
New link was successfully created with validation status OK and persistent id 4


show link
+----+-------------+-----------+---------+
| Id | Name        | Connector | Enabled |
+----+-------------+-----------+---------+
| 3  | First Link  | 2         | true    |
| 4  | Second Link | 1         | true    |
+----+-------------+-----------+---------+


show link -all
2 link(s) to show:
link with id 3 and name First Link (Enabled: true, Created by null at 15-2-2 11:28, Updated by null at 15-2-2 11:28)
Using Connector id 2
  Link configuration
    JDBC Driver Class: com.mysql.jdbc.Driver
    JDBC Connection String: jdbc:mysql://hadoop000:3306/hive
    Username: root
    Password:
    JDBC Connection Properties:
      protocol = tcp
link with id 4 and name Second Link (Enabled: true, Created by null at 15-2-2 11:32, Updated by null at 15-2-2 11:32)
Using Connector id 1
  Link configuration
    HDFS URI: hdfs://hadoop000:8020


Create a job from the two links created above (-f and -t take the from/to link ids, not connector ids). Here the source is the Hive metastore database in MySQL and its TBLS table; leaving Extractors and Loaders blank keeps the defaults:

create job -f 3 -t 4
Creating job for links with from id 3 and to id 4
Please fill following values to create new job object
Name: Sqoopy

From database configuration

Schema name: hive
Table name: TBLS
Table SQL statement:
Table column names:
Partition column name:
Null value allowed for the partition column:
Boundary query:

ToJob configuration

Output format:
  0 : TEXT_FILE
  1 : SEQUENCE_FILE
Choose: 0
Compression format:
  0 : NONE
  1 : DEFAULT
  2 : DEFLATE
  3 : GZIP
  4 : BZIP2
  5 : LZO
  6 : LZ4
  7 : SNAPPY
  8 : CUSTOM
Choose: 0
Custom compression format:
Output directory: hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4

Throttling resources

Extractors:
Loaders:

New job was successfully created with validation status OK and persistent id 2


Query all jobs:

show job
+----+--------+----------------+--------------+---------+
| Id | Name   | From Connector | To Connector | Enabled |
+----+--------+----------------+--------------+---------+
| 2  | Sqoopy | 2              | 1            | true    |
+----+--------+----------------+--------------+---------+
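If a value was entered incorrectly, the job does not have to be deleted and recreated; the client can re-open the same wizard for an existing job. A minimal example, assuming job id 2 from above:

update job --jid 2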


Start the specified job. After it finishes, check the files on HDFS (hdfs dfs -ls hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4/):

start job --jid 2
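Once the job reports success, the result can be inspected from an ordinary shell rather than the sqoop client, for example by listing the output directory chosen above and dumping the generated text files:

hdfs dfs -ls hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4/
hdfs dfs -cat hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4/*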


Check the execution status of a specified job:

status job --jid 2


Stop a specified job:

stop job --jid 2


A common error when starting a job (e.g. start job --jid 2):

Exception has occurred during processing command
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception

Turn on verbose mode in the sqoop client and view the job details:

set option --name verbose --value true
show job --jid 2
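With verbose on, re-running the failing command usually surfaces the underlying cause (for example a wrong JDBC connection string or a missing MySQL driver jar on the server side); the server logs under the Sqoop2 installation's server directory typically carry the full stack trace, though the exact log location depends on how the server was installed. A minimal sequence, assuming job id 2 from above:

set option --name verbose --value true
start job --jid 2
status job --jid 2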


Reposted from: https://www.cnblogs.com/luogankun/p/4267442.html
