

Sqoop 1.4.5 import into Hive: IOException running import job: java.io.IOException: Hive exited with status 1


A Sqoop import into Hive fails with:

hive.HiveImport: Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B

ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 1

The errors above appeared during the import. Full console output:

[jifeng@jifeng02 sqoop]$ bin/sqoop import --connect jdbc:mysql://10.X.X.X:3306/lir --table project --username dss -P --hive-import -- --default-character-set=utf-8
Warning: /home/jifeng/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/jifeng/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: $HADOOP_HOME is deprecated.
14/09/08 01:25:36 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
Enter password:
14/09/08 01:25:40 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
14/09/08 01:25:40 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
14/09/08 01:25:40 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/09/08 01:25:40 INFO tool.CodeGenTool: Beginning code generation
14/09/08 01:25:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:25:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:25:40 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/jifeng/hadoop/hadoop-1.2.1
Note: /tmp/sqoop-jifeng/compile/84b064476bf25fd09fa7171d6baf7a96/project.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/09/08 01:25:41 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-jifeng/compile/84b064476bf25fd09fa7171d6baf7a96/project.jar
14/09/08 01:25:41 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/09/08 01:25:41 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/09/08 01:25:41 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/09/08 01:25:41 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/09/08 01:25:41 INFO mapreduce.ImportJobBase: Beginning import of project
14/09/08 01:25:42 INFO db.DBInputFormat: Using read commited transaction isolation
14/09/08 01:25:42 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `project`
14/09/08 01:25:42 INFO mapred.JobClient: Running job: job_201409072150_0002
14/09/08 01:25:43 INFO mapred.JobClient:  map 0% reduce 0%
14/09/08 01:25:52 INFO mapred.JobClient:  map 66% reduce 0%
14/09/08 01:25:53 INFO mapred.JobClient:  map 100% reduce 0%
14/09/08 01:25:54 INFO mapred.JobClient: Job complete: job_201409072150_0002
14/09/08 01:25:54 INFO mapred.JobClient: Counters: 18
14/09/08 01:25:54 INFO mapred.JobClient:   Job Counters
14/09/08 01:25:54 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=13548
14/09/08 01:25:54 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/09/08 01:25:54 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/09/08 01:25:54 INFO mapred.JobClient:     Launched map tasks=3
14/09/08 01:25:54 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
14/09/08 01:25:54 INFO mapred.JobClient:   File Output Format Counters
14/09/08 01:25:54 INFO mapred.JobClient:     Bytes Written=201
14/09/08 01:25:54 INFO mapred.JobClient:   FileSystemCounters
14/09/08 01:25:54 INFO mapred.JobClient:     HDFS_BYTES_READ=295
14/09/08 01:25:54 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=204759
14/09/08 01:25:54 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=201
14/09/08 01:25:54 INFO mapred.JobClient:   File Input Format Counters
14/09/08 01:25:54 INFO mapred.JobClient:     Bytes Read=0
14/09/08 01:25:54 INFO mapred.JobClient:   Map-Reduce Framework
14/09/08 01:25:54 INFO mapred.JobClient:     Map input records=3
14/09/08 01:25:54 INFO mapred.JobClient:     Physical memory (bytes) snapshot=163741696
14/09/08 01:25:54 INFO mapred.JobClient:     Spilled Records=0
14/09/08 01:25:54 INFO mapred.JobClient:     CPU time spent (ms)=1490
14/09/08 01:25:54 INFO mapred.JobClient:     Total committed heap usage (bytes)=64421888
14/09/08 01:25:54 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1208795136
14/09/08 01:25:54 INFO mapred.JobClient:     Map output records=3
14/09/08 01:25:54 INFO mapred.JobClient:     SPLIT_RAW_BYTES=295
14/09/08 01:25:54 INFO mapreduce.ImportJobBase: Transferred 201 bytes in 12.6733 seconds (15.8601 bytes/sec)
14/09/08 01:25:54 INFO mapreduce.ImportJobBase: Retrieved 3 records.
14/09/08 01:25:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:25:54 WARN hive.TableDefWriter: Column create_at had to be cast to a less precise type in Hive
14/09/08 01:25:54 WARN hive.TableDefWriter: Column update_at had to be cast to a less precise type in Hive
14/09/08 01:25:54 INFO hive.HiveImport: Removing temporary files from import process: hdfs://jifeng01:9000/user/jifeng/project/_logs
14/09/08 01:25:54 INFO hive.HiveImport: Loading uploaded data into Hive
14/09/08 01:25:55 INFO hive.HiveImport:
14/09/08 01:25:55 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/jifeng/hadoop/hive-0.12.0-bin/lib/hive-common-0.12.0.jar!/hive-log4j.properties
14/09/08 01:25:55 INFO hive.HiveImport: Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.metastore.api.StorageDescriptor.setNumBucketsIsSet(StorageDescriptor.java:464)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.metastore.api.StorageDescriptor.setNumBuckets(StorageDescriptor.java:451)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:132)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:105)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.metadata.Hive.newTable(Hive.java:2493)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:904)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:8999)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8313)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:284)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:441)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:342)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:977)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:737)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
14/09/08 01:25:55 INFO hive.HiveImport: 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
14/09/08 01:25:55 INFO hive.HiveImport: 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
14/09/08 01:25:55 INFO hive.HiveImport: 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
14/09/08 01:25:55 INFO hive.HiveImport: 	at java.lang.reflect.Method.invoke(Method.java:606)
14/09/08 01:25:55 INFO hive.HiveImport: 	at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
14/09/08 01:25:55 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 1
	at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:385)
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:335)
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:239)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:511)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

A retry a few minutes later failed even earlier, because the first attempt had already left its output directory on HDFS (more on this below):

[jifeng@jifeng02 sqoop]$ bin/sqoop import --connect jdbc:mysql://10.X.X.X:3306/lir --table project --username dss -P --hive-import -- --default-character-set=utf-8
Warning: /home/jifeng/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/jifeng/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: $HADOOP_HOME is deprecated.
14/09/08 01:28:52 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
Enter password:
14/09/08 01:28:54 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
14/09/08 01:28:54 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
14/09/08 01:28:55 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/09/08 01:28:55 INFO tool.CodeGenTool: Beginning code generation
14/09/08 01:28:55 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:28:55 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:28:55 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/jifeng/hadoop/hadoop-1.2.1
Note: /tmp/sqoop-jifeng/compile/b281ae9014edf3aae02818af8d90c978/project.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/09/08 01:28:56 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-jifeng/compile/b281ae9014edf3aae02818af8d90c978/project.jar
14/09/08 01:28:56 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/09/08 01:28:56 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/09/08 01:28:56 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/09/08 01:28:56 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/09/08 01:28:56 INFO mapreduce.ImportJobBase: Beginning import of project
14/09/08 01:28:56 INFO mapred.JobClient: Cleaning up the staging area hdfs://jifeng01:9000/home/jifeng/hadoop/tmp/mapred/staging/jifeng/.staging/job_201409072150_0003
14/09/08 01:28:56 ERROR security.UserGroupInformation: PriviledgedActionException as:jifeng cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory project already exists
14/09/08 01:28:56 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory project already exists
	at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
	at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
	at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:665)
	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

Solution:

The NoSuchMethodError comes from a libthrift version conflict: HBase ships libthrift-0.8.0.jar while Hive 0.12 requires libthrift-0.9.0.jar, and the Hive CLI that Sqoop launches was evidently resolving the older jar on its classpath.

Copying libthrift-0.9.0.jar into the sqoop/lib directory resolved the problem.
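A minimal sketch of the fix, assuming the install locations shown in the logs above (Hive under /home/jifeng/hadoop/hive-0.12.0-bin, Sqoop under /home/jifeng/sqoop; the $HBASE_HOME reference is illustrative and should be adjusted to your layout):

# List the libthrift jars each component ships, to confirm the version skew
ls /home/jifeng/hadoop/hive-0.12.0-bin/lib/libthrift-*.jar
ls $HBASE_HOME/lib/libthrift-*.jar
# Put the newer jar where the Hive CLI spawned by Sqoop will pick it up
cp /home/jifeng/hadoop/hive-0.12.0-bin/lib/libthrift-0.9.0.jar /home/jifeng/sqoop/lib/

If an older libthrift jar is already present under sqoop/lib, remove or replace it; keeping both versions on the classpath can reproduce the same NoSuchMethodError.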


Re-running the import afterwards hit a second problem: the first run had already written the table data to HDFS before the Hive load step failed, so the output directory was left behind and the new job refused to start:

14/09/08 01:28:56 ERROR security.UserGroupInformation: PriviledgedActionException as:jifeng cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory project already exists
14/09/08 01:28:56 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory project already exists

Deleting the directory resolved the problem:

[jifeng@jifeng01 ~]$ hadoop dfs -rmr /user/jifeng/project
Warning: $HADOOP_HOME is deprecated.
Deleted hdfs://jifeng01:9000/user/jifeng/project
[jifeng@jifeng01 ~]$
Re-running the import then succeeded end to end:

[jifeng@jifeng02 sqoop]$ bin/sqoop import --connect jdbc:mysql://10.X.X.X:3306/lir --table project --username dss -P --hive-import -- --default-character-set=utf-8
Warning: /home/jifeng/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/jifeng/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: $HADOOP_HOME is deprecated.
14/09/08 01:58:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
Enter password:
14/09/08 01:58:07 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
14/09/08 01:58:07 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
14/09/08 01:58:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/09/08 01:58:07 INFO tool.CodeGenTool: Beginning code generation
14/09/08 01:58:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:58:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:58:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/jifeng/hadoop/hadoop-1.2.1
Note: /tmp/sqoop-jifeng/compile/437963d234f778a27f8aa27fec8e18aa/project.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/09/08 01:58:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-jifeng/compile/437963d234f778a27f8aa27fec8e18aa/project.jar
14/09/08 01:58:08 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/09/08 01:58:08 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/09/08 01:58:08 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/09/08 01:58:08 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/09/08 01:58:08 INFO mapreduce.ImportJobBase: Beginning import of project
14/09/08 01:58:08 INFO db.DBInputFormat: Using read commited transaction isolation
14/09/08 01:58:08 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `project`
14/09/08 01:58:09 INFO mapred.JobClient: Running job: job_201409072150_0005
14/09/08 01:58:10 INFO mapred.JobClient:  map 0% reduce 0%
14/09/08 01:58:15 INFO mapred.JobClient:  map 33% reduce 0%
14/09/08 01:58:16 INFO mapred.JobClient:  map 66% reduce 0%
14/09/08 01:58:18 INFO mapred.JobClient:  map 100% reduce 0%
14/09/08 01:58:20 INFO mapred.JobClient: Job complete: job_201409072150_0005
14/09/08 01:58:20 INFO mapred.JobClient: Counters: 18
14/09/08 01:58:20 INFO mapred.JobClient:   Job Counters
14/09/08 01:58:20 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=11968
14/09/08 01:58:20 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/09/08 01:58:20 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/09/08 01:58:20 INFO mapred.JobClient:     Launched map tasks=3
14/09/08 01:58:20 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
14/09/08 01:58:20 INFO mapred.JobClient:   File Output Format Counters
14/09/08 01:58:20 INFO mapred.JobClient:     Bytes Written=201
14/09/08 01:58:20 INFO mapred.JobClient:   FileSystemCounters
14/09/08 01:58:20 INFO mapred.JobClient:     HDFS_BYTES_READ=295
14/09/08 01:58:20 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=206338
14/09/08 01:58:20 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=201
14/09/08 01:58:20 INFO mapred.JobClient:   File Input Format Counters
14/09/08 01:58:20 INFO mapred.JobClient:     Bytes Read=0
14/09/08 01:58:20 INFO mapred.JobClient:   Map-Reduce Framework
14/09/08 01:58:20 INFO mapred.JobClient:     Map input records=3
14/09/08 01:58:20 INFO mapred.JobClient:     Physical memory (bytes) snapshot=163192832
14/09/08 01:58:20 INFO mapred.JobClient:     Spilled Records=0
14/09/08 01:58:20 INFO mapred.JobClient:     CPU time spent (ms)=1480
14/09/08 01:58:20 INFO mapred.JobClient:     Total committed heap usage (bytes)=64421888
14/09/08 01:58:20 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1208586240
14/09/08 01:58:20 INFO mapred.JobClient:     Map output records=3
14/09/08 01:58:20 INFO mapred.JobClient:     SPLIT_RAW_BYTES=295
14/09/08 01:58:20 INFO mapreduce.ImportJobBase: Transferred 201 bytes in 11.6303 seconds (17.2825 bytes/sec)
14/09/08 01:58:20 INFO mapreduce.ImportJobBase: Retrieved 3 records.
14/09/08 01:58:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `project` AS t LIMIT 1
14/09/08 01:58:20 WARN hive.TableDefWriter: Column create_at had to be cast to a less precise type in Hive
14/09/08 01:58:20 WARN hive.TableDefWriter: Column update_at had to be cast to a less precise type in Hive
14/09/08 01:58:20 INFO hive.HiveImport: Removing temporary files from import process: hdfs://jifeng01:9000/user/jifeng/project/_logs
14/09/08 01:58:20 INFO hive.HiveImport: Loading uploaded data into Hive
14/09/08 01:58:21 INFO hive.HiveImport:
14/09/08 01:58:21 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/jifeng/hadoop/hive-0.12.0-bin/lib/hive-common-0.12.0.jar!/hive-log4j.properties
14/09/08 01:58:27 INFO hive.HiveImport: OK
14/09/08 01:58:27 INFO hive.HiveImport: Time taken: 6.069 seconds
14/09/08 01:58:27 INFO hive.HiveImport: Loading data to table default.project
14/09/08 01:58:27 INFO hive.HiveImport: Table default.project stats: [num_partitions: 0, num_files: 4, num_rows: 0, total_size: 201, raw_data_size: 0]
14/09/08 01:58:27 INFO hive.HiveImport: OK
14/09/08 01:58:27 INFO hive.HiveImport: Time taken: 0.345 seconds
14/09/08 01:58:27 INFO hive.HiveImport: Hive import complete.
14/09/08 01:58:27 INFO hive.HiveImport: Export directory is empty, removing it.
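The import now completes: the data lands in HDFS and the Hive table default.project is created and loaded. As an alternative to deleting the target directory by hand before each retry, later Sqoop 1.4.x releases accept a --delete-target-dir flag that removes the HDFS target directory before the import starts; whether it is available depends on your exact Sqoop build, so check bin/sqoop import --help before relying on it:

bin/sqoop import --connect jdbc:mysql://10.X.X.X:3306/lir --table project --username dss -P --delete-target-dir --hive-import -- --default-character-set=utf-8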


