Sqoop 1: MySQL to HDFS
0. Preparation: Hadoop cluster nodes
10.156.50.35  yanfabu2-35.base.app.dev.yf  zk1  hadoop1  master1  master
10.156.50.36  yanfabu2-36.base.app.dev.yf  zk2  hadoop2  master2
10.156.50.37  yanfabu2-37.base.app.dev.yf  zk3  hadoop3  slaver1
1. Unpack Sqoop
scp sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz zkkafka@10.156.50.37:/home/zkkafka/
scp sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz zkkafka@10.156.50.36:/home/zkkafka/
tar xf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
rm -rf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
mv sqoop-1.4.7.bin__hadoop-2.6.0 sqoop
2. Set environment variables
vim ~/.bash_profile
export SQOOP_HOME=/home/zkkafka/sqoop
export PATH=$PATH:$SQOOP_HOME/bin

source ~/.bash_profile
scp ~/.bash_profile zkkafka@10.156.50.36:/home/zkkafka/
scp ~/.bash_profile zkkafka@10.156.50.37:/home/zkkafka/
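To confirm the variables took effect on each node, a quick sanity check can be run (a minimal sketch; the expected paths follow the installation above):

echo $SQOOP_HOME     # should print /home/zkkafka/sqoop
which sqoop          # should resolve to /home/zkkafka/sqoop/bin/sqoop
sqoop version        # should report Sqoop 1.4.7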
3. Edit sqoop-env.sh
cp sqoop-env-template.sh sqoop-env.sh
vim sqoop-env.sh
export HADOOP_COMMON_HOME=/home/zkkafka/hadoop
export HADOOP_MAPRED_HOME=/home/zkkafka/hadoop
export HIVE_HOME=/home/zkkafka/hive
export HBASE_HOME=/home/zkkafka/hbase

scp ../conf/* zkkafka@10.156.50.36:/home/zkkafka/sqoop/conf/
scp ../conf/* zkkafka@10.156.50.37:/home/zkkafka/sqoop/conf/
4. Edit sqoop-site.xml
<property>
  <name>sqoop.metastore.client.record.password</name>
  <value>true</value>
  <description>If true, allow saved passwords in the metastore.</description>
</property>
5. Copy the MySQL JDBC driver into Sqoop's lib directory
pwd
/home/zkkafka/sqoop/lib
cp /home/zkkafka/hive/lib/mysql-connector-java-5.1.47.jar /home/zkkafka/sqoop/lib/

Fix for "Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly":
cp /home/zkkafka/hive/lib/hive-exec-**.jar /home/zkkafka/sqoop/lib/

Fix for "Please set $ACCUMULO_HOME to the root of your Accumulo installation.":
cp /home/zkkafka/hadoop/share/hadoop/tools/lib/aws-java-sdk**.jar /home/zkkafka/sqoop/lib/

Fix for "ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException":
vim sqoop-env.sh
export HCAT_HOME=/home/zkkafka/hive/tmp/HiveJobsLog

Download java-json.jar (http://www.java2s.com/Code/Jar/j/Downloadjavajsonjar.htm) and java-json-schema.jar (http://www.java2s.com/Code/Jar/j/Downloadjavajsonschemajar.htm), then distribute them:
scp java-json.jar zkkafka@10.156.50.37:/home/zkkafka/sqoop/lib/
scp java-json-schema.jar zkkafka@10.156.50.36:/home/zkkafka/sqoop/lib/
scp json-20160212.jar zkkafka@10.156.50.37:/home/zkkafka/sqoop/lib/
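Before moving on, it is worth verifying that the copied jars actually landed in Sqoop's lib directory on every node (a sketch matching the file names used above):

ls /home/zkkafka/sqoop/lib | grep -E 'mysql-connector|hive-exec|aws-java-sdk|java-json|json-'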
6. Using Sqoop
6.1 Verify the Sqoop installation
sqoop help
[zkkafka@yanfabu2-35 sqoop]$ sqoop help
Warning: /home/zkkafka/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/zkkafka/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/05/29 11:14:16 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

See 'sqoop help COMMAND' for information on a specific command.
6.2 Using sqoop list-tables
sqoop list-tables \
--username hive \
--password hive \
--connect jdbc:mysql://10.156.50.36:3306/metastore?characterEncoding=UTF-8
[zkkafka@yanfabu2-35 sqoop]$ sqoop list-tables --username hive --password hive --connect jdbc:mysql://10.156.50.36:3306/metastore?characterEncoding=UTF-8
Warning: /home/zkkafka/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/zkkafka/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/05/29 11:17:53 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/05/29 11:17:53 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/05/29 11:17:53 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
Wed May 29 11:17:54 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
AUX_TABLE
BUCKETING_COLS
CDS
COLUMNS_V2
COMPACTION_QUEUE
COMPLETED_COMPACTIONS
COMPLETED_TXN_COMPONENTS
DATABASE_PARAMS
DBS
DB_PRIVS
DELEGATION_TOKENS
FUNCS
FUNC_RU
GLOBAL_PRIVS
HIVE_LOCKS
IDXS
INDEX_PARAMS
KEY_CONSTRAINTS
MASTER_KEYS
NEXT_COMPACTION_QUEUE_ID
NEXT_LOCK_ID
NEXT_TXN_ID
NOTIFICATION_LOG
NOTIFICATION_SEQUENCE
NUCLEUS_TABLES
PARTITIONS
PARTITION_EVENTS
PARTITION_KEYS
PARTITION_KEY_VALS
PARTITION_PARAMS
PART_COL_PRIVS
PART_COL_STATS
PART_PRIVS
ROLES
ROLE_MAP
SDS
SD_PARAMS
SEQUENCE_TABLE
SERDES
SERDE_PARAMS
SKEWED_COL_NAMES
SKEWED_COL_VALUE_LOC_MAP
SKEWED_STRING_LIST
SKEWED_STRING_LIST_VALUES
SKEWED_VALUES
SORT_COLS
TABLE_PARAMS
TAB_COL_STATS
TBLS
TBL_COL_PRIVS
TBL_PRIVS
TXNS
TXN_COMPONENTS
TYPES
TYPE_FIELDS
VERSION
WRITE_SET
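Besides list-tables, sqoop eval can run an arbitrary SQL statement against the source database, which is a quick way to verify connectivity and credentials before a real import (a sketch; TBLS is one of the metastore tables listed above):

sqoop eval \
--connect jdbc:mysql://10.156.50.36:3306/metastore?characterEncoding=UTF-8 \
--username hive \
--password hive \
--query "SELECT COUNT(*) FROM TBLS"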
7. Prepare the MySQL source data
/*
Navicat MySQL Data Transfer

Source Server         : 10.156.50.36 hive
Source Server Version : 50717
Source Host           : 10.156.50.36:3306
Source Database       : mqh

Target Server Type    : MYSQL
Target Server Version : 50717
File Encoding         : 65001

Date: 2019-05-29 11:48:53
*/

SET FOREIGN_KEY_CHECKS=0;

-- ----------------------------
-- Table structure for `app`
-- ----------------------------
DROP TABLE IF EXISTS `app`;
CREATE TABLE `app` (
  `appid` int(11) DEFAULT NULL,
  `appname` varchar(255) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

-- ----------------------------
-- Records of app
-- ----------------------------
INSERT INTO `app` VALUES ('1', 'kaxinxiaoxiaole');
INSERT INTO `app` VALUES ('2', 'wangzherognyao');
INSERT INTO `app` VALUES ('3', 'cijizhanchang');

-- ----------------------------
-- Table structure for `device`
-- ----------------------------
DROP TABLE IF EXISTS `device`;
CREATE TABLE `device` (
  `deviceid` int(11) DEFAULT NULL,
  `appid` int(11) DEFAULT NULL,
  `register_time` datetime DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

-- ----------------------------
-- Records of device
-- ----------------------------
INSERT INTO `device` VALUES ('1', '1', '2019-05-29 11:25:54');
INSERT INTO `device` VALUES ('2', '1', '2019-05-29 11:26:04');
INSERT INTO `device` VALUES ('3', '1', '2019-05-30 11:26:14');
INSERT INTO `device` VALUES ('4', '1', '2019-05-29 11:26:30');
INSERT INTO `device` VALUES ('5', '2', '2019-05-29 11:26:38');
INSERT INTO `device` VALUES ('6', '3', '2019-05-29 11:26:45');
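The imports below write into a Hive database named test. If it does not exist yet, create it first (a one-line sketch, assuming the hive CLI is on the PATH):

hive -e "CREATE DATABASE IF NOT EXISTS test;"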
8. One-time import from MySQL into Hive
sqoop import \
--connect jdbc:mysql://10.156.50.36:3306/mqh \
--username root \
--password root \
--table device \
--fields-terminated-by '\t' \
--delete-target-dir \
--num-mappers 1 \
--hive-import \
--hive-database test \
--hive-table device;
[zkkafka@yanfabu2-35 ~]$ sqoop import \
> --connect jdbc:mysql://10.156.50.36:3306/mqh \
> --username root \
> --password root \
> --table device \
> --fields-terminated-by '\t' \
> --delete-target-dir \
> --num-mappers 1 \
> --hive-import \
> --hive-database test \
> --hive-table device;
Warning: /home/zkkafka/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/zkkafka/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/05/29 14:04:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/05/29 14:04:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/05/29 14:04:03 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/05/29 14:04:03 INFO tool.CodeGenTool: Beginning code generation
Wed May 29 14:04:03 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/05/29 14:04:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/05/29 14:04:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/05/29 14:04:03 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/zkkafka/hadoop
Note: /tmp/sqoop-zkkafka/compile/35bdce3f9edb048d842d71d7b7f25f85/device.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/05/29 14:04:06 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zkkafka/compile/35bdce3f9edb048d842d71d7b7f25f85/device.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/05/29 14:04:07 INFO tool.ImportTool: Destination directory device is not present, hence not deleting.
19/05/29 14:04:07 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/05/29 14:04:07 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/05/29 14:04:07 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/05/29 14:04:07 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/05/29 14:04:07 INFO mapreduce.ImportJobBase: Beginning import of device
19/05/29 14:04:07 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
19/05/29 14:04:08 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
Wed May 29 14:04:17 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/05/29 14:04:17 INFO db.DBInputFormat: Using read commited transaction isolation
19/05/29 14:04:17 INFO mapreduce.JobSubmitter: number of splits:1
19/05/29 14:04:18 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1558676658010_0012
19/05/29 14:04:18 INFO impl.YarnClientImpl: Submitted application application_1558676658010_0012
19/05/29 14:04:18 INFO mapreduce.Job: The url to track the job: http://master1:8088/proxy/application_1558676658010_0012/
19/05/29 14:04:18 INFO mapreduce.Job: Running job: job_1558676658010_0012
19/05/29 14:04:28 INFO mapreduce.Job: Job job_1558676658010_0012 running in uber mode : false
19/05/29 14:04:28 INFO mapreduce.Job:  map 0% reduce 0%
19/05/29 14:04:34 INFO mapreduce.Job:  map 100% reduce 0%
19/05/29 14:04:35 INFO mapreduce.Job: Job job_1558676658010_0012 completed successfully
19/05/29 14:04:35 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=129039
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=156
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=3989
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=3989
        Total vcore-milliseconds taken by all map tasks=3989
        Total megabyte-milliseconds taken by all map tasks=4084736
    Map-Reduce Framework
        Map input records=6
        Map output records=6
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=76
        CPU time spent (ms)=1480
        Physical memory (bytes) snapshot=170278912
        Virtual memory (bytes) snapshot=2107375616
        Total committed heap usage (bytes)=92274688
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=156
19/05/29 14:04:36 INFO mapreduce.ImportJobBase: Transferred 156 bytes in 27.9729 seconds (5.5768 bytes/sec)
19/05/29 14:04:36 INFO mapreduce.ImportJobBase: Retrieved 6 records.
19/05/29 14:04:36 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table device
Wed May 29 14:04:36 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/05/29 14:04:36 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/05/29 14:04:36 WARN hive.TableDefWriter: Column register_time had to be cast to a less precise type in Hive
19/05/29 14:04:36 INFO hive.HiveImport: Loading uploaded data into Hive
19/05/29 14:04:36 INFO conf.HiveConf: Found configuration file file:/home/zkkafka/hive/conf/hive-site.xml
19/05/29 14:04:36 WARN conf.HiveConf: HiveConf of name hive.files.umask.value does not exist
19/05/29 14:04:36 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
19/05/29 14:04:49 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
19/05/29 14:04:49 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/home/zkkafka/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/05/29 14:04:49 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/05/29 14:04:49 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/05/29 14:04:49 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
19/05/29 14:04:49 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
19/05/29 14:04:54 INFO hive.HiveImport:
19/05/29 14:04:54 INFO hive.HiveImport: Logging initialized using configuration in file:/home/zkkafka/hive/conf/hive-log4j2.properties Async: true
19/05/29 14:05:01 INFO hive.HiveImport: OK
19/05/29 14:05:01 INFO hive.HiveImport: Time taken: 4.585 seconds
19/05/29 14:05:02 INFO hive.HiveImport: Loading data to table test.device
19/05/29 14:05:06 INFO hive.HiveImport: OK
19/05/29 14:05:06 INFO hive.HiveImport: Time taken: 4.722 seconds
19/05/29 14:05:07 INFO hive.HiveImport: Hive import complete.
19/05/29 14:05:07 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.
hive> select * from device;
OK
1	1	2019-05-29 11:25:54
2	1	2019-05-29 11:26:04
3	1	2019-05-30 11:26:14
4	1	2019-05-29 11:26:30
5	2	2019-05-29 11:26:38
6	3	2019-05-29 11:26:45
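The imported rows can also be inspected directly on HDFS; with --hive-import the data is moved under the Hive warehouse directory (a sketch, assuming the default warehouse location /user/hive/warehouse; with one mapper the output file is part-m-00000):

hdfs dfs -ls /user/hive/warehouse/test.db/device
hdfs dfs -cat /user/hive/warehouse/test.db/device/part-m-00000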
9. Incremental import with Sqoop
sqoop import \
--connect jdbc:mysql://10.156.50.36:3306/mqh \
--username root \
--password root \
--table device \
--target-dir=/hdfs/sqoop/device \
--fields-terminated-by '\t' \
--num-mappers 1 \
--hive-import \
--hive-database test \
--hive-table device \
--incremental append \
--check-column deviceid \
--last-value 10 \
--m 1
[zkkafka@yanfabu2-35 ~]$ sqoop import \
> --connect jdbc:mysql://10.156.50.36:3306/mqh \
> --username root \
> --password root \
> --table device \
> --target-dir=/hdfs/sqoop/device \
> --fields-terminated-by '\t' \
> --num-mappers 1 \
> --hive-import \
> --hive-database test \
> --hive-table device \
> --incremental append \
> --check-column deviceid \
> --last-value 10 \
> --m 1
19/06/04 16:56:40 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/06/04 16:56:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/06/04 16:56:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/06/04 16:56:41 INFO tool.CodeGenTool: Beginning code generation
Tue Jun 04 16:56:41 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/06/04 16:56:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/06/04 16:56:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/06/04 16:56:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/zkkafka/hadoop
Note: /tmp/sqoop-zkkafka/compile/66854d2f9121b88905b0b82b5730b38b/device.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/06/04 16:56:44 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zkkafka/compile/66854d2f9121b88905b0b82b5730b38b/device.jar
19/06/04 16:56:44 INFO tool.ImportTool: Maximal id query for free form incremental import: SELECT MAX(`deviceid`) FROM `device`
19/06/04 16:56:44 INFO tool.ImportTool: Incremental import based on column `deviceid`
19/06/04 16:56:44 INFO tool.ImportTool: Lower bound value: 10
19/06/04 16:56:44 INFO tool.ImportTool: Upper bound value: 11
19/06/04 16:56:44 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/06/04 16:56:44 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/06/04 16:56:44 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/06/04 16:56:44 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/06/04 16:56:44 INFO mapreduce.ImportJobBase: Beginning import of device
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/06/04 16:56:45 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
19/06/04 16:56:46 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
Tue Jun 04 16:56:56 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/06/04 16:56:56 INFO db.DBInputFormat: Using read commited transaction isolation
19/06/04 16:56:56 INFO mapreduce.JobSubmitter: number of splits:1
19/06/04 16:56:57 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1559370613628_0007
19/06/04 16:56:58 INFO impl.YarnClientImpl: Submitted application application_1559370613628_0007
19/06/04 16:56:58 INFO mapreduce.Job: The url to track the job: http://master1:8088/proxy/application_1559370613628_0007/
19/06/04 16:56:58 INFO mapreduce.Job: Running job: job_1559370613628_0007
19/06/04 16:57:07 INFO mapreduce.Job: Job job_1559370613628_0007 running in uber mode : false
19/06/04 16:57:07 INFO mapreduce.Job:  map 0% reduce 0%
19/06/04 16:57:15 INFO mapreduce.Job:  map 100% reduce 0%
19/06/04 16:57:16 INFO mapreduce.Job: Job job_1559370613628_0007 completed successfully
19/06/04 16:57:16 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=131266
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=27
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=4312
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=4312
        Total vcore-milliseconds taken by all map tasks=4312
        Total megabyte-milliseconds taken by all map tasks=4415488
    Map-Reduce Framework
        Map input records=1
        Map output records=1
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=76
        CPU time spent (ms)=1240
        Physical memory (bytes) snapshot=170573824
        Virtual memory (bytes) snapshot=2106789888
        Total committed heap usage (bytes)=92798976
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=27
19/06/04 16:57:16 INFO mapreduce.ImportJobBase: Transferred 27 bytes in 30.087 seconds (0.8974 bytes/sec)
19/06/04 16:57:16 INFO mapreduce.ImportJobBase: Retrieved 1 records.
19/06/04 16:57:16 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table device
19/06/04 16:57:16 INFO util.AppendUtils: Creating missing output directory - device
Tue Jun 04 16:57:16 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/06/04 16:57:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `device` AS t LIMIT 1
19/06/04 16:57:16 WARN hive.TableDefWriter: Column register_time had to be cast to a less precise type in Hive
19/06/04 16:57:16 INFO hive.HiveImport: Loading uploaded data into Hive
19/06/04 16:57:16 INFO conf.HiveConf: Found configuration file file:/home/zkkafka/hive/conf/hive-site.xml
19/06/04 16:57:17 WARN conf.HiveConf: HiveConf of name hive.files.umask.value does not exist
19/06/04 16:57:17 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
19/06/04 16:57:34 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
19/06/04 16:57:34 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/home/zkkafka/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/06/04 16:57:34 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/06/04 16:57:34 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/06/04 16:57:34 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
19/06/04 16:57:34 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
19/06/04 16:57:37 INFO hive.HiveImport:
19/06/04 16:57:37 INFO hive.HiveImport: Logging initialized using configuration in file:/home/zkkafka/hive/conf/hive-log4j2.properties Async: true
19/06/04 16:57:47 INFO hive.HiveImport: OK
19/06/04 16:57:47 INFO hive.HiveImport: Time taken: 5.746 seconds
19/06/04 16:57:48 INFO hive.HiveImport: Loading data to table test.device
19/06/04 16:57:52 INFO hive.HiveImport: OK
19/06/04 16:57:52 INFO hive.HiveImport: Time taken: 5.208 seconds
19/06/04 16:57:53 INFO hive.HiveImport: Hive import complete.
19/06/04 16:57:53 INFO hive.HiveImport: Export directory is empty, removing it.
19/06/04 16:57:53 INFO tool.ImportTool: Incremental import complete! To run another incremental import of all data following this import, supply the following arguments:
19/06/04 16:57:53 INFO tool.ImportTool:  --incremental append
19/06/04 16:57:53 INFO tool.ImportTool:   --check-column deviceid
19/06/04 16:57:53 INFO tool.ImportTool:   --last-value 11
19/06/04 16:57:53 INFO tool.ImportTool: (Consider saving this with 'sqoop job --create')
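As the last log lines suggest, the incremental parameters are best persisted as a saved Sqoop job, so that --last-value is tracked in the Sqoop metastore between runs instead of being passed by hand. A sketch of how the sqoop_job_device job used in the next section could be created (the initial --last-value 0 is an assumption; note the mandatory space after the standalone --):

sqoop job --create sqoop_job_device \
-- import \
--connect jdbc:mysql://10.156.50.36:3306/mqh \
--username root \
--password root \
--table device \
--target-dir=/hdfs/sqoop/device \
--fields-terminated-by '\t' \
--hive-import \
--hive-database test \
--hive-table device \
--incremental append \
--check-column deviceid \
--last-value 0 \
--m 1

With sqoop.metastore.client.record.password set to true in step 4, the password is stored with the job, so sqoop job --exec does not prompt for it on each run.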
10. Other Sqoop job commands
--create <job-id>   Create a new saved job
--delete <job-id>   Delete a saved job
--exec <job-id>     Execute a saved job
--show <job-id>     Show the parameters of a saved job
--list              List all saved jobs

sqoop job --list
sqoop job --delete sqoop_job_device
sqoop job --exec sqoop_job_device
sqoop job --show sqoop_job_device
[zkkafka@yanfabu2-35 conf]$ sqoop job --list
19/05/29 15:15:58 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Available jobs:
  sqoop_job_device

[zkkafka@yanfabu2-35 conf]$ sqoop job --show sqoop_job_device
19/05/29 15:16:10 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Job: sqoop_job_device
Tool: import
Options:
----------------------------
verbose = false
hcatalog.drop.and.create.table = false
codegen.output.delimiters.escape = 0
codegen.output.delimiters.enclose.required = false
codegen.input.delimiters.field = 0
split.limit = null
hbase.create.table = false
mainframe.input.dataset.type = p
db.require.password = false
skip.dist.cache = false
hdfs.append.dir = false
codegen.input.delimiters.escape = 0
import.fetch.size = null
accumulo.create.table = false
codegen.input.delimiters.enclose.required = false
reset.onemapper = false
codegen.output.delimiters.record = 10
import.max.inline.lob.size = 16777216
sqoop.throwOnError = false
hbase.bulk.load.enabled = false
hcatalog.create.table = false
db.clear.staging.table = false
codegen.input.delimiters.record = 0
enable.compression = false
hive.overwrite.table = false
hive.import = false
codegen.input.delimiters.enclose = 0
accumulo.batch.size = 10240000
hive.drop.delims = false
customtool.options.jsonmap = {}
codegen.output.delimiters.enclose = 0
hdfs.delete-target.dir = false
codegen.output.dir = .
codegen.auto.compile.dir = true
relaxed.isolation = false
mapreduce.num.mappers = 4
accumulo.max.latency = 5000
import.direct.split.size = 0
sqlconnection.metadata.transaction.isolation.level = 2
codegen.output.delimiters.field = 44
export.new.update = UpdateOnly
incremental.mode = None
hdfs.file.format = TextFile
sqoop.oracle.escaping.disabled = true
codegen.compile.dir = /tmp/sqoop-zkkafka/compile/424a3807a29b831b82c7cd119bb44e9f
direct.import = false
temporary.dirRoot = _sqoop
hive.fail.table.exists = false
db.batch = false
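To run the incremental import on a schedule, the saved job can simply be executed from cron (a sketch; the schedule and log path are assumptions):

# crontab entry: run the incremental device import every night at 01:30
30 1 * * * /home/zkkafka/sqoop/bin/sqoop job --exec sqoop_job_device >> /home/zkkafka/sqoop_device.log 2>&1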