Assignment: Using Sqoop (Big Data)
程序员文章站
2024-02-29 10:46:52
1 Download Sqoop
cd ~/software
wget http://mirrors.hust.edu.cn/apache/sqoop/1.4.6/sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
2 Extract Sqoop
tar -zxf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C ~/app/
3 Configure Sqoop
(1) Add the Sqoop directory to ~/.bash_profile
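Step (1) typically comes down to two export lines. A minimal sketch, assuming Sqoop was extracted under ~/app as in step 2:

```shell
# Append to ~/.bash_profile, then run: source ~/.bash_profile
# (the path below assumes the extraction layout from step 2)
export SQOOP_HOME=$HOME/app/sqoop-1.4.6.bin__hadoop-2.0.4-alpha
export PATH=$SQOOP_HOME/bin:$PATH
```

After sourcing the profile, `echo $SQOOP_HOME` should print the Sqoop directory and `sqoop` should resolve on the PATH.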
(2) Edit the configuration file sqoop-env.sh; before editing, rename the template files:
cd $SQOOP_HOME/conf/
$ mv sqoop-env-template.sh sqoop-env.sh
$ mv sqoop-site-template.xml sqoop-site.xml
Edit sqoop-env.sh and set the following (these lines assume HADOOP_HOME, HIVE_HOME, and ZK_HOME are already exported in your shell environment, e.g. via ~/.bash_profile; otherwise replace them with absolute paths):
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HIVE_HOME=$HIVE_HOME
export ZOOKEEPER_HOME=$ZK_HOME
export ZOOCFGDIR=$ZK_HOME/conf
(3) Copy the MySQL JDBC driver into Sqoop's lib directory
cp -a mysql-connector-java-5.1.27-bin.jar $SQOOP_HOME/lib
(4) Verify Sqoop
bin/sqoop help
Available commands:
codegen Generate code to interact with database records
create-hive-table Import a table definition into Hive
eval Evaluate a SQL statement and display the results
export Export an HDFS directory to a database table
help List available commands
import Import a table from a database to HDFS
import-all-tables Import tables from a database to HDFS
import-mainframe Import datasets from a mainframe server to HDFS
job Work with saved jobs
list-databases List available databases on a server
list-tables List available tables in a database
merge Merge results of incremental imports
metastore Run a standalone Sqoop metastore
version Display version information
See 'sqoop help COMMAND' for information on a specific command.
(5) Test that Sqoop can connect to the MySQL database
bin/sqoop list-databases --connect jdbc:mysql://192.168.217.129:3306/ --username root --password root
Pitfall:
ERROR manager.CatalogQueryManager: Failed to list databases
java.sql.SQLException: Access denied for user 'root'@'hadoop001' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:957)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3878)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3814)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:871)
at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4323)
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1267)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2255)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2286)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2085)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:795)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:327)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:59)
at org.apache.sqoop.manager.CatalogQueryManager.listDatabases(CatalogQueryManager.java:57)
at org.apache.sqoop.tool.ListDatabasesTool.run(ListDatabasesTool.java:49)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
19/01/30 15:03:12 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'hadoop001' (using password: YES)
java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'hadoop001' (using password: YES)
at org.apache.sqoop.manager.CatalogQueryManager.listDatabases(CatalogQueryManager.java:73)
... (remaining frames identical to the trace above)
**Caused by: java.sql.SQLException: Access denied for user 'root'@'hadoop001' (using password: YES)**
... 13 more
How to fix it:
Log in to MySQL and grant the user access from this host (the hostname must match the one in the error message, here hadoop001):
GRANT ALL PRIVILEGES ON *.* TO 'root'@'hadoop001' IDENTIFIED BY 'root' WITH GRANT OPTION;
flush privileges;
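To confirm the grant took effect before re-running Sqoop, you can list which hosts the root account may connect from (a quick check, assuming the mysql client is on the PATH and the password is root as above):

```shell
# Show the host entries registered for user 'root';
# 'hadoop001' should now appear among them.
mysql -uroot -proot -e "SELECT user, host FROM mysql.user WHERE user='root';"
```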
Try again:
bin/sqoop list-databases --connect jdbc:mysql://192.168.217.129:3306/ --username root --password root
The result:
Warning: /home/hadoop/app/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/app/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/app/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/app/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
19/01/30 16:05:58 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
19/01/30 16:05:58 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/01/30 16:05:58 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
company
hive
imooc_project
metastore
mysql
sell
spark
sparksql
test
4 Create a database and table in MySQL
$ mysql -uroot -p*******
mysql> create database company;
mysql> create table company.staff(id int(4) primary key not null auto_increment, name varchar(255), sex varchar(255));
mysql> insert into company.staff(name, sex) values('Thomas', 'Male');
mysql> insert into company.staff(name, sex) values('Catalina', 'FeMale');
5 Import data
bin/sqoop import --connect jdbc:mysql://192.168.217.129:3306/company --username root --password root --table staff --target-dir /user/company --delete-target-dir --num-mappers 1 --fields-terminated-by "\t"
Pitfall:
19/01/30 16:47:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
19/01/30 16:47:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/01/30 16:47:03 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/01/30 16:47:03 INFO tool.CodeGenTool: Beginning code generation
19/01/30 16:47:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff` AS t LIMIT 1
19/01/30 16:47:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff` AS t LIMIT 1
19/01/30 16:47:03 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/app/hadoop-2.6.0-cdh5.7.0
Note: /tmp/sqoop-hadoop/compile/b668c58d720cfc82a95388770be7c43b/staff.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/01/30 16:47:04 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/b668c58d720cfc82a95388770be7c43b/staff.jar
19/01/30 16:47:06 INFO tool.ImportTool: Destination directory /user/company is not present, hence not deleting.
19/01/30 16:47:06 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/01/30 16:47:06 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/01/30 16:47:06 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/01/30 16:47:06 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/01/30 16:47:06 INFO mapreduce.ImportJobBase: Beginning import of staff
19/01/30 16:47:06 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
19/01/30 16:47:06 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
19/01/30 16:47:06 INFO client.RMProxy: Connecting to ResourceManager at hadoop001/192.168.217.129:8032
19/01/30 16:47:07 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:47:08 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
... (the same "Retrying connect to server: hadoop001/192.168.217.129:8032" message repeats, cycling through attempts 0-9, until the client eventually gives up)
19/01/30 16:53:56 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:27 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:28 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:29 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:30 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:31 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:32 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:33 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:34 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:35 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
19/01/30 16:54:36 INFO ipc.Client: Retrying connect to server: hadoop001/192.168.217.129:8032. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
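Port 8032 is the default YARN ResourceManager address, so this endless retry loop means the ResourceManager is not running (or not reachable at the configured address). Assuming a standard Hadoop 2.x layout with `$HADOOP_HOME` set, a typical way to start YARN and confirm the ResourceManager is up is:

```shell
# Start the YARN daemons (ResourceManager + NodeManagers)
$HADOOP_HOME/sbin/start-yarn.sh

# Confirm the ResourceManager process is running
jps | grep ResourceManager

# Confirm something is actually listening on the RM port (8032 by default)
netstat -tlnp | grep 8032
```

If the ResourceManager is running but on a different host or port, check `yarn.resourcemanager.address` in `yarn-site.xml` and make sure it matches the address the Sqoop client is trying to reach (here `hadoop001:8032`).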