Ugh! Sqoop2's documentation is far too thin, and it doesn't even support HBase; it is really bare-bones. So I gave up on Sqoop2 in frustration and switched to Sqoop1. Friends who followed my earlier tutorial, please don't throw bricks at me; I was an uninformed bystander too.

Uninstall Sqoop2

This step is optional: if you installed Sqoop2 by following my earlier tutorial, you have to uninstall it first; if you never installed it, skip this step.
$ sudo su -
$ service sqoop2-server stop
$ yum -y remove sqoop2-server
$ yum -y remove sqoop2-client
Install Sqoop1

yum install -y sqoop
# sqoop help
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/11/28 11:33:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

See 'sqoop help COMMAND' for information on a specific command.
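As the last line of that output says, every tool has its own help text; for example, to see the import-specific options:

# sqoop help import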
Download the MySQL JDBC driver (Connector/J). After downloading, unpack it, find the driver jar inside, upload it to the server, and then move it into Sqoop's lib directory:
mv /home/alex/mysql-connector-java-5.1.34-bin.jar /usr/lib/sqoop/lib
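A quick check that the driver ended up where Sqoop looks for it (the jar name is just the version I downloaded; yours may differ):

# ls /usr/lib/sqoop/lib/ | grep mysql-connector
mysql-connector-java-5.1.34-bin.jar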
Create a test table in MySQL (inside the sqoop_test database used below) and insert a few rows:

CREATE TABLE `employee` (
  `id` int(11) NOT NULL,
  `name` varchar(20) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

insert into employee (id,name) values (1,'michael');
insert into employee (id,name) values (2,'ted');
insert into employee (id,name) values (3,'jack');
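Before involving Hadoop at all, you can check that Sqoop reaches MySQL with these credentials by using the eval tool from the help listing above. A sketch; it simply runs the statement and prints the result:

sqoop eval --connect jdbc:mysql://localhost:3306/sqoop_test --username root -P \
    --query "SELECT * FROM employee"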
# sqoop list-databases --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 09:20:28 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 09:20:28 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 09:20:28 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
cacti
metastore
mysql
sqoop_test
wordpress
zabbix
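Note the WARN line about the password: as it suggests, a safer variant is -P, which prompts for the password instead of leaving it in the shell history. A sketch with the same connection settings:

# sqoop list-databases --connect jdbc:mysql://localhost:3306/sqoop_test --username root -P
Enter password: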
# sqoop list-tables --connect jdbc:mysql://localhost/sqoop_test --username root --password root
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/11/28 11:46:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/11/28 11:46:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/11/28 11:46:11 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
employee
student
workers
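If you wanted all three of those tables in one go, the import-all-tables tool from the help listing above does that. A sketch (the warehouse directory is just an example path; each table lands in its own subdirectory under it):

sqoop import-all-tables --connect jdbc:mysql://localhost:3306/sqoop_test --username root -P \
    --warehouse-dir /user/sqoop_test -m 1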
You can also tell Sqoop explicitly which JDBC driver class to use:

# sqoop list-tables --connect jdbc:mysql://localhost/sqoop_test --username root --password root --driver com.mysql.jdbc.Driver
Import from MySQL to HDFS

sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --target-dir /user/test3
# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --target-dir /user/test
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 14:15:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 14:15:41 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 14:15:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 14:15:41 INFO tool.CodeGenTool: Beginning code generation
14/12/01 14:15:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 14:15:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 14:15:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/7b8091924ce8deb4f2ccae14c404a5bf/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/12/01 14:15:45 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/7b8091924ce8deb4f2ccae14c404a5bf/employee.jar
14/12/01 14:15:45 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/12/01 14:15:45 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/12/01 14:15:45 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/12/01 14:15:45 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/12/01 14:15:45 INFO mapreduce.ImportJobBase: Beginning import of employee
14/12/01 14:15:46 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/12/01 14:15:47 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/12/01 14:15:47 INFO client.RMProxy: Connecting to ResourceManager at xmseapp01/10.172.78.111:8032
14/12/01 14:15:50 INFO db.DBInputFormat: Using read commited transaction isolation
14/12/01 14:15:51 INFO mapreduce.JobSubmitter: number of splits:1
14/12/01 14:15:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1406097234796_0019
14/12/01 14:15:52 INFO impl.YarnClientImpl: Submitted application application_1406097234796_0019
14/12/01 14:15:52 INFO mapreduce.Job: The url to track the job: http://xmseapp01:8088/proxy/application_1406097234796_0019/
14/12/01 14:15:52 INFO mapreduce.Job: Running job: job_1406097234796_0019
14/12/01 14:16:08 INFO mapreduce.Job: Job job_1406097234796_0019 running in uber mode : false
14/12/01 14:16:08 INFO mapreduce.Job:  map 0% reduce 0%
14/12/01 14:16:19 INFO mapreduce.Job:  map 100% reduce 0%
14/12/01 14:16:20 INFO mapreduce.Job: Job job_1406097234796_0019 completed successfully
14/12/01 14:16:21 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=99855
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=16
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=8714
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=8714
        Total vcore-seconds taken by all map tasks=8714
        Total megabyte-seconds taken by all map tasks=8923136
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=58
        CPU time spent (ms)=1560
        Physical memory (bytes) snapshot=183005184
        Virtual memory (bytes) snapshot=704577536
        Total committed heap usage (bytes)=148897792
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=16
14/12/01 14:16:21 INFO mapreduce.ImportJobBase: Transferred 16 bytes in 33.6243 seconds (0.4758 bytes/sec)
14/12/01 14:16:21 INFO mapreduce.ImportJobBase: Retrieved 2 records.
Check the result on HDFS:

# hdfs dfs -ls /user/test
Found 2 items
-rw-r--r--   2 root supergroup          0 2014-12-01 14:16 /user/test/_SUCCESS
-rw-r--r--   2 root supergroup         16 2014-12-01 14:16 /user/test/part-m-00000
# hdfs dfs -cat /user/test/part-m-00000
1,michael
2,ted
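The warnings in the import log also suggest --direct, which shells out to mysqldump for a MySQL-specific fast path. A sketch of a faster, parallel variant, assuming mysqldump exists on every task node and that id is a sensible split column (the target directory is just another example path and must not already exist):

sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root -P \
    --table employee --direct -m 2 --split-by id --target-dir /user/test_direct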
A problem I ran into: with CDH 5.0.1 and Sqoop 1.4.4, the import at first kept failing with

14/12/01 10:12:42 INFO mapreduce.Job: Task Id : attempt_1406097234796_0017_m_000000_0, Status : FAILED
Error: employee : Unsupported major.minor version 51.0

Run ps aux | grep hadoop and you will see that Hadoop is running on JDK 1.6. Class file version 51.0 means Java 7, i.e. the employee class Sqoop generated was compiled for Java 7 but the map tasks run on Java 6. The fix is to point Hadoop at a JDK 7 installation (a rough sketch follows) and then restart the cluster services.
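A rough sketch of checking and fixing the JDK, assuming a package-based CDH install; the config file location and the JDK path below are assumptions about my setup (some installs read JAVA_HOME from /etc/default/bigtop-utils instead), so adjust them to yours:

# see which java binary the running Hadoop daemons were started with
ps aux | grep hadoop
java -version
# point the Hadoop scripts at a JDK 7 (example path; use your real JDK 7 location)
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_67' >> /etc/hadoop/conf/hadoop-env.sh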
for x in `cd /etc/init.d ; ls hive-*` ; do sudo service $x stop ; done
for x in `cd /etc/init.d ; ls hbase-*` ; do sudo service $x stop ; done
/etc/init.d/zookeeper-server stop
for x in `cd /etc/init.d ; ls hadoop-*` ; do sudo service $x stop ; done
for x in `cd /etc/init.d ; ls hadoop-*` ; do sudo service $x start ; done
/etc/init.d/zookeeper-server start
for x in `cd /etc/init.d ; ls hbase-*` ; do sudo service $x start ; done
for x in `cd /etc/init.d ; ls hive-*` ; do sudo service $x start ; done
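After the restart, jps (it ships with the JDK) is a quick way to confirm the Java daemons are back up:

sudo jps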
Export from HDFS back to MySQL

To see the export working, first empty the employee table in MySQL:

truncate employee;
# sqoop export --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --export-dir /user/test
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 15:16:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 15:16:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 15:16:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 15:16:51 INFO tool.CodeGenTool: Beginning code generation
14/12/01 15:16:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 15:16:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 15:16:52 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/f4a75fdefe1eb604181d47d6bc827e48/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/12/01 15:16:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/f4a75fdefe1eb604181d47d6bc827e48/employee.jar
14/12/01 15:16:55 INFO mapreduce.ExportJobBase: Beginning export of employee
14/12/01 15:16:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/12/01 15:16:57 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
14/12/01 15:16:57 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
14/12/01 15:16:57 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/12/01 15:16:57 INFO client.RMProxy: Connecting to ResourceManager at xmseapp01/10.172.78.111:8032
14/12/01 15:17:00 INFO input.FileInputFormat: Total input paths to process : 1
14/12/01 15:17:00 INFO input.FileInputFormat: Total input paths to process : 1
14/12/01 15:17:00 INFO mapreduce.JobSubmitter: number of splits:1
14/12/01 15:17:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1406097234796_0021
14/12/01 15:17:01 INFO impl.YarnClientImpl: Submitted application application_1406097234796_0021
14/12/01 15:17:01 INFO mapreduce.Job: The url to track the job: http://xmseapp01:8088/proxy/application_1406097234796_0021/
14/12/01 15:17:01 INFO mapreduce.Job: Running job: job_1406097234796_0021
14/12/01 15:17:13 INFO mapreduce.Job: Job job_1406097234796_0021 running in uber mode : false
14/12/01 15:17:13 INFO mapreduce.Job:  map 0% reduce 0%
14/12/01 15:17:21 INFO mapreduce.Job: Task Id : attempt_1406097234796_0021_m_000000_0, Status : FAILED
Error: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.Util.getInstance(Util.java:360)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1659)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1206)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
    at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
    ... 8 more
14/12/01 15:17:29 INFO mapreduce.Job: Task Id : attempt_1406097234796_0021_m_000000_1, Status : FAILED
Error: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.Util.getInstance(Util.java:360)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1659)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1206)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
    at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
    ... 8 more
14/12/01 15:17:40 INFO mapreduce.Job:  map 100% reduce 0%
14/12/01 15:17:41 INFO mapreduce.Job: Job job_1406097234796_0021 completed successfully
14/12/01 15:17:41 INFO mapreduce.Job: Counters: 32
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=99542
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=139
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=0
    Job Counters
        Failed map tasks=2
        Launched map tasks=3
        Other local map tasks=2
        Rack-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=21200
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=21200
        Total vcore-seconds taken by all map tasks=21200
        Total megabyte-seconds taken by all map tasks=21708800
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=120
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=86
        CPU time spent (ms)=1330
        Physical memory (bytes) snapshot=177094656
        Virtual memory (bytes) snapshot=686768128
        Total committed heap usage (bytes)=148897792
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=0
14/12/01 15:17:41 INFO mapreduce.ExportJobBase: Transferred 139 bytes in 43.6687 seconds (3.1831 bytes/sec)
14/12/01 15:17:41 INFO mapreduce.ExportJobBase: Exported 2 records.
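Notice the two failed map attempts before the job finally succeeds. My reading of it (an inference, not something the log states): the JDBC URL says localhost, so an attempt only works when it happens to land on the node that is actually running MySQL; the counters show 2 failed map tasks and 1 rack-local one. On a real cluster, put the MySQL host's resolvable name or IP in --connect and make sure MySQL grants access from the task nodes. A sketch, with mysql-host.example.com as a placeholder for wherever your MySQL actually runs:

sqoop export --connect jdbc:mysql://mysql-host.example.com:3306/sqoop_test --username root -P \
    --table employee --m 1 --export-dir /user/test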
Check in MySQL that the rows are back:

mysql> select * from employee;
+----+---------+
| id | name    |
+----+---------+
|  1 | michael |
|  2 | ted     |
+----+---------+
2 rows in set (0.00 sec)
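One last hedged sketch: re-running this export against a table that already has rows will collide with the primary key, because a plain export issues INSERTs. Sqoop's --update-key (optionally with --update-mode allowinsert to upsert) turns the export into updates on the given key column; the variation below is untested here, not something from this run:

sqoop export --connect jdbc:mysql://localhost:3306/sqoop_test --username root -P \
    --table employee --update-key id --update-mode allowinsert \
    --m 1 --export-dir /user/test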