
Detailed steps for exporting data from SQL Server to Azure HBase / Hive


Goals

In this tutorial, you will see three things:

  1. How to set up a SQL database on Windows Azure for use with the tutorial.

  2. How to use the Remote Desktop feature in Hadoop on Azure to access the head node of the HDFS cluster.

  3. How to import relational data from SQL Server to a Hadoop on Azure HDFS cluster by using Sqoop.

Key technologies

  • Windows Azure
  • Hadoop on Azure
  • Sqoop

Setup and Configuration

To work through this tutorial, you must have an account for Hadoop on Azure and have already created a cluster. To obtain an account and create a Hadoop cluster, follow the instructions outlined in the Getting started with Microsoft Hadoop on Azure section of the Introduction to Hadoop on Azure topic.

You will also need the outward-facing IP address of your current location when configuring the firewall on SQL Database. To obtain it, go to the site WhatIsMyIP and make a note of it. Later in the procedure, you will also need the outward-facing IP address of the head node of the Hadoop cluster; you can obtain that address in the same way.
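If you prefer a command prompt over a web page, one quick alternative (not part of the original tutorial, and it assumes outbound DNS queries to the OpenDNS resolvers are allowed from your network) is:

nslookup myip.opendns.com resolver1.opendns.com

The address returned in the answer section is your outward-facing IP.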


Tutorial

This tutorial is composed of the following segments:

  1. How to set up a SQL database.

  2. How to use Sqoop from Hadoop on Azure to import data to the HDFS cluster.

How to set up a SQL database

Log in to your Windows Azure account. To create a database server, click the Database icon in the lower left-hand corner of the page.


On the Getting Started page, click the Create a new SQL Database Server option.


Select the type of subscription (such as Pay-As-You-Go) associated with your account in the Create Server window and press Next.


Select the appropriate Region in the Create Server window and click Next.


Specify the login and password of the server-level principal of your SQL Database server and then press Next.


Press Add to specify a firewall rule that allows your current location access to SQL Database so that you can upload the AdventureWorks database. The firewall grants access based on the originating IP address of each request. Use the IP address you found during the setup and configuration preliminaries of this tutorial for the values to add. Specify a Rule name, such as shown, but remember to use your own IP address, not the one used for illustration purposes below. (You must also add the outward-facing IP address of the head node in your Hadoop cluster. If you know it already, add it now.) Then press the Finish button.
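As an alternative to the portal (not part of the original walkthrough, and assuming you can already connect to the server's master database as the server-level principal), the same rule can be created with the sp_set_firewall_rule stored procedure; the rule name and address below are illustrative:

-- Run against the master database; replace the rule name and address range with your own values
EXEC sp_set_firewall_rule N'MyLocation', '203.0.113.10', '203.0.113.10';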


Download the AdventureWorks2012 database onto your local machine from the Recommended Downloads link on the Adventure Works for SQL Database site.


Unzip the file, open an Administrator Command Prompt, and navigate to the AdventureWorks directory inside the AdventureWorks2012ForSQLAzure folder.

Run CreateAdventureWorksForSQLAzure.cmd by typing the following:

CreateAdventureWorksForSQLAzure.cmd servername username password

For example, if the assigned SQL Database server is named b1gl33p, the administrator user name "Fred", and the password "Secret", you would type the following:

CreateAdventureWorksForSQLAzure.cmd b1gl33p.database.windows.net Fred@b1gl33p Secret

The script creates the database, installs the schema, and populates the database with sample data.
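If you want to confirm the load before moving on (an optional check that is not part of the original steps, and it assumes the SQL Server command-line utilities are installed locally; the server, user, and password reuse the illustrative values above), you can run a quick row count with sqlcmd:

sqlcmd -S b1gl33p.database.windows.net -U Fred@b1gl33p -P Secret -d AdventureWorks2012 -Q "SELECT COUNT(*) FROM Sales.SalesOrderDetail"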

Return to the Windows Azure Platform portal page, click your subscription on the left-hand side (Pay-As-You-Go in the example below) and select your database server (here named wq6xlbyoq0). AdventureWorks2012 should be listed in the Database Name column. Select it and press the Manage icon at the top of the page.


Enter the credentials for the SQL database when prompted and press Log on.


This opens the Web interface for the Adventure Works database on SQL Database. Press the New Query icon at the top to open the query editor.


Since Sqoop currently adds square brackets to the table name, we need to add a synonym to support two-part naming for SQL Server tables. To do so, run the following query:

CREATE SYNONYM [Sales.SalesOrderDetail] FOR Sales.SalesOrderDetail
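To confirm that the synonym exists (an optional check, not in the original steps), you can query the sys.synonyms catalog view:

SELECT name, base_object_name FROM sys.synonyms;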

Run the following query and review its result.

select top 200 * from [Sales.SalesOrderDetail]


How to use Sqoop from Hadoop on Azure to import SQL Database data to the HDFS cluster

From your Account page, scroll down to the Open Ports icon in the Your cluster section and click the icon to open the ODBC Server port on the head node in your cluster.


Return to your Account page, scroll down to the Your cluster section, and click the Remote Desktop icon this time to connect to the head node of your cluster.

Select Open when prompted to open the .rdp file.


Select Connect in the Remote Desktop Connection window.


Enter your credentials for the Hadoop cluster (not your Hadoop on Azure account) into the Windows Security window and select OK.


Open Internet Explorer and go to the site WhatIsMyIP to obtain the outward-facing IP address of the head node of the cluster. Return to the SQL Database management page and add a firewall rule that allows your Hadoop cluster access to SQL Database. The firewall grants access based on the originating IP address of each request.

Double-click the Hadoop Command Shell icon in the upper left-hand corner of the desktop to open it. Navigate to the "c:\Apps\dist\sqoop\bin" directory and run the following command:

sqoop import --connect "jdbc:sqlserver://[serverName].database.windows.net;username=[userName]@[serverName];password=[password];database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1

So, for example, for the following values:
* server name: wq6xlbyoq0
* username: HadoopOnAzureSqoopAdmin
* password: Pa$$w0rd

The sqoop command is:

sqoop import --connect "jdbc:sqlserver://wq6xlbyoq0.database.windows.net;username=HadoopOnAzureSqoopAdmin@wq6xlbyoq0;password=Pa$$w0rd;database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1
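If you only need part of the table, the Sqoop import tool also accepts column and row filters. The following sketch uses standard Sqoop import options with illustrative column names, filter, and target directory rather than anything prescribed by this tutorial:

sqoop import --connect "jdbc:sqlserver://wq6xlbyoq0.database.windows.net;username=HadoopOnAzureSqoopAdmin@wq6xlbyoq0;password=Pa$$w0rd;database=AdventureWorks2012" --table Sales.SalesOrderDetail --columns "SalesOrderID,OrderQty,UnitPrice" --where "OrderQty > 10" --target-dir /data/lineitemDataFiltered -m 1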

Return to the Accounts page of the Hadoop on Azure portal and open the Interactive Console this time. Run the #lsr command from the JavaScript console to list the files and directories on your HDFS cluster. 


Run the #tail command to view selected results from the part-m-00000 file.

tail /user/RAdmin/data/SalesOrderDetail/part-m-00000

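If you still have the Remote Desktop session to the head node open, the same checks can be run from the Hadoop Command Shell instead of the JavaScript console (a sketch; the path reuses the example above and will differ if your user name or target directory differs):

hadoop fs -lsr /user
hadoop fs -tail /user/RAdmin/data/SalesOrderDetail/part-m-00000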


Summary

In this tutorial, you have seen how to use Sqoop to import data from a SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.
