Lately we have created several data marts in Azure SQL that are fed from a data warehouse hosted on premises. Each data mart is first created and tested locally so that we can get the structure and the required stored procedures right. Whilst we are doing this it is easy to quickly transfer representative data to test the efficiency. Every now and then, though, we need to test everything on Azure so that we can be sure we are still operating at an acceptable speed. This is where the problems begin. Linked servers are a no-go for transferring large amounts of data to Azure, which treats every insert as a separate transaction. We could back up the database, copy it to blob storage and restore from there, but that takes a while as well, and sometimes it is just a single large staging table that we want to populate. We could hand-craft an SSIS package, which would be fast but seems overkill considering the number of times we adjust and retest.

The answer is actually quite simple thanks to a little-promoted feature of SSMS. Follow the steps below to see how.

  1. Right-click the database you wish to export data from.
  2. Select ‘Tasks’, then ‘Export Data’.
  3. You will see the SQL Server Import and Export Wizard. Click ‘Next’ to go past the welcome screen and make sure your source is correctly selected (it will have been if you started by right-clicking the database), then click ‘Next’ again.
  4. Now you can choose the destination database. Note that for Azure you need to select ‘.Net Framework Data Provider for SqlServer’ in the destination drop-down.
  5. This will cause connection properties to appear. For Azure you need to set the data source, user ID, password and initial catalog. I would also recommend enabling connection encryption.
  6. You can now either copy data directly from a table or view, or write your own query. Such power! Here I have elected to copy data from a table in the source database, but bear in mind that you could easily write a query to limit the number of rows or transform the data in some way.
  7. Choose the table or tables whose data you want to copy. The wizard will automatically try to choose a matching table on your destination database, or suggest auto-creating it if one does not exist. Click ‘Edit Mappings’ for each table you are transferring to see where your data will go.
    Here I have chosen two tables that exist in both source and destination with the same structure, and a third table that is different on the destination database. By highlighting that row and clicking ‘Edit Mappings’ you can choose exactly where your data will go on a column-by-column basis. Again, the wizard will auto-suggest the best fits.
    Here there are no columns that are named the same so the wizard leaves it entirely up to you. You can also choose to preview the data.
  8. Having mapped your source to your destination, click ‘Next’. The final step is to either run the package immediately or save it as an SSIS package. I usually run immediately if it is just a quick transfer I am after. Here I have selected both.
  9. Finally, click ‘Finish’ to execute the package. Because you have effectively created an SSIS package, you will see feedback on progress as it runs, including the number of rows transferred. If it encounters any problems it will stop and tell you.
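For reference, the connection properties you fill in at step 5 correspond to a standard ADO.NET connection string. A sketch, with a hypothetical server name, database and login (substitute your own values):

```
Data Source=myserver.database.windows.net;Initial Catalog=MyDataMart;User ID=myadmin;Password={your_password};Encrypt=True;TrustServerCertificate=False;
```

Setting `Encrypt=True` is the connection-encryption option recommended above; Azure SQL expects encrypted connections.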
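If you pick ‘write a query’ at step 6 rather than copying a whole table, something like the following keeps the transfer down to a representative sample and transforms on the way out. The table and column names here are hypothetical, purely to illustrate the shape of such a query:

```sql
-- Copy only the most recent year of rows from a (hypothetical) staging table,
-- trimming the datetime down to a date as part of the transfer
SELECT OrderID,
       CustomerID,
       CAST(OrderDate AS date) AS OrderDate,
       TotalAmount
FROM   dbo.StagingOrders
WHERE  OrderDate >= DATEADD(year, -1, GETDATE());
```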
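If you saved the package at step 8 as a .dtsx file, you can re-run the same transfer later from the command line with the dtexec utility instead of clicking through the wizard again. The file path below is hypothetical:

```shell
dtexec /File "C:\Packages\ExportToAzure.dtsx"
```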

Overall, an excellent way to quickly transfer data from one system to another. Especially useful where you need to transform that data, map between two different schemas or send it to Azure.

What are you waiting for? Go Play!
