The Great Azure DevOps Migration – Part 6: Import


This is it! We’ve made it to the import step! This is when we finally move our data into Azure DevOps Service.

If you missed the earlier posts, start here.

I highly recommend Microsoft’s Azure DevOps Service Migration Guide.

Detach Collection

First, you need to detach the collection from TFS. Don’t detach the database in SQL Server; detach the collection itself from within Azure DevOps Server.

To detach the collection, open the Azure DevOps Server Administration Console, go to Collections, and choose Detach on the collection that is going to be imported.

Generate the Database Backup

If you have managed to keep your import under 30 GB, this step is fairly easy. If not, you are in for a harder import because you now need to move your database to a SQL Server Database in Azure. I won’t cover the SQL Server migration as I did not do this step, but here is the guide on how to do this.

So, if you are going the under 30 GB route, you need to create a DACPAC that is going to be imported to Azure DevOps Service. You should be able to run the DACPAC tool from your Developer Command Prompt for Visual Studio or from the following location:

C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150

Here is the packaging command:

SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;
Initial Catalog=[COLLECTION_NAME];Integrated Security=True"
/targetFile:C:\dacpac\Tfs_DefaultCollection.dacpac
/action:extract
/p:ExtractAllTableData=true
/p:IgnoreUserLoginMappings=true
/p:IgnorePermissions=true
/p:Storage=Memory

After the packaging is completed, you will have a new DACPAC at C:\dacpac\ with all your import data.

Upload the Package

We’re not going to upload the package directly into Azure DevOps Service. First, we need to upload it to Azure itself. And then we’ll point Azure DevOps Service at the DACPAC in Azure.

The easiest way to do this is to install the Azure Storage Explorer.

Open the Azure Storage Explorer app.
Choose Add Azure Account.
Log in to your Azure account.
Navigate to your storage account’s Blob Containers.
Create a new blob container named dacpac (container names must be lowercase).
Upload the DACPAC file created by SqlPackage.exe.
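If you prefer the command line, the container creation and upload can also be done with the Azure CLI. This is just a sketch; the subscription name, storage account name (`mystorageacct`), and file path are placeholders for your own values:

```shell
# Sign in and select the subscription that holds your storage account
az login
az account set --subscription "My Subscription"

# Create the container (names must be lowercase) -- 'mystorageacct' is a placeholder
az storage container create --account-name mystorageacct --name dacpac

# Upload the DACPAC produced by SqlPackage.exe
az storage blob upload \
  --account-name mystorageacct \
  --container-name dacpac \
  --name Tfs_DefaultCollection.dacpac \
  --file "C:\dacpac\Tfs_DefaultCollection.dacpac"
```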

Create the SAS Key

You need to create a secret key that will allow Azure DevOps Service to access the DACPAC.

In Azure Storage Explorer, right-click the dacpac container and choose Get Shared Access Signature…

Set the expiration to one week from today.
Give it read/list rights, nothing else.
Copy the URL for the SAS Key.

This SAS URL should be placed in the import.json file that was in the Logs folder from earlier. Set it in the Source.Location field.
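If you want to script the edit, the field can be patched with jq. This is a sketch: the SAS URL is a placeholder, and the import.json below is trimmed down to a few fields for illustration — your real file has many more:

```shell
# Trimmed-down import.json for illustration only
cat > import.json <<'EOF'
{
  "Source": { "Location": "" },
  "Target": { "Name": "my-org" },
  "Properties": { "ImportType": "DryRun" }
}
EOF

# Placeholder SAS URL -- use the one copied from Azure Storage Explorer
SAS_URL='https://mystorage.blob.core.windows.net/dacpac?sp=rl&sig=PLACEHOLDER'

# Rewrite Source.Location, leaving every other field untouched
jq --arg loc "$SAS_URL" '.Source.Location = $loc' import.json > import.tmp \
  && mv import.tmp import.json

cat import.json
```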

Import

That’s it! We are ready to start the import!

Run the following command from the Data Migration Tool folder:

Migrator import /importfile:[IMPORT-JSON-LOCATION]

The import will begin and the command will provide a link to view the status of your import.

It does take a few minutes before you can even see the import page, so don’t panic.

Once the import began, it took about two hours to complete… so this is a good time to take a break.

Validation

You did it! Your migration to Azure DevOps Service is complete. You should now verify that everything is working correctly.

Users

First, verify your list of users, which you can find in the Organization Settings. I had to remove a lot of users who did not need access to the service. You should then set the correct Access Level for your remaining users. We have a number of Visual Studio Enterprise subscriptions that I used for most of my developers, and our contractors received Basic access. Most importantly, make sure every user who should have access is listed.

This is a great chance to see how much Azure DevOps Service is actually going to cost you, so make sure you set this up just like your Production environment will be.

Source Control

Because you moved your Git source control, you don’t need to re-clone anything; you can simply redirect your existing local repo to the new location.

You can change your local repo’s origin with the following command (you can find the REMOTE_GIT_REPO URL under the Clone button in Azure DevOps Service – Repos – Files):

git remote set-url origin [REMOTE_GIT_REPO]
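As a self-contained illustration, here is the switch in a throwaway repo — both URLs are placeholders, not real servers:

```shell
# Throwaway repo to demonstrate switching origin; both URLs are placeholders
git init -q demo-repo
cd demo-repo
git remote add origin https://old-tfs.example.local/tfs/DefaultCollection/_git/MyRepo
git remote set-url origin https://dev.azure.com/my-org/MyProject/_git/MyRepo

# Confirm origin now points at Azure DevOps Service
git remote get-url origin
```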

Billing

Make sure your Billing Account is configured for the service; this is important when you do your Production migration. You won’t be billed until the first of the next month, so make sure you have Billing and Users set up by the end of the month.

Build / Release Agents

Any local Build / Release agents will need to be reconfigured. I only had about 10 agents running locally, so I chose to remove them and reinstall them after the final Production run. The PowerShell command makes this very easy.

I did not test this during the Dry Run; I simply reconfigured the agents after the Production migration and everything worked smoothly.
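For reference, removing and re-registering a self-hosted agent is a pair of commands run from the agent’s install folder on each build machine. This is a sketch; the organization URL, PAT variables, pool, and agent name are placeholders for your own values:

```shell
# Run from the agent's install folder on each build machine.
# On Windows use .\config.cmd; on Linux/macOS use ./config.sh.

# Remove the registration with the old TFS server
./config.sh remove --auth pat --token "$OLD_PAT"

# Re-register against the new Azure DevOps Service organization
./config.sh --unattended \
  --url https://dev.azure.com/my-org \
  --auth pat --token "$NEW_PAT" \
  --pool Default \
  --agent my-build-agent \
  --acceptTeeEula
```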

Final Import

And that is it!
We had very few other issues; the dry run went well, and the Production migration a few weeks later went very smoothly.

For the final migration, I simply repeated the steps in this guide and changed the ImportType in import.json from DryRun to ProductionRun.

I turned off our local TFS server but am keeping it around, powered off, in case we ever need the legacy code.

The main thing that came up after the final migration was setting user permissions correctly, but I simply adjusted those settings as we went.

Some users had issues with non-Visual Studio tools being unable to connect to the remote repo, but setting their Git credentials (Azure DevOps Service – Repos – Files – Clone) fixed the issue.

I hope you have learned from my efforts, and if you have any questions, let me know!

The Great Azure DevOps Migration – Part 5: Prepare


We’ve validated that our data is ready for import. Now, we need to prepare the data to be imported!
This is a short step, so let’s enjoy the ease of this one.

If you missed the earlier posts, start here.

I highly recommend Microsoft’s Azure DevOps Service Migration Guide.

Prepare Command

In the same way that we used the Migrator validate command earlier, we need to run a Migrator prepare command. This re-runs the validation but also creates a .json file that will be used for the actual import process.

So, open PowerShell in the directory that contains the Migrator.exe file (in the DataMigrationTool download) and execute the command below:

Migrator prepare /collection:[COLLECTION_ADDRESS] /tenantdomainname:[AZURE_TENANT_NAME] /region:CUS

I recommend using the localhost address of your collection to verify that you are pointed at the right server. The tenant domain name is the Azure Active Directory tenant that your newly imported data will connect to. The region must come from a narrow list of supported Azure regions, so make sure you choose one of them. View the full list here.

Execute the command and you will see results similar to the validation run earlier.

If all goes well, you will find the new import.json file in the Logs folder of the DataMigrationTool. Inside Logs, open the newest folder, and open the import.json file in a text editor.

There are a bunch of fields in this file, but we only care about the ones at the very top. Update the following fields:
Target.Name – The name of your organization that will be created at Azure DevOps.
Properties.ImportType – DryRun for this initial test.
The two source fields will be updated in the next post.
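The edit can also be scripted with jq. This is a sketch: the organization name is a placeholder, and the import.json below is trimmed down to a few fields for illustration — the real file has many more:

```shell
# Trimmed-down import.json for illustration only
cat > import.json <<'EOF'
{
  "Source": { "Location": "" },
  "Target": { "Name": "" },
  "Properties": { "ImportType": "" }
}
EOF

# Set the organization name (placeholder) and mark this run as a dry run
jq '.Target.Name = "my-new-org" | .Properties.ImportType = "DryRun"' \
  import.json > import.tmp && mv import.tmp import.json

cat import.json
```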

Azure Storage

Next, you need to set up an Azure Storage account. This is where you will upload the file containing all of your TFS data before importing it into Azure DevOps Service.

In Azure, you just need to create a new standard storage account. It has to be created in the same data center region that you set in the import.json file, so make sure you pay attention to that!

I simply created a standard storage account in Central US. Easy.

What’s Next?

We’re so close! Our data is now prepared for import!

In the next step, we’ll push the data to the Storage container and begin the import process!