Set-AzureStorageAccount incorrectly sets Geo-Replication States

A bug has been identified in the Set-AzureStorageAccount cmdlet that can inadvertently enable or disable geo-replication on your storage account.
If you have used this cmdlet, check the geo-replication setting for your accounts in the Azure Portal immediately.
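A quick way to check the current state from PowerShell (the account name below is a placeholder):

```powershell
# Inspect the geo-replication state of a storage account.
Get-AzureStorageAccount -StorageAccountName "mwweststorage" |
    Select-Object StorageAccountName, GeoReplicationEnabled
```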

Note: For more information on geo-replication in Windows Azure Storage, please see the following post:

Scenario 1: Changing the label or description of your storage account without specifying -GeoReplicationEnabled will disable geo-replication

PS C:\> Set-AzureStorageAccount -StorageAccountName mwweststorage -Label "updated label"

StorageAccountDescription : 
AffinityGroup             : 
Location                  : West US 
GeoReplicationEnabled     : False
GeoPrimaryLocation        : West US
GeoSecondaryLocation      : 
Label                     : "updated label"
StorageAccountStatus      : Created
StatusOfPrimary           : 
StatusOfSecondary         : 
Endpoints                 : {,, 
StorageAccountName        : mwweststorage
OperationDescription      : Get-AzureStorageAccount
OperationId               : 8dc5e76c-e8ac-460f-a76c-5a0c6f96e2c6
OperationStatus           : Succeeded

Scenario 2: Setting geo-replication to disabled in your storage account via PowerShell will actually enable it.

PS C:\> Set-AzureStorageAccount -StorageAccountName mwweststorage -GeoReplicationEnabled $false -Label "disabled geo replication"

StorageAccountDescription : 
AffinityGroup             : 
Location                  : West US 
GeoReplicationEnabled     : True
GeoPrimaryLocation        : West US
GeoSecondaryLocation      : 
Label                     : "disabled geo replication"
StorageAccountStatus      : Created
StatusOfPrimary           : 
StatusOfSecondary         : 
Endpoints                 : {,, 
StorageAccountName        : mwweststorage
OperationDescription      : Get-AzureStorageAccount
OperationId               : 8dc5e76c-e8ac-460f-a76c-5a0c6f96e2c6
OperationStatus           : Succeeded

Mitigation Strategy:

  • From PowerShell
      • To enable: pass -GeoReplicationEnabled $true
      • To disable: do not pass -GeoReplicationEnabled at all
  • From the Windows Azure Management Portal
      • Click on your Storage Account
      • Click on Configure
      • Verify the geo-replication configuration
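The PowerShell mitigation boils down to two call patterns (sketch; the account name is a placeholder):

```powershell
# To keep geo-replication ENABLED, always pass the switch explicitly.
Set-AzureStorageAccount -StorageAccountName "mwweststorage" `
    -GeoReplicationEnabled $true -Label "updated label"

# To DISABLE geo-replication with the affected build, omit the parameter entirely.
Set-AzureStorageAccount -StorageAccountName "mwweststorage" -Label "updated label"
```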

What is the Windows Azure PowerShell team doing about it?

We will be releasing an update to the current release (0.6.13) that fixes the Set-AzureStorageAccount cmdlet in the very near future.
Update: We originally expected this to be a BREAKING CHANGE, but we have since determined a solution that will not be a breaking change.

Update – 4/23/2013

The Windows Azure PowerShell cmdlets have been updated with this fix. Please download the latest and verify your Storage Account settings.

Windows Azure PowerShell Updates for IaaS GA

With the release of Windows Azure Virtual Machines and Virtual Networks into general availability the Windows Azure PowerShell team has been working feverishly to provide an even more powerful automation experience for deploying virtual machines in the cloud.

Remote PowerShell on Windows Azure – Automating Virtual Machines

One of the key requests we have heard from customers is to go beyond the current capabilities of automated infrastructure provisioning and allow the user to bootstrap a virtual machine as part of a fully automated deployment.

With this release we are announcing that Remote PowerShell will be enabled by default on Windows based virtual machines created with the latest version of the Windows Azure PowerShell Cmdlets.

Enabling Remote PowerShell allows a user to create a virtual machine and, on boot, immediately launch a script to bootstrap whatever configuration is desired. This could range from installing and configuring Windows roles and features all the way to downloading and deploying an application or website. Authentication is over SSL for security, and you can use your own certificate or we can even generate one for you.

In addition to the bootstrapping abilities, Remote PowerShell allows you to write powerful scripts for remote management and automation that can be run at any time after the virtual machine is booted. The same scripts you use to manage your on-premises servers will work with your servers in Windows Azure. Of course, we do provide a switch to disable this functionality on boot if Remote PowerShell is not desired.

Installing Windows Server Features Automatically

In the example below the new -WaitForBoot parameter is used with New-AzureVM. This switch tells the cmdlet to wait for the virtual machine to reach the RoleReady (booted) state before continuing execution. Once the virtual machine is ready, the script calls the Get-AzureWinRMUri cmdlet to retrieve the connection URI and executes a remote script against the virtual machine. The script block passed to Invoke-Command installs the IIS Web-Server role and the related management tools.

A PowerShell scripter could easily extend this script to automatically deploy a custom web application or service with just a few additional lines of code.

Installing Windows Features using Remote PowerShell

# Running this script installs the generated cert into your local cert store, which allows
# PowerShell to verify it is communicating with the correct endpoint.
# This REQUIRES PowerShell to be run elevated.
. "C:\Scripts\WAIaaSPS\RemotePS\InstallWinRMCert.ps1" 

$user = ""
$pwd = ""
$svcName = ""
$VMName = "webfe1" 
$location = "West US"
$image = ""      # OS image name from Get-AzureVMImage

$credential = Get-Credential 

New-AzureVMConfig -Name $VMName -InstanceSize "Small" -ImageName $image |
                Add-AzureProvisioningConfig -Windows -AdminUsername $user -Password $pwd |
                Add-AzureEndpoint -Name "http" -Protocol tcp -LocalPort 80 -PublicPort 80 |
                New-AzureVM -ServiceName $svcName -Location $location -WaitForBoot 

# Get the RemotePS/WinRM Uri to connect to
$uri = Get-AzureWinRMUri -ServiceName $svcName -Name $VMName 

# Using generated certs – use helper function to download and install generated cert.
InstallWinRMCert $svcName $VMName 

# Use native PowerShell Cmdlet to execute a script block on the remote virtual machine
Invoke-Command -ConnectionUri $uri.ToString() -Credential $credential -ScriptBlock {
    $logLabel = $((get-date).ToString("yyyyMMddHHmmss"))
    $logPath = "$env:TEMP\init-webservervm_webserver_install_log_$logLabel.txt"
    Import-Module -Name ServerManager
    Install-WindowsFeature -Name Web-Server -IncludeManagementTools -LogPath $logPath
}

Contents of InstallWinRMCert.ps1

function InstallWinRMCert($serviceName, $vmname)
{
    $winRMCert = (Get-AzureVM -ServiceName $serviceName -Name $vmname | select -ExpandProperty vm).DefaultWinRMCertificateThumbprint

    $AzureX509cert = Get-AzureCertificate -ServiceName $serviceName -Thumbprint $winRMCert -ThumbprintAlgorithm sha1

    $certTempFile = [IO.Path]::GetTempFileName()
    Write-Host $certTempFile
    $AzureX509cert.Data | Out-File $certTempFile

    # Target the cert that needs to be imported
    $CertToImport = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $certTempFile

    # Install it into the Trusted Root store on the local machine
    $store = New-Object System.Security.Cryptography.X509Certificates.X509Store "Root", "LocalMachine"
    $store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
    $store.Add($CertToImport)
    $store.Close()

    Remove-Item $certTempFile
}

Image and Disk Mobility

Windows Azure is an open computing platform and allows you to move your virtual machine disks between on-premises and the cloud. There are two optimized cmdlets that enable you to either upload your VHD or download it.

Uploading a VHD

The first example shows how to upload a VHD to Windows Azure. This can be a bootable OS disk or simply a data disk (omit -OS Windows for data disks). After Add-AzureDisk is called, you can use the New-AzureVMConfig cmdlet or the management portal to provision a virtual machine that boots from the uploaded VHD.

$source = "C:\vmstorage\myosdisk.vhd"
$destination = "https://<yourstorage>"

Add-AzureVhd -LocalFilePath $source -Destination $destination -NumberOfUploaderThreads 5
Add-AzureDisk -DiskName 'myosdisk' -MediaLocation $destination -Label 'myosdisk' -OS Windows 

Downloading a VHD

Not only can you upload a disk to Windows Azure, it is also easy to download a VHD. The example below shows how you can save a VHD to the local file system, ready to run on a Hyper-V enabled system. (Note: a virtual machine should not be writing to the VHD while you are downloading it.)

$source = "https://<yourstorage>"
$destination = "C:\vmstorage\myosdisk.vhd"
Save-AzureVhd -Source $source -LocalFilePath $destination -NumberOfThreads 5 

VMDK Conversion and Migration to Windows Azure

If you have VMware-based virtual machines that you would like to migrate, you can use the Microsoft Virtual Machine Converter Solution Accelerator to convert the disks to VHDs, and then use the Add-AzureVhd cmdlet to upload the VHD and create a virtual machine in Windows Azure from it.
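Once converted, the migration follows the same upload pattern shown earlier (paths, names, and the storage URL below are placeholders):

```powershell
# Converted VHD produced by the Microsoft Virtual Machine Converter.
$source      = "C:\converted\myconvertedvm.vhd"
$destination = "https://<yourstorage>/vhds/myconvertedvm.vhd"

# Upload the VHD (only written pages are transferred).
Add-AzureVhd -LocalFilePath $source -Destination $destination

# Register it as a bootable OS disk, then provision a VM from it.
Add-AzureDisk -DiskName "myconvertedvm" -MediaLocation $destination -OS Windows

New-AzureVMConfig -Name "migratedvm1" -InstanceSize "Small" -DiskName "myconvertedvm" |
    New-AzureVM -ServiceName "migratedsvc1" -Location "West US"
```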

Copying a VHD across Windows Azure Regions

# Source VHD (West US)
$srcUri = "http://<yourweststorage>"      

# Target Storage Account (East US)
$storageAccount = "<youreaststorage>"
$storageKey = "<youreaststoragekey>"

$destContext = New-AzureStorageContext -StorageAccountName $storageAccount `
                                        -StorageAccountKey $storageKey  

# Container Name
$containerName = "vhds"

New-AzureStorageContainer -Name $containerName -Context $destContext

$blob = Start-AzureStorageBlobCopy -srcUri $srcUri `
                                   -DestContainer $containerName `
                                   -DestBlob "testcopy1.vhd" `
                                   -DestContext $destContext   
$blob | Get-AzureStorageBlobCopyState 

Enhanced Security: -AdminUserName is required for Windows (Breaking Change)

To protect you from dictionary attacks against your password, we have made it mandatory to supply a username.
This change affects the New-AzureQuickVM and Add-AzureProvisioningConfig cmdlets used for VM creation. Each now has a required -AdminUserName parameter.
Make sure you can remember it, but do not use obvious names like Administrator or Admin.
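With the new parameter, a quick VM creation looks like this (a sketch; the service name, VM name, and password are placeholders, and $image is an image name from Get-AzureVMImage):

```powershell
New-AzureQuickVM -Windows -ServiceName "mysvc1" -Name "winvm1" `
    -ImageName $image -InstanceSize "Small" `
    -AdminUserName "myadminuser" -Password "S0meStr0ngP@ss!" `
    -Location "West US"
```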

High Memory Virtual Machine Support

The latest version of the WA PowerShell Cmdlets now support the new higher memory SKU sizes of A6 and A7 for larger workloads. For more information about Windows Azure compute sizes see the following:
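Using one of the new sizes is just a matter of passing it to -InstanceSize (a sketch; $image, $user, $pwd, and the service name are placeholders):

```powershell
New-AzureVMConfig -Name "bigvm1" -InstanceSize "A7" -ImageName $image |
    Add-AzureProvisioningConfig -Windows -AdminUsername $user -Password $pwd |
    New-AzureVM -ServiceName "bigsvc1" -Location "West US"
```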


Managing Availability Sets on Deployed VMs

We have also added the ability to specify availability set configuration for groups of virtual machines for highly available configurations. Previously, this could only be set at deployment time or post deployment from the Windows Azure Management Portal. For more information on availability sets see the following article:

Get-AzureVM -ServiceName "mywebsite" | Where {$_.Name -like "*web*"} | 
    Set-AzureAvailabilitySet -AvailabilitySetName "wfe-av-set" |
    Update-AzureVM

Wrapping Up

I hope you are excited about the new features in the Windows Azure PowerShell cmdlets. If you would like to try this yourself, you will need a subscription, the latest WA PowerShell cmdlets, and a short read on getting started.

Michael Washam
Senior Program Manager – Windows Azure

Speaking at MMS 2013

If you will be attending MMS 2013 next week in Las Vegas and want to learn more about automating the cloud with the Windows Azure PowerShell Cmdlets please stop by my session!

WS-B311 Take Control of the Cloud with the Windows Azure PowerShell Cmdlets

In this session you will learn how to tame the simplest to most advanced Windows Azure deployments with the Windows Azure PowerShell cmdlets. Automate repetitive tasks and learn how build advanced reproducible deployments so you can save time and money while working in the cloud. The presenter will cover dev-ops scenarios with Virtual Machines and Cloud Services in this demo packed session.

Windows Azure PowerShell Cmdlets Now Support Storage!

The Windows Azure Storage team has delivered an outstanding set of PowerShell cmdlets for managing and using storage from PowerShell.

The new capabilities include creating and managing containers and blobs, including asynchronously copying blobs (across storage accounts and across regions!).

A quick example of how to kick off async blob copies is below. The cmdlets were designed to let you start multiple blob copies and then monitor the results at a later time, so you can take advantage of the async nature of the underlying APIs instead of blocking on each copy.

Import-Module Azure
Select-AzureSubscription mysubscription

$destContext = New-AzureStorageContext -StorageAccountName $storageAccount `
                                        -StorageAccountKey $storageKey

New-AzureStorageContainer -Name $containerName

$blob1 = Start-CopyAzureStorageBlob -srcUri $srcUri1 `
                                 -DestContainer $containerName `
                                 -DestBlob $fileName1 `
                                 -DestContext $destContext 

$blob2 = Start-CopyAzureStorageBlob -srcUri $srcUri2 `
                                 -DestContainer $containerName `
                                 -DestBlob $fileName2 `
                                 -DestContext $destContext 

$blob3 = Start-CopyAzureStorageBlob -srcUri $srcUri3 `
                                 -DestContainer $containerName `
                                 -DestBlob $fileName3  `
                                 -DestContext $destContext 

$blob1 | Get-AzureStorageBlobCopyState 
$blob2 | Get-AzureStorageBlobCopyState
$blob3 | Get-AzureStorageBlobCopyState

The output of the Get-AzureStorageBlobCopyState cmdlet is below:

CopyId            : 60a3c559-14f4-4b37-ae2b-80755fc072c4
CompletionTime    : 3/27/2013 10:33:29 PM +00:00
Status            : Success
Source            :
BytesCopied       : 32212255232
TotalBytes        : 32212255232
StatusDescription : 

CopyId            : 0c665b44-aa33-47db-8367-b00aa450ed78
CompletionTime    : 3/27/2013 10:33:30 PM +00:00
Status            : Success
Source            :
BytesCopied       : 32212255232
TotalBytes        : 32212255232
StatusDescription : 

CopyId            : d425fae7-c9ba-4816-a81e-d0ded84baa75
CompletionTime    : 3/27/2013 10:33:31 PM +00:00
Status            : Success
Source            :
BytesCopied       : 32212255232
TotalBytes        : 32212255232
StatusDescription : 

The complete list of storage cmdlets is below:

  • Get-AzureStorageContainerAcl
  • Get-AzureStorageBlob
  • Get-AzureStorageBlobContent
  • Get-AzureStorageBlobCopyState
  • Get-AzureStorageContainer
  • New-AzureStorageContainer
  • New-AzureStorageContext
  • Remove-AzureStorageBlob
  • Remove-AzureStorageContainer
  • Set-AzureStorageBlobContent
  • Set-AzureStorageContainerAcl
  • Start-CopyAzureStorageBlob <- will likely be renamed to Start-AzureStorageBlobCopy
  • Stop-CopyAzureStorageBlob <- will likely be renamed to Stop-AzureStorageBlobCopy

Optimized Windows Azure IaaS Disk Mobility with Save-AzureVhd

In the most recent Windows Azure PowerShell update we have introduced a new cmdlet called Save-AzureVhd to complement Add-AzureVhd.

Save-AzureVhd provides an optimized download experience: it uses the underlying page blob APIs to download only the written bytes. In other words, if you have a 1 TB VHD with only a few GB of data written to it, the cmdlet will download just those few GB. The cmdlet writes the VHD as a fixed disk locally, so you will of course need the full disk size (1 TB in this example) available at the local storage location.

Here is a quick example to get you up and running:

# Select correct subscription - ensure CurrentStorageAccount is set!
Select-AzureSubscription mysubscription

# Source VHD Location
$source = ""

# Target VHD Location
Save-AzureVhd -Source $source -LocalFilePath "d:\LocalStorage\myosdisk.vhd"

I’m working on another code sample to download the disks of an entire VM for an offline backup. I’ll post it as soon as it is ready 🙂
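A rough sketch of that idea, assuming the media links are available via Get-AzureOSDisk and Get-AzureDataDisk (stop the VM first so nothing is writing to the disks; service, VM, and backup path are placeholders):

```powershell
$vm = Get-AzureVM -ServiceName "mysvc1" -Name "webfe1"

# Collect the OS disk plus any data disks attached to the VM.
$disks = @($vm | Get-AzureOSDisk) + @($vm | Get-AzureDataDisk)

foreach ($disk in $disks) {
    # Download each VHD next to the others, named after its blob.
    $fileName = Split-Path $disk.MediaLink.AbsolutePath -Leaf
    Save-AzureVhd -Source $disk.MediaLink -LocalFilePath "D:\Backups\$fileName"
}
```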


Office 365 “Access denied by Business Data Connectivity” connecting to SQL Server / Azure

I haven’t written a blog post on SharePoint in a while, but this one seemed worthy of the effort.

I’ve been experimenting with Office 365 lately and all I wanted to do was add a connection to a Windows Azure SQL Database and surface it up as an external content type.

There is of course no straightforward documentation on how to do this but plenty of content advising that you can. I walked through the motions of creating a DB. I then went into the secure store and created a Target Application ID (Removing from User Name and Password) and then set the user name and password (a separate step).

Next I loaded SharePoint Designer (why? because I couldn’t find the New External Content Type button anywhere else) and created a new external content type. I selected the appropriate Application ID and tried to connect, and was met with the ever-so-useful error: “Access denied by Business Data Connectivity”.

I searched for quite a while, and the most common resolution is someone saying that you cannot connect to SQL Azure directly and should use a WCF wrapper or some 3rd-party product.

I hate writing wrappers for things that should be simple so I kept looking. I finally tracked it down and I’m blogging about it so I can look it up later when I forget and hopefully help someone else 🙂

Step 1:

In the SharePoint admin center click BCS then Manage BCS Models and External Content Types

Step 2:

Select Set Metadata Store Permissions

Step 3:

Add your user account and select appropriate permissions.

You will now be able to complete creation of the external content type. You will need to come back in here and use Set Object Permissions once it is created.

Hope this helps!

New Role within Microsoft

I have a new job! 🙂

As of 1/28/2013 I will move over to the Windows Azure Runtime team as a Senior Program Manager. So instead of just talking about how great Windows Azure is I’ll have the opportunity to make it even better.

My initial focus will be on the developer and IT pro client experience. This will mean a lot of work focused on Cloud Services (PaaS) and Virtual Machines (IaaS) usability and functionality. The first order of business is improving upon the command-line and SDK experience for our compute options.

I’m very open and interested in feedback so please let me know how I can help 🙂

Michael Washam
mwasham at

How to provision a Linux VM in a VNET with PowerShell

Getting Started Links

# From Get-AzureVMImage 
$img = 'b39f27a8b8c64d52b05eac6a62ebad85__Ubuntu-12_04_1-LTS-amd64-server-20121218-en-us-30GB'
$user = 'somelinuxuser'
$pass = 'somelinuxpwd!'
$vnet = 'HybridVNET'
$ag = 'WestUSAG'
$svcname = 'mylinuxsvc1' 

New-AzureVMConfig -Name 'linuxfromps1' -ImageName $img -InstanceSize Small |
	Add-AzureProvisioningConfig -Linux -LinuxUser $user -Password $pass |
	Set-AzureSubnet -SubnetNames 'AppSubnet' | # Optional 
	New-AzureVM -ServiceName $svcname -VNetName $vnet -AffinityGroup $ag

Upgrading Windows Azure Cloud Services to Server 2012 and .NET 4.5

In this post I’ll walk through the options for upgrading an existing Cloud Service using OS Family (1 or 2) to use Windows Server 2012 (OS Family 3) and .NET 4.5.

Traditionally, to upgrade a Cloud Service all that is required is to click the update button in the Windows Azure portal after uploading your updated package, or to publish from Visual Studio. However, when changing OS families from (1 or 2) to 3 you will receive an error saying “Upgrade from OS family 1 to OS family 3 is not allowed”. This is a temporary restriction in the update policy that we are working to remove in an upcoming release.

In the meantime, there are two workarounds for updating your existing Windows Azure Cloud Service to run with .NET 4.5 and OS family 3 (Server 2012):

  • VIP Swap (recommended approach)
  • Delete and Re-deploy

Both have different pros and cons, outlined below. Detailed walkthroughs of both options follow:

  • VIP Swap: Fast and simple, but the usual VIP swap restrictions apply.
  • Delete/Re-deploy: Can make any change to the updated application, and VIP swap restrictions do not apply. However, there is a loss of availability while the application is deleted and then redeployed, and the public IP address can change after the redeployment.

Configuring Your Project for Upgrade

Step 1: Open the solution in Visual Studio.

In the example below the solution is named Sdk1dot7 and there are two projects. The first is an MVC Web Role project (MvcWebRole1) and the second is the Windows Azure Cloud Service project (Sdk1dot7).


Step 2: Upgrade the Project

Right-click on the Cloud Service project (Sdk1dot7) and select Properties. Note in the picture below, the properties page shows that this project was built with the June 2012 SP1 Windows Azure Tools version. Click the “Upgrade” button. After the upgrade, if you check this properties sheet again, it should show that the current Windows Azure Tools version is October 2012.


Step 3: Open ServiceConfiguration.Cloud.cscfg

Change OSFamily from (1 or 2) to 3.


New value: osFamily="3"
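The osFamily attribute lives on the root element of the .cscfg; the relevant fragment looks roughly like this (service name and schema namespace shown are the typical defaults; yours may differ):

```xml
<ServiceConfiguration serviceName="Sdk1dot7" osFamily="3" osVersion="*"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <!-- Role configuration unchanged -->
</ServiceConfiguration>
```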

Step 4: Modify the Web Role to use .NET 4.5

Open the properties sheet of the WebRole project by right-clicking the WebRole project and clicking on “Properties”.


In the “Application” tab look for the “Target framework” dropdown. It should show .NET 4.0.


Open the dropdown and select .NET 4.5. You’ll get a “Target Framework Change” dialog box, click “Yes” to proceed. The Target Framework should now read .NET 4.5.

Rebuild by hitting F6. You might get some build errors due to namespace clashes between some new libraries introduced in .NET 4.5. These are easy enough to fix; if you cannot, feel free to add a comment and I’ll respond.

Deploying using VIP Swap

You can deploy from within Visual Studio or from the Windows Azure Portal. In this post, I’ll show the steps to deploy through the portal.

Step 1: Generate the .cspkg and .cscfg files for upload.

Right click on your Cloud Service project (Sdk1Dot7) and select Package:


After the packaging is complete, a file explorer window will open with the newly created .cspkg and .cscfg files for your Cloud Service.

Step 2: Uploading the Files to the Staging Slot using the Windows Azure Portal

Open the Windows Azure portal at and select your cloud service. Click on the “Staging” tab (circled in red in the accompanying picture below).

Once on the staging tab, click on the “Update” button on the bottom panel (circled in green in the accompanying picture below).


From there a dialog will open requesting the newly created files packaged from Visual Studio.

Select the “From Local” button for both and upload the files that were generated during packaging. Remember to check “Update even if one or more roles contain single instance” if you have a single-instance role. These options are circled in red in the picture below. Click on the check-marked circle to proceed.


Step 3: Test the new deployment

At this point you will have your application running on OS family 3 and using .NET 4.5 in the staging slot and your original application using OS family 1/2 and .NET 4.0 in the production slot. Browse to the application by clicking the Site URL on the dashboard under the staging slot.

Step 4: Perform the VIP Swap to Production

On either the production or the staging tab, click on the “swap” button located in the bottom panel next to the update button (circled in green in the accompanying picture).


After this operation completes, you will have your application running on OS family 3 and using .NET 4.5 in place of the original application.

Deleting Your Deployment

The second option is to delete your deployment. This is not the recommended approach for a production application because you will have downtime and there is a chance of losing the current IP address assigned to your VIP. This option is really only useful for dev/test scenarios where you do not want to go through the VIP swap life cycle, or where you are making changes to the cloud service that are restricted during an in-place upgrade using VIP swaps.

To delete your deployment open the Windows Azure portal at and select your cloud service. Click the “STOP” button in the bottom panel (circled in green in the accompanying picture). Click “yes” on the dialog box that pops up.


Once the service is deleted you can simply republish from Visual Studio or package and upload using Visual Studio + the management portal.